WO2014122804A1 - Image processing device, imaging device, image processing method, and program - Google Patents
Image processing device, imaging device, image processing method, and program
- Publication number
- WO2014122804A1 (PCT/JP2013/062466)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- point image
- image
- image restoration
- processing
- restoration processing
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N25/61—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
- H04N25/615—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4" involving a transfer function modelling the optical system, e.g. optical transfer function [OTF], phase transfer function [PhTF] or modulation transfer function [MTF]
- H04N25/6153—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4" involving a transfer function modelling the optical system, e.g. optical transfer function [OTF], phase transfer function [PhTF] or modulation transfer function [MTF] for colour signals
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4015—Image demosaicing, e.g. colour filter arrays [CFA] or Bayer patterns
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/646—Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/32—Indexing scheme for image data processing or generation, in general involving image mosaicing
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
Definitions
- The present invention relates to image processing, and more particularly to image processing by point image restoration processing based on a point spread function (PSF).
- Image degradation may occur due to various aberrations of an optical system (such as a photographing lens).
- Image degradation due to aberration can be expressed by a point spread function (PSF), and it can be removed by applying a restoration filter (the inverse filter of the PSF) to the image data.
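The restoration filter described above can be illustrated with a frequency-domain sketch. The Wiener-style regularisation below is a generic, assumed formulation (a plain inverse filter divides by the optical transfer function of the PSF and amplifies noise where that function is small); the application itself does not specify this particular filter.

```python
import numpy as np

def wiener_restore(blurred, psf, snr=100.0):
    """Restore an image degraded by a known PSF with a Wiener-style filter.

    A plain inverse filter 1/H amplifies noise at frequencies where the
    optical transfer function H is small; the Wiener form
    conj(H) / (|H|^2 + 1/snr) regularises those frequencies.
    (Illustrative sketch; `snr` is an assumed scalar noise parameter.)
    """
    h, w = blurred.shape
    H = np.fft.fft2(psf, s=(h, w))       # OTF: zero-padded FFT of the PSF
    G = np.fft.fft2(blurred)
    restoration_filter = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(G * restoration_filter))
```

With a high assumed SNR, blurring a point image with a box PSF and restoring it recovers the point almost exactly.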
- However, performing the point image restoration process may generate false colors or false signals, which can degrade image quality.
- Patent Document 1, which relates to image restoration processing (considered to correspond to the point image restoration processing of the present application), contains no disclosure concerning portions where false colors are generated when the image restoration processing is performed. Patent Document 1 does disclose that it is preferable to perform the image restoration processing before demosaicing (considered to correspond to the demosaic processing of the present application).
- Patent Document 2 discloses that the recovery degree of the image recovery processing can be continuously adjusted, and that the occurrence of false colors can be suppressed by adjusting the recovery degree. However, it also discloses that chromatic aberration of magnification occurs as the degree of recovery decreases.
- The false color disclosed in Patent Document 1 arises when an image restoration filter (considered to correspond to the recovery filter of the present application) that is optimal for the in-focus distance is applied at a non-focused distance.
- That is, when image restoration processing is performed on an out-of-focus object, the false color occurs because there is a difference between the state of aberration, which changes with the state of the optical system during imaging, and the state of aberration that the image restoration filter assumes.
- The chromatic aberration and the false signals generated by demosaic processing referred to in the present application will be described later.
- In Patent Documents 1 and 2, image restoration processing is performed in consideration of artifacts and false colors, but not in consideration of the false signals generated by chromatic aberration and demosaic processing.
- Image degradation may therefore occur due to false signals caused by chromatic aberration and demosaic processing.
- The artifact disclosed in Patent Document 2 is one that occurs in a recovered image when there is a difference between the aberration characteristics actually captured and the aberration characteristics that the image recovery assumes.
- The false color disclosed in Patent Document 2 is generated in a recovered image when the recovery degree differs from that assumed for each RGB color component.
- An object of the present invention is to provide an image processing apparatus, an imaging apparatus, an image processing method, and a program that can obtain a clear image by performing point image restoration processing on image data.
- An image processing apparatus according to one aspect of the present invention includes demosaic processing means for performing demosaic processing on mosaic image data output from an imaging device and generating demosaic image data,
- luminance system image data acquisition means for acquiring, based on the generated demosaic image data, luminance system image data, which is image data relating to luminance,
- point image restoration processing execution means for performing point image restoration processing on the luminance system image data acquired by the luminance system image data acquisition means, information acquisition means for acquiring control information relating to execution of the point image restoration processing based on shooting information relating to the shooting conditions of the subject, and
- point image restoration processing control means for controlling the processing operation of the point image restoration processing execution means based on the control information acquired by the information acquisition means.
- According to this aspect, the point image restoration process is executed in consideration of chromatic aberration and the degree to which false signals are generated by the demosaic process, so that an image with higher image quality can be obtained. Further, since the point image restoration process is performed on the luminance system image data, a clearer image can be obtained.
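The control flow above — deriving control information from the shooting information and then deciding whether the point image restoration step may run on the luminance system data — can be sketched as follows. The function names, dictionary keys, and thresholds are hypothetical illustrations, not values taken from the application.

```python
# Hypothetical sketch of the described control flow: shooting conditions
# yield control information, which gates the restoration step.

def acquire_control_info(shooting_info):
    """Map shooting conditions to a go/no-go decision for restoration.

    Wide apertures and short subject distances are *assumed* here to be
    conditions under which demosaic false signals would be emphasised;
    the application leaves the concrete mapping open.
    """
    risky = (shooting_info.get("aperture_f_number", 8.0) < 2.0
             or shooting_info.get("subject_distance_m", 10.0) < 0.5)
    return {"run_restoration": not risky}

def process(y_data, shooting_info, restore):
    """Run point image restoration on luminance data only when permitted."""
    info = acquire_control_info(shooting_info)
    if info["run_restoration"]:
        return restore(y_data)   # point image restoration executed
    return y_data                # restoration suppressed
```

For example, with a fast f/1.4 lens the sketch suppresses restoration, while at f/8 and a normal subject distance it runs.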
- Preferably, the shooting information includes at least one of the lens used for shooting, the aperture value at the time of shooting, the focal length at the time of shooting, and the subject distance at the time of shooting.
- Preferably, the image processing apparatus further includes image analysis means for analyzing whether the mosaic image data or the demosaic image data is data in which false signals would be emphasized if the point image restoration processing were performed,
- and the information acquisition means acquires the control information based on the shooting information and the analysis information obtained by the image analysis means.
- Thereby, the point image restoration process can be executed in consideration of not only the shooting information but also the analysis information obtained by analyzing the captured image, and an image with higher image quality can be obtained.
- Preferably, the image analysis means analyzes whether the mosaic image data or the demosaic image data is data in which false signals would be emphasized if the point image restoration processing were performed, by obtaining the magnitude of contrast based on that data.
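As a rough illustration of such an analysis, one could measure local contrast and flag data whose contrast exceeds a threshold. The Michelson-style measure and the threshold below are assumptions; the application only states that the magnitude of contrast is used.

```python
def local_contrast(block):
    """Michelson-style contrast of a block of pixel values: (max-min)/(max+min)."""
    hi, lo = max(block), min(block)
    return (hi - lo) / (hi + lo) if hi + lo else 0.0

def false_signal_likely(blocks, threshold=0.8):
    """Flag image data whose strongest local contrast exceeds a threshold.

    Assumed heuristic: very high local contrast (sharp edges) is where
    demosaic false signals would be emphasised by restoration.
    """
    return any(local_contrast(b) > threshold for b in blocks)
```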
- Preferably, the point image restoration processing control means determines, based on the control information, whether or not false signals would be emphasized if the point image restoration processing were performed. When it determines that they would not be emphasized, it causes the point image restoration processing execution means to perform the point image restoration processing on the luminance system image data.
- When it determines that false signals would be emphasized, it prohibits the point image restoration processing by the point image restoration processing execution means for the luminance system image data.
- Preferably, the point image restoration processing control means determines, based on the control information, whether or not false signals would be emphasized if the point image restoration processing were performed. When it determines that they would not be emphasized,
- it causes the point image restoration processing execution means to perform the point image restoration processing on the luminance system image data. When it determines that false signals would be emphasized,
- it specifies the false signal emphasis region, which is the region where the false signals are emphasized, prohibits the point image restoration processing for the false signal emphasis region, and causes the point image restoration processing execution means to perform the point image restoration processing in the other regions.
- Preferably, the point image restoration processing control means determines, based on the control information, whether or not false signals would be emphasized if the point image restoration processing were performed. When it determines that they would not be emphasized, it causes the point image restoration processing execution means to perform the point image restoration processing on the luminance system image data. When it determines that false signals would be emphasized, it specifies the false signal emphasis region, which is the region where the false signals are emphasized, causes the point image restoration processing execution means to perform the point image restoration processing in regions other than the false signal emphasis region, and, for the false signal emphasis region, causes the point image restoration processing execution means to perform, in place of the point image restoration processing, a point image restoration process whose effect is weaker.
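A minimal sketch of this region-based control might look like the following, with the false signal emphasis region supplied as a per-pixel mask. The one-dimensional data layout and the function names are illustrative only.

```python
def restore_with_region_control(y, mask, strong, weak=None):
    """Apply strong restoration outside the false signal emphasis region.

    Inside the region (mask == True) the strong process is prohibited;
    if a weaker process is supplied it is used there instead, otherwise
    the pixel is left untouched.  (Sketch, not the claimed method.)
    """
    out = []
    for value, in_region in zip(y, mask):
        if not in_region:
            out.append(strong(value))     # normal point image restoration
        elif weak is not None:
            out.append(weak(value))       # weaker restoration in the region
        else:
            out.append(value)             # restoration prohibited
    return out
```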
- Preferably, the point image restoration processing control means determines, based on the control information, whether or not false signals would be emphasized if the point image restoration processing were performed. When it determines that they would not be emphasized, it causes the point image restoration processing execution means to perform the point image restoration processing on the luminance system image data. When it determines that false signals would be emphasized,
- it specifies the degree of false signal enhancement, which is the degree to which the false signals are emphasized,
- and causes the point image restoration processing execution means to perform the point image restoration processing with its strength changed according to the degree of false signal enhancement.
- Preferably, the false signal emphasis region specified by the point image restoration processing control means is a region of high image height.
- Preferably, the degree of false signal enhancement specified by the point image restoration processing control means increases with image height.
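Image height is conventionally the normalised distance from the optical centre, so a height-dependent strength control could be sketched as below. The linear roll-off profile and the 0.7 cutoff are assumptions; the application only states that the enhancement degree grows with image height.

```python
import math

def image_height(x, y, width, height):
    """Normalised distance of pixel (x, y) from the optical centre, in 0..1."""
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    r_max = math.hypot(cx, cy)            # distance from centre to a corner
    return math.hypot(x - cx, y - cy) / r_max

def restoration_gain(h, full=1.0, cutoff=0.7):
    """Scale restoration strength down where image height is large.

    Assumed profile: full strength up to `cutoff`, then a linear
    roll-off to zero at the image corner (h == 1).
    """
    if h <= cutoff:
        return full
    return full * max(0.0, (1.0 - h) / (1.0 - cutoff))
```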
- Preferably, the luminance system image data is luminance data obtained based on the color data in the demosaic image data that has the highest contribution rate for obtaining a luminance signal, or is obtained based on the demosaic image data.
- Preferably, the luminance system image data is the value of the luminance signal Y in a color space represented by the luminance signal Y and the color difference signals Cb and Cr.
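For example, with the common BT.601 definition of luma — in which green indeed has the highest contribution — the luminance signal Y could be computed as follows. The coefficients are from BT.601, not from the application:

```python
def luminance_y(r, g, b):
    """Luma per ITU-R BT.601: green carries the largest weight (0.587)."""
    return 0.299 * r + 0.587 * g + 0.114 * b
```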
- Preferably, the demosaic processing unit includes processing for determining a correlation direction among the plurality of pixel signals constituting the mosaic image data, based on the color data in the mosaic image data that has the highest contribution rate for obtaining the luminance signal.
- Preferably, the demosaic processing unit includes edge detection processing in the mosaic image data based on the color data in the mosaic image data that has the highest contribution rate for obtaining the luminance signal.
- An imaging apparatus according to another aspect of the present invention includes demosaic processing means for performing demosaic processing on mosaic image data output from an imaging device and generating demosaic image data,
- luminance system image data acquisition means for acquiring, based on the demosaic image data, luminance system image data, which is image data relating to luminance, point image restoration processing execution means for performing point image restoration processing on the luminance system image data acquired by the luminance system image data acquisition means, information acquisition means for acquiring control information relating to execution of the point image restoration processing based on shooting information at the time the subject was shot, and
- point image restoration processing control means for controlling the processing operation of the point image restoration processing execution means based on the control information acquired by the information acquisition means.
- According to this aspect, the point image restoration process is executed in consideration of chromatic aberration and the degree to which false signals are generated by the demosaic process, so that an image with higher image quality can be obtained. Further, since the point image restoration process is performed on the luminance system image data, a clearer image can be obtained.
- An image processing method according to another aspect of the present invention includes a demosaic processing step of performing demosaic processing on mosaic image data output from an image sensor and generating demosaic image data, a luminance system image data acquisition step of acquiring, based on the generated demosaic image data, luminance system image data relating to luminance, a point image restoration processing execution step of performing point image restoration processing on the acquired luminance system image data, an information acquisition step of acquiring control information relating to execution of the point image restoration processing based on shooting information relating to the shooting conditions of the subject, and a point image restoration processing control step of controlling the processing operation of the point image restoration processing execution step based on the acquired control information.
- According to this aspect, the point image restoration process is executed in consideration of chromatic aberration and the degree to which false signals are generated by the demosaic process, so that an image with higher image quality can be obtained. Further, since the point image restoration process is performed on the luminance system image data, a clearer image can be obtained.
- A program according to another aspect of the present invention causes a computer to execute a demosaic processing step of performing demosaic processing on mosaic image data output from an image sensor and generating demosaic image data, a luminance system image data acquisition step of acquiring, based on the generated demosaic image data, luminance system image data relating to luminance, a point image restoration processing execution step of performing point image restoration processing on the acquired luminance system image data, an information acquisition step of acquiring control information relating to execution of the point image restoration processing based on shooting information relating to the shooting conditions of the subject, and a point image restoration processing control step of controlling the processing operation of the point image restoration processing execution step based on the acquired control information.
- A computer-readable recording medium according to another aspect of the present invention stores instructions that, when read by a processor, cause the processor to execute a demosaic processing step of performing demosaic processing on mosaic image data output from an imaging device
- and generating demosaic image data, a luminance system image data acquisition step of acquiring, based on the generated demosaic image data, luminance system image data relating to luminance,
- a point image restoration processing execution step of performing point image restoration processing on the acquired luminance system image data, an information acquisition step of acquiring control information relating to execution of the point image restoration processing based on shooting information relating to the shooting conditions of the subject, and a point image restoration processing control step of controlling the processing operation of the point image restoration processing execution step based on the acquired control information.
- According to this aspect, the point image restoration process is executed in consideration of chromatic aberration and the degree to which false signals are generated by the demosaic process, so that an image with higher image quality can be obtained. Further, since the point image restoration process is performed on the luminance system image data, a clearer image can be obtained.
- According to the present invention, it is possible to obtain an image with better image quality by executing the point image restoration process in consideration of chromatic aberration and the degree to which false signals are generated by the demosaic process. Further, according to the present invention, since the point image restoration process is performed on the luminance system image data, a clearer image can be obtained.
- FIG. 19 is a diagram showing a state in which the basic array pattern shown in FIG. 18 is divided into 3 × 3 pixel blocks. A further figure is a block diagram showing one form of an imaging module provided with an EDoF optical system, and another is a diagram showing an example of an EDoF optical system.
- FIG. 21 is a flowchart illustrating an example of restoration processing in the restoration processing block illustrated in FIG. 20.
- Further figures show another embodiment of the imaging device that is one aspect of the present invention, and a block diagram showing the structure of that imaging device.
- In the following description, the demosaic processing means and the demosaic processing unit have the same meaning; the same applies to the luminance system image data acquisition means and unit, the point image restoration processing execution means and unit, the information acquisition means and unit, the point image restoration processing control means and unit, and the image analysis means and unit.
- FIG. 1 is a block diagram showing an embodiment of an imaging apparatus 10 having an image processing apparatus 28 (shown in FIG. 1 as an image processing unit) according to the present invention.
- The imaging apparatus 10 having the image processing unit 28 of the present invention is a digital camera that records captured images in an internal memory (memory unit 26) or an external recording medium (not shown), and the entire apparatus is centrally controlled by a central processing unit (CPU) 12.
- The imaging apparatus 10 is provided with an operation unit 14 including a shutter button or shutter switch, a mode dial, a playback button, a MENU/OK key, a cross key, a zoom button, a BACK key, and the like.
- A signal from the operation unit 14 is input to the CPU 12, and the CPU 12 controls each circuit of the imaging apparatus 10 based on the input signal.
- For example, in addition to controlling the lens unit 18, the shutter 20, and the image sensor 22, which functions as an image acquisition unit, via the device control unit 16, the CPU 12 performs shooting operation control, image processing control, image data recording/playback control, display control of the display unit 25, and the like.
- The lens unit 18 includes a focus lens, a zoom lens, a diaphragm, and the like, and the light flux that has passed through the lens unit 18 and the shutter 20 forms an image on the light receiving surface of the image sensor 22. The lens unit 18 of the imaging apparatus 10 may be interchangeable or non-interchangeable.
- The image sensor 22 has a large number of light receiving elements (photodiodes) arranged two-dimensionally, and the subject image formed on the light receiving surface is converted by each photodiode into a signal voltage (or charge) of an amount corresponding to the amount of incident light.
- FIG. 2 is a diagram showing a form of the image sensor 22, and particularly shows a color filter array arranged on the light receiving surface of the image sensor 22. Note that each photodiode provided with each color filter is a pixel, and data output from each photodiode is pixel data (pixel signal).
- the color filter array of the image sensor 22 shown in FIG. 2 is generally called a Bayer array.
- The Bayer arrangement is a color filter array in which the color that is the main component of the luminance signal and requires high resolution is arranged in a checkered (checkerboard) pattern, and the remaining colors, for which relatively low resolution suffices, are arranged in the remaining positions.
- Specifically, green (G) color filters (G filters), which contribute strongly to the luminance signal, are arranged in a checkered pattern, and red (R) color filters (R filters) and blue (B) color filters (B filters) are arranged in a checkered pattern over the remaining positions.
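The Bayer layout described above can be expressed compactly. The following sketch returns the filter colour at any pixel coordinate, assuming the conventional RGGB phase (the application does not fix a particular phase):

```python
def bayer_color(row, col):
    """Colour of the filter at (row, col) in an assumed RGGB Bayer array.

    G sits on a checkerboard (half of all pixels), matching its larger
    contribution to the luminance signal; R and B each occupy a
    checkerboard of the remaining sites.
    """
    if (row + col) % 2 == 1:
        return "G"
    return "R" if row % 2 == 0 else "B"
```

In any 2 × 2 cell this yields one R, two G, and one B filter, which is the defining property of the Bayer array.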
- The signal charge accumulated in the image sensor 22 configured as described above is read out as a voltage signal corresponding to the signal charge, based on a read signal applied from the device control unit 16.
- The voltage signal read from the image sensor 22 is applied to the A/D converter 24, where it is sequentially converted into digital R, G, and B signals (pixel data) corresponding to the color filter array, and temporarily stored in the memory unit 26.
- The memory unit 26 includes an SDRAM (Synchronous Dynamic Random Access Memory), a volatile memory, and an EEPROM (Electrically Erasable and Programmable Read-Only Memory), a rewritable nonvolatile memory. The SDRAM is used as a work area when the CPU 12 executes a program and as a storage area for temporarily storing captured digital image signals.
- the EEPROM stores a camera control program including an image processing program, pixel defect information of the image sensor 22, various parameters and tables used for image processing, and the like.
- The image processing unit 28 performs predetermined signal processing, such as white balance correction, gamma correction, demosaic processing, RGB/YC conversion, contour correction, chromatic aberration correction, and blur correction, on the digital image signal once stored in the memory unit 26.
- the image data processed by the image processing unit 28 is encoded into data for image display by the encoder 30 and is output to the display unit 25 provided on the back of the camera via the driver 32. As a result, the subject image is continuously displayed on the display screen of the display unit 25.
- When the shutter button is half-pressed, the CPU 12 starts an AF (automatic focus) operation and an AE (automatic exposure) operation, moves the focus lens of the lens unit 18 in the optical axis direction via the device control unit 16,
- and controls the focus lens so that it comes to the in-focus position.
- The CPU 12 calculates the brightness of the subject (shooting Ev value) based on the image data output from the A/D converter 24 when the shutter button is half-pressed, and determines the exposure conditions (aperture value, shutter speed) based on the shooting Ev value.
- The aperture and the charge accumulation time in the shutter 20 and the image sensor 22 are then controlled according to the determined exposure conditions, and the main imaging is performed.
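For reference, the standard relationship between an Ev value and the exposure condition (aperture value N and shutter time t, at ISO 100) is EV = log2(N²/t). This is offered as general photographic background rather than the application's own computation:

```python
import math

def exposure_value(f_number, shutter_s):
    """Standard exposure value at ISO 100: EV = log2(N^2 / t)."""
    return math.log2(f_number ** 2 / shutter_s)
```

For example, f/1 at 1 s gives EV 0, and f/4 at 1/4 s gives EV 6.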
- Image data of the RGB mosaic image (an image corresponding to the color filter array shown in FIG. 2) that is read from the image sensor 22 during the main imaging and A/D converted by the A/D converter 24 is temporarily stored in the memory unit 26.
- the image data temporarily stored in the memory unit 26 is read out as appropriate by the image processing unit 28, where predetermined signal processing including white balance correction, gamma correction, demosaic processing, RGB/YC conversion, contour correction, color correction, and the like is performed.
- the RGB/YC converted image data (YC data) is compressed according to a predetermined compression format (for example, the JPEG (Joint Photographic Experts Group) method), and the compressed image data is recorded in the internal memory or an external memory as a predetermined image file (for example, an Exif (Exchangeable image file format) file).
- the color filter array in the image sensor 22 used in the present invention is not limited to that shown in FIG. Various color filter arrangements can be employed as long as the effects of the present invention are not impaired.
- the color filter may further include a color filter of a color that contributes most to the luminance in addition to red, green, and blue.
- the color that contributes most to luminance includes white (transparent).
- FIG. 3 is a principal block diagram showing the internal configuration of the first embodiment of the image processing apparatus (image processing unit) 28 shown in FIG.
- the image processing apparatus 28 shown in FIG. 3 includes: a demosaic processing unit (means) 100 that performs demosaic processing on the mosaic image data output from the image sensor 22 to generate demosaic image data; a luminance system image data acquisition unit (means) 105 that acquires luminance system image data, which is image data relating to luminance, based on the demosaic image data; a point image restoration processing execution unit (means) 110 that performs point image restoration processing on the acquired luminance system image data; an information acquisition unit (means) 115 that acquires control information relating to the execution of the point image restoration processing based on the shooting information obtained when the subject was shot; and a point image restoration processing control unit (means) 120 that controls the processing operation of the point image restoration processing execution unit 110.
- Although a white balance correction unit, a gamma correction unit, and the like are also provided in the image processing unit 28, they are not shown in FIG. 3 for convenience of explanation.
- the demosaic processing unit 100 shown in Fig. 3 acquires mosaic image data. Then, the demosaic processing unit 100 performs demosaic processing on the acquired mosaic image data.
- the mosaic image data is also referred to as RAW data, and is data output from the image sensor 22.
- the demosaic processing unit 100 may determine the correlation directions in a plurality of pixel signals constituting the mosaic image data based on the color data in the mosaic image data having the highest contribution rate for obtaining the luminance signal.
- the demosaic processing unit 100 may perform edge detection processing on the mosaic image data based on the color data in the mosaic image data having the highest contribution rate for obtaining the luminance signal.
- the color data in the mosaic image data having the highest contribution rate for obtaining the luminance signal is, for example, pixel data corresponding to the G filter.
- demosaic processing is processing for giving R, G, and B signal values to a target pixel by performing interpolation processing using the pixel data of a plurality of pixels around the target pixel.
- demosaic processing is also referred to as demosaicing processing, synchronization processing, and color synthesis processing.
- the demosaic processing unit 100 performs demosaic processing on the mosaic image data to generate demosaic image data.
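To make the interpolation step concrete, the following is a minimal sketch of bilinear interpolation of the G plane from an RGGB Bayer mosaic. The function name, the RGGB phase, and the plain four-neighbor averaging are illustrative assumptions; the text does not prescribe a specific demosaic algorithm.

```python
import numpy as np

def demosaic_green(raw):
    """Bilinear interpolation of the G plane from an RGGB Bayer mosaic.

    G samples exist at half the sites; at R and B sites the G value is
    estimated as the mean of the available G neighbors (edge-clipped).
    """
    h, w = raw.shape
    g = np.zeros((h, w), dtype=float)
    g_mask = np.zeros((h, w), dtype=bool)
    # G sites in an RGGB pattern: (even row, odd col) and (odd row, even col)
    g_mask[0::2, 1::2] = True
    g_mask[1::2, 0::2] = True
    g[g_mask] = raw[g_mask]
    pad = np.pad(g, 1, mode="edge")
    mpad = np.pad(g_mask, 1, mode="edge")
    for y in range(h):
        for x in range(w):
            if not g_mask[y, x]:
                # up, down, left, right neighbors in padded coordinates
                vals = [pad[y, x + 1], pad[y + 2, x + 1],
                        pad[y + 1, x], pad[y + 1, x + 2]]
                ok = [mpad[y, x + 1], mpad[y + 2, x + 1],
                      mpad[y + 1, x], mpad[y + 1, x + 2]]
                # average only over sites that actually carry a G sample
                g[y, x] = sum(v for v, m in zip(vals, ok) if m) / max(sum(ok), 1)
    return g
```

A real demosaic, as the text notes, would additionally determine the correlation direction and interpolate along detected edges rather than averaging isotropically.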
- the luminance system image data acquisition unit 105 shown in FIG. 3 acquires demosaic image data from the demosaic processing unit 100, and acquires luminance system image data based on the demosaic image.
- the luminance system image data means various data having information relating to the luminance of the captured image.
- Examples include the value of the luminance signal Y in the color space represented by Y, Cb, and Cr; the value of the luminance signal Y in the color space represented by Y, Pb, and Pr; and the color data having the highest contribution rate for obtaining the luminance signal. The value of the luminance signal Y in the color space represented by Y, Cb, and Cr can be approximated by [Expression 1], and the value of the luminance signal Y in the color space represented by Y, Pb, and Pr can be approximated by [Expression 2]. In the luminance signal Y obtained by [Expression 1] or [Expression 2], the G color data has the highest contribution rate for obtaining the luminance signal.
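The expressions themselves are not reproduced in this excerpt. The weightings commonly used for these two color spaces are the ITU-R BT.601 and BT.709 luma coefficients, assumed here as a sketch of what [Expression 1] and [Expression 2] compute:

```python
def luma_bt601(r, g, b):
    # [Expression 1]-style approximation (ITU-R BT.601 weights, assumed here)
    return 0.299 * r + 0.587 * g + 0.114 * b

def luma_bt709(r, g, b):
    # [Expression 2]-style approximation (ITU-R BT.709 weights, assumed here)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b
```

In both weightings the G coefficient is by far the largest, which is consistent with the statement that the G color data has the highest contribution rate to the luminance signal.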
- the point image restoration processing execution unit 110 shown in FIG. 3 executes point image restoration processing on the luminance system image data.
- the point image restoration processing execution unit 110 is controlled by the point image restoration processing control unit 120.
- Next, the point image restoration processing performed by the point image restoration processing execution unit 110 will be described. Assuming that the blurred image acquired by capturing a point image is g(x, y), the original point image is f(x, y), and the point spread function (PSF) is h(x, y), the following relationship holds: g(x, y) = h(x, y) * f(x, y), where * denotes convolution.
- By applying a filter R(x, y) that satisfies R(x, y) * g(x, y) = f(x, y) to the blurred image g(x, y), the original point image f(x, y) is recovered. This R(x, y) is called a restoration filter.
- As the restoration filter, a least squares filter (Wiener filter), a limited deconvolution filter, a recursive filter, a homomorphic filter, or the like can be used.
- one or more restoration filters are stored in the point image restoration processing execution unit 110.
- the point image restoration processing execution unit 110 performs point image restoration processing on the luminance system image data transmitted from the luminance system image data acquisition unit 105 by the restoration filter generated as described above.
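As an illustration of the least squares (Wiener) approach mentioned above, the following sketch builds a frequency-domain restoration filter R = conj(H) / (|H|² + NSR) from a PSF and applies it to a blurred image. The noise-to-signal ratio parameter and the FFT-based circular convolution are simplifying assumptions, not the patent's specific implementation.

```python
import numpy as np

def wiener_restore(blurred, psf, nsr=0.01):
    """Frequency-domain Wiener (least squares) restoration.

    R = conj(H) / (|H|^2 + NSR), applied as ifft2(R * fft2(blurred)).
    `nsr` is the assumed noise-to-signal power ratio; nsr=0 would reduce
    R to the pure inverse filter 1/H, which amplifies noise wherever H
    is small, so a positive nsr regularizes the restoration.
    """
    H = np.fft.fft2(psf, s=blurred.shape)      # PSF anchored at origin
    G = np.fft.fft2(blurred)
    R = np.conj(H) / (np.abs(H) ** 2 + nsr)    # Wiener restoration filter
    return np.real(np.fft.ifft2(R * G))
```

Applied to a point image blurred by the PSF, this sharpens the large blurred spot back toward a small point, as described for the restoration filter in the text.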
- By the point image restoration processing, the point image (optical image) transmitted through the photographing lens, which is formed on the image sensor 22 as a large point image (blurred image), is restored to a small point image (high-resolution image), as shown in the figure.
- the information acquisition unit 115 illustrated in FIG. 3 acquires shooting information, generates (acquires) control information including information on whether or not to perform the point image restoration processing, and transmits the control information to the point image restoration processing control unit 120.
- the information included in the control information is not limited to the information indicating whether or not the point image restoration process is performed.
- For example, the control information may include information such as performing the point image restoration processing weakly, or adjusting the strength of the point image restoration processing according to the degree of generation of the false signal due to chromatic aberration.
- the shooting information is various information relating to shooting conditions of the subject.
- Examples of the shooting information include information on the lens used for shooting (lens type, full aperture value) and the conditions under which the subject was shot (aperture value, focal length, subject distance, and the like).
- the information acquisition unit 115 generates control information (C-1 or C-2) based on the shooting information (A-1, A-2, or A-3), and transmits the control information (C-1 or C-2) to the point image restoration processing control unit 120.
- As the shooting information in FIG. 5, three cases are listed: shooting using the lens A with the aperture value set to F1.8 (A-1), shooting using the lens A with the aperture value set to F4 (A-2), and shooting using the lens A with the aperture value set to F5.6 (A-3).
- First, a case where the information acquisition unit 115 receives, as shooting information, the information (A-1) that shooting was performed using the lens A with the aperture value set to F1.8 will be described.
- the information acquisition unit 115, having received the information (A-1) that shooting was performed using the lens A with the aperture value set to F1.8, refers to the table information (B) regarding false signals stored in the information acquisition unit 115 and to the acquired shooting information (A-1). That is, the shooting information indicates shooting using the lens A with the aperture value set to F1.8 (A-1), and the table information (B) regarding false signals indicates that the false signal is emphasized when the lens A is used with the aperture value set to F1.8.
- Therefore, the information acquisition unit 115 determines that the false signal is emphasized in an image shot with the shooting information (A-1). Based on this determination, the information acquisition unit 115 sends control information (C-1) indicating that the point image restoration processing is not to be performed to the point image restoration processing control unit 120.
- Next, a case where the information acquisition unit 115 receives, as shooting information, the information (A-2) that shooting was performed using the lens A with the aperture value set to F4 will be described.
- the information acquisition unit 115, having received the information (A-2) that shooting was performed using the lens A with the aperture value set to F4, refers to the table information (B) regarding false signals stored in the information acquisition unit 115 and to the acquired shooting information (A-2). That is, the shooting information indicates shooting using the lens A with the aperture value set to F4 (A-2), and the table information (B) regarding false signals indicates that the false signal is not emphasized when the lens A is used with the aperture value set to F4.
- Therefore, the information acquisition unit 115 determines that the false signal is not emphasized in an image shot with the shooting information (A-2). Based on this determination, the information acquisition unit 115 sends control information (C-2) indicating that the point image restoration processing is to be performed to the point image restoration processing control unit 120.
- Next, a case where the information acquisition unit 115 receives, as shooting information, the information (A-3) that shooting was performed using the lens A with the aperture value set to F5.6 will be described.
- the information acquisition unit 115, having received the information (A-3) that shooting was performed using the lens A with the aperture value set to F5.6, refers to the table information (B) regarding false signals stored in the information acquisition unit 115 and to the acquired shooting information (A-3). That is, the shooting information indicates shooting using the lens A with the aperture value set to F5.6 (A-3), and the table information (B) regarding false signals indicates that the false signal is not emphasized when the lens A is used with the aperture value set to F5.6.
- Therefore, the information acquisition unit 115 determines that the false signal is not emphasized in an image shot with the shooting information (A-3). Based on this determination, the information acquisition unit 115 sends control information (C-2) indicating that the point image restoration processing is to be performed to the point image restoration processing control unit 120.
- FIG. 6 is a diagram for describing how the information acquisition unit 115 generates control information (C-1 or C-2) based on shooting information (A-1, A-2, or A-3) of a type different from that in FIG. 5, and transmits the control information to the point image restoration processing control unit 120.
- As the shooting information in FIG. 6, three cases are listed: shooting using a lens with an open aperture value of F1.4 (A-1), shooting using a lens with an open aperture value of F2 (A-2), and shooting using a lens with an open aperture value of F2.4 (A-3).
- First, a case where the information acquisition unit 115 receives, as shooting information, the information (A-1) that shooting was performed using a lens with an open aperture value of F1.4 will be described.
- the information acquisition unit 115, having received the information (A-1) that shooting was performed using a lens with an open aperture value of F1.4, refers to the table information (B) regarding false signals stored in the information acquisition unit 115 and to the acquired shooting information (A-1).
- That is, the shooting information indicates shooting using a lens with an open aperture value of F1.4 (A-1), and the table information (B) regarding false signals indicates that the false signal is not emphasized when a lens with an open aperture value of F1.4 is used. Therefore, the information acquisition unit 115 determines that the false signal is not emphasized in an image shot with the shooting information (A-1). Based on this determination, the information acquisition unit 115 sends control information (C-2) indicating that the point image restoration processing is to be performed to the point image restoration processing control unit 120.
- Next, a case where the information acquisition unit 115 receives, as shooting information, the information (A-2) that shooting was performed using a lens with an open aperture value of F2 will be described.
- the information acquisition unit 115, having received the information (A-2) that shooting was performed using a lens with an open aperture value of F2, refers to the table information (B) regarding false signals stored in the information acquisition unit 115 and to the acquired shooting information (A-2).
- That is, the shooting information indicates shooting using a lens with an open aperture value of F2 (A-2), and the table information (B) regarding false signals indicates that the false signal is emphasized when a lens with an open aperture value of F2 is used. Therefore, the information acquisition unit 115 determines that the false signal is emphasized in an image shot with the shooting information (A-2). Based on this determination, the information acquisition unit 115 sends control information (C-1) indicating that the point image restoration processing is not to be performed to the point image restoration processing control unit 120.
- Further, a case where the information acquisition unit 115 receives, as shooting information, the information (A-3) that shooting was performed using a lens with an open aperture value of F2.4 will be described.
- the information acquisition unit 115, having received the information (A-3) that shooting was performed using a lens with an open aperture value of F2.4, refers to the table information (B) regarding false signals stored in the information acquisition unit 115 and to the acquired shooting information (A-3).
- That is, the shooting information indicates shooting using a lens with an open aperture value of F2.4 (A-3), and the table information (B) regarding false signals indicates that the false signal is emphasized when a lens with an open aperture value of F2.4 is used. Therefore, the information acquisition unit 115 determines that the false signal is emphasized in an image shot with the shooting information (A-3). Based on this determination, the information acquisition unit 115 sends control information (C-1) indicating that the point image restoration processing is not to be performed to the point image restoration processing control unit 120.
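The decision logic of FIGS. 5 and 6 amounts to a table lookup from shooting information to control information. The following sketch mirrors the FIG. 5 example; the table contents and function names are illustrative placeholders, not actual lens data:

```python
# Hypothetical table (B): (lens, aperture) conditions under which the false
# signal is known to be emphasized. Values here mirror the FIG. 5 example
# and are illustrative only.
FALSE_SIGNAL_TABLE = {
    ("lens A", "F1.8"): True,   # emphasized      -> C-1 (do not restore)
    ("lens A", "F4"):   False,  # not emphasized  -> C-2 (restore)
    ("lens A", "F5.6"): False,  # not emphasized  -> C-2 (restore)
}

def control_info(lens, aperture):
    """Map shooting information to control information C-1 / C-2."""
    emphasized = FALSE_SIGNAL_TABLE.get((lens, aperture), False)
    return "C-1" if emphasized else "C-2"
```

Here unknown combinations default to C-2 (restore); the text does not specify the default, so that choice is an assumption of this sketch.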
- the point image restoration processing control unit 120 shown in FIG. 3 acquires the control information sent from the information acquisition unit 115 and controls the point image restoration processing execution unit 110 based on the control information. Specifically, the point image restoration processing control unit 120 determines, from the acquired control information, whether or not the point image restoration processing is to be performed, and controls the point image restoration processing execution unit 110 accordingly. If the point image restoration processing control unit 120 determines from the control information that the point image restoration processing is not to be performed, it prohibits the point image restoration processing execution unit 110 from performing the point image restoration processing on the luminance system image data.
- On the other hand, if the point image restoration processing control unit 120 determines from the control information that the point image restoration processing is to be performed, it causes the point image restoration processing execution unit 110 to perform the point image restoration processing on the luminance system image data.
- Chromatic aberration includes magnification chromatic aberration and axial chromatic aberration. Magnification chromatic aberration arises because the image magnification differs for each color, so that the size of the image differs for each color; axial chromatic aberration arises because the focal length of the lens differs for each color, since the wavelength of light differs for each color.
- One criterion for determining that chromatic aberration has occurred may be that the shift between the data of each color is larger than the kernel size (minimum array pattern) of the color filter array.
- FIGS. 7, 8, 9, and 10 show examples in which a Bayer array color filter is used.
- FIG. 7 shows a demosaic process when no false signal due to chromatic aberration occurs.
- FIG. 7A shows a mosaic image (RAW data) in which 4 pixel data (vertical) ⁇ 4 pixel data (horizontal) is collected.
- the mosaic image data shown in part (A) of FIG. 7 represents an image having an edge in the vertical direction. That is, the 4 (vertical) × 2 (horizontal) pixel data on the left side of the edge in part (A) of FIG. 7 is a collection of R, G, and B pixel data whose output values are 0, and the 4 (vertical) × 2 (horizontal) pixel data on the right side of the edge is a collection of R, G, and B pixel data whose output values are 1.
- Parts (B), (C), and (D) of FIG. 7 show that the mosaic image of part (A) of FIG. 7 has been demosaiced and color data of the three planes R, G, and B has been generated. Since the mosaic image in part (A) of FIG. 7 has the edge as a boundary line and the output values of the pixel data are divided into 0 and 1, the color data of the three planes in parts (B), (C), and (D) of FIG. 7 is likewise divided into output values of 0 and 1 with the edge as the boundary line.
- FIG. 8 shows a state of demosaic processing when a false signal is generated due to chromatic aberration and demosaic processing.
- Part (A) of FIG. 8 should originally be the same data as part (A) of FIG. 7, but differs from part (A) of FIG. 7 due to the influence of chromatic aberration. That is, part (A) of FIG. 8 also shows a mosaic image (RAW data) of 4 (vertical) × 4 (horizontal) pixel data, but unlike part (A) of FIG. 7, the 4 (vertical) × 2 (horizontal) pixel data on the left side of the edge differs: the G pixel data originally adjacent to the edge shows an output value of 1 due to the influence of chromatic aberration.
- In general, the sampling frequency of the G pixels is higher than that of the R and B pixels, and the contribution rate of the G pixel data to the luminance system signal is higher than that of the R and B pixel data. Therefore, demosaic processing generally includes a process of determining the correlation direction in the image data based on the G pixel data, detecting an edge based on the determined correlation direction, and performing interpolation processing in consideration of the detected edge.
- As a result, at positions where a color shift due to chromatic aberration occurs (where the pixel data of the G pixel is 1), a false signal is generated by the demosaic processing (see reference numerals 40 and 42 in FIG. 8).
- FIG. 10 shows a phenomenon in which false signals due to chromatic aberration and demosaic processing are enhanced by performing point image restoration processing.
- Parts (A), (B), and (C) of FIG. 10 show the color data of the three planes corresponding to parts (B), (C), and (D) of FIG. 9.
- the edges that are originally linear are uneven due to the influence of false signals due to chromatic aberration and demosaic processing.
- Part (D) of FIG. 10 shows a case where the point image restoration processing is performed on the G color data (part (B) of FIG. 10), which is one example of luminance system data.
- the uneven edges of the G color data (part (B) of FIG. 10) are emphasized by the point image restoration processing. That is, the pixel data indicating the output value 1 shown in part (B) of FIG. 10 is amplified to 3 by the point image restoration processing (part (D) of FIG. 10), while the pixel data indicating the output value 0 remains 0 even after the point image restoration processing. Therefore, in part (D) of FIG. 10, where the point image restoration processing has been performed, the false signal due to chromatic aberration is emphasized (a larger step is generated).
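The amplification described above can be reproduced with a toy one-dimensional example: a sharpening-type restoration kernel applied to a row of G data that contains a false value of 1 next to an edge. The kernel (-1, 3, -1) is an illustrative stand-in for a restoration filter, not the patent's filter; with it, the isolated false value 1 is amplified to 3 while neighboring 0 values remain 0 after clipping, matching the larger step described in the text.

```python
def restore_1d(row, kernel=(-1, 3, -1)):
    """Apply a toy 1-D restoration (sharpening) filter, clipping at 0."""
    k0, k1, k2 = kernel
    out = list(row)
    for i in range(1, len(row) - 1):
        v = k0 * row[i - 1] + k1 * row[i] + k2 * row[i + 1]
        out[i] = max(v, 0)  # clip negative overshoot to 0
    return out

row = [0, 1, 0, 1, 1, 1]   # false G value (index 1) next to a vertical edge
print(restore_1d(row))     # → [0, 3, 0, 2, 1, 1]
```

The false value at index 1 grows from 1 to 3 while its 0-valued neighbors stay at 0, so the step between them widens, which is exactly how the restoration processing emphasizes the false signal.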
- As described above, demosaic processing includes a process of determining the correlation direction in the image data based on the G pixel data, detecting an edge based on the determined correlation direction, and performing interpolation in consideration of the detected edge. Therefore, if chromatic aberration occurs in the optical system, a false signal due to the demosaic processing is likely to occur, and this false signal is emphasized by the restoration processing. In a system that performs restoration processing on each of the R, G, and B color data using a restoration filter optimized for each color, chromatic aberration itself is corrected by performing the point image restoration processing for each color, so this problem does not arise; the problem is therefore considered to be unique to point image restoration processing performed on luminance system data after demosaic processing.
- FIG. 11 shows a processing flow of the image processing device 28.
- mosaic image data is input to the demosaic processing unit 100 (step S10).
- the demosaic processing unit 100 performs demosaic processing on the mosaic image data (step S15), and generates demosaic image data (demosaic processing step).
- luminance system image data is acquired from the demosaic image data by the luminance system image data acquisition unit 105 (step S20) (luminance system image data acquisition step).
- point image restoration processing is performed on the luminance system image data by the point image restoration processing execution unit 110 (step S25) (point image restoration processing execution step).
- shooting information is acquired by the information acquisition unit 115 (step S30) (information acquisition step). Then, control information is generated by the information acquisition unit 115 (step S35). Thereafter, control information is acquired by the point image restoration processing control unit 120 (step S40).
- the point image restoration processing control unit 120 controls point image restoration processing (point image restoration processing execution step) performed by the point image restoration processing execution unit 110 (point image restoration processing control step).
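The flow of steps S10 to S45 can be sketched as a small driver in which the control information gates the restoration step. All function parameters here are hypothetical stand-ins for the corresponding processing units, and the C-1/C-2 labels follow the convention used with FIGS. 5 and 6:

```python
def process(raw, shooting_info, *, demosaic, to_luminance, restore, decide):
    """Processing flow of FIG. 11 (steps S10-S45); names are hypothetical.

    `decide` plays the role of the information acquisition unit 115 and
    returns "C-1" (do not restore) or "C-2" (restore); the control unit
    120 then gates the point image restoration step accordingly.
    """
    demosaiced = demosaic(raw)          # S15: demosaic processing
    y = to_luminance(demosaiced)        # S20: acquire luminance system data
    control = decide(shooting_info)     # S30-S40: generate/acquire control info
    if control == "C-2":                # S45: control unit gates execution
        y = restore(y)                  # S25: point image restoration
    return y
```

A usage sketch with trivial stand-in functions: `process(1, None, demosaic=lambda x: x, to_luminance=lambda x: x, restore=lambda x: x * 2, decide=lambda s: "C-2")` returns the restored value, while a C-1 decision leaves the luminance data untouched.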
- FIG. 12 is a diagram illustrating examples of control of the point image restoration processing execution unit 110 performed by the point image restoration processing control unit 120. A-1 to A-4 are execution examples of the point image restoration processing on the luminance system image data when the control information indicates that the point image restoration processing is to be performed; B-1 to B-4 are examples when the control information indicates that the point image restoration processing is not to be performed. Reference numerals 50, 52, 54, 56, 68, and 78 denote portions subjected to the point image restoration processing; reference numerals 58, 60, 62, 64, and 66 denote portions where the point image restoration processing is not performed.
- Reference numerals 70, 72, 74, and 76 denote portions (areas) where the point image restoration process that is weaker than the point image restoration processes executed in 50, 52, 54, 56, 68, and 78 is executed.
- Reference numeral 80 denotes a case where the strength of the point image restoration processing is changed stepwise: stronger point image restoration processing is performed according to the color density.
- A-1 in FIG. 12 shows a state in which the point image restoration processing is performed on the entire surface of the luminance system image data.
- On the other hand, B-1 shows that the information acquisition unit 115 sends control information indicating that the point image restoration processing is not to be performed to the point image restoration processing control unit 120, and the point image restoration processing control unit 120 prohibits the point image restoration processing execution unit 110 from performing the point image restoration processing on the luminance system image data.
- the point image restoration processing control unit 120 switches the control of the point image restoration processing execution unit 110 according to the control information like A-1 and B-1, as described above.
- Further, the point image restoration processing control unit 120 may prohibit the point image restoration processing execution unit 110 from performing the point image restoration processing on the portions where the false signal due to chromatic aberration is enhanced (false signal enhancement regions) (reference numerals 60, 62, 64, and 66 in B-2 of FIG. 12) while causing it to perform the point image restoration processing on the other portions (areas other than the false signal enhancement regions) (see reference numeral 68 in B-2 of FIG. 12).
- the point image restoration processing control unit 120 switches the control of the point image restoration processing execution unit 110 according to the control information like A-2 and B-2 as described above.
- Further, the point image restoration processing control unit 120 may cause the point image restoration processing execution unit 110 to perform weak point image restoration processing on the portions where the false signal due to chromatic aberration is emphasized (false signal enhancement regions) (reference numerals 70, 72, 74, and 76 in B-3 of FIG. 12), and to perform point image restoration processing stronger than the weak point image restoration processing on the other portions (areas other than the false signal enhancement regions) (reference numeral 78 in B-3 of FIG. 12).
- the point image restoration processing control unit 120 switches control of the point image restoration processing execution unit 110 according to control information such as A-3 and B-3.
- Further, the point image restoration processing control unit 120 may cause the point image restoration processing execution unit 110 to change the strength of the point image restoration processing according to the degree of enhancement of the false signal due to chromatic aberration (B-4 in FIG. 12).
- the point image restoration processing control unit 120 switches the control of the point image restoration processing execution unit 110 according to the control information as in A-4 and B-4 as described above.
- the control example of the point image restoration processing control unit 120 is not limited to the above-described control example, and the point image restoration processing can be performed in consideration of the degree of generation of a false signal due to chromatic aberration.
- In the above-described point image restoration processing control examples, the false signal enhancement region may be a region of high image height in the image, and the degree of false signal enhancement may increase with the image height.
- the point image restoration processing control unit 120 can adjust and control the strength of the point image restoration processing performed by the point image restoration processing execution unit 110.
- The strength of the point image restoration processing can be adjusted, for example, by adjusting the coefficients of the restoration filter.
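One plausible realization of coefficient-based strength adjustment is to blend the restoration kernel with the identity (pass-through) kernel. This is a sketch of the idea under that assumption, not the patent's specific method:

```python
import numpy as np

def scale_restoration_kernel(kernel, strength):
    """Blend a restoration kernel with the identity kernel.

    strength=1.0 -> full restoration; strength=0.0 -> pass-through.
    Intermediate values weaken the restoration smoothly, which is one way
    to 'adjust the coefficients of the restoration filter'.
    """
    kernel = np.asarray(kernel, dtype=float)
    identity = np.zeros_like(kernel)
    identity[tuple(s // 2 for s in kernel.shape)] = 1.0  # center tap = 1
    return (1.0 - strength) * identity + strength * kernel
```

For example, blending a 3×3 sharpening kernel with center tap 5 at strength 0.5 yields a center tap of 3, i.e. a visibly weaker restoration, while strength 0 returns the identity and leaves the image unchanged.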
- FIG. 13 shows an image diagram of a plurality of point spread functions (PSF) from A to F. As described above, a restoration filter is generated based on these PSFs.
- the spread of the PSF increases gradually in the order of A, B, C, D, E, and F. The point image restoration processing using a restoration filter generated from the PSF of A is therefore weaker than the point image restoration processing using a restoration filter generated from a PSF having a larger spread than that of A (for example, the PSF of B).
- For example, while the point image restoration processing execution unit 110 normally performs the point image restoration processing using a restoration filter corresponding to the PSF of the lens actually used (for example, the PSF indicated by F in FIG. 13), a restoration filter corresponding to the PSF shown in A of FIG. 13 may be used for the portions where the false signal due to chromatic aberration is emphasized (false signal enhancement regions), so that the point image restoration processing execution unit 110 performs weak point image restoration processing there.
- FIG. 14 shows a second embodiment of the image processing device 28.
- the image analysis unit 111 is included in the demosaic processing unit 100.
- the image analysis unit 111 performs image analysis on the mosaic image data before the demosaic processing unit 100 performs the demosaic processing, or on the demosaic image data after the demosaic processing.
- In FIG. 14, the image analysis unit 111 is installed in the demosaic processing unit 100 and performs image analysis on the mosaic image data or the demosaic image data, but the present invention is not limited to this.
- the image analysis can be performed with various data as long as the generation and generation degree of the false signal due to chromatic aberration can be analyzed.
- In the image analysis, as described with reference to FIGS. 7 to 10 as an example, it is specified whether the data is mosaic image data in which a false signal due to chromatic aberration would be emphasized if the point image restoration processing were performed, or demosaic image data in which a false signal due to chromatic aberration would be emphasized if the point image restoration processing were performed.
- For example, the image analysis unit 111 obtains the magnitude of the contrast in the mosaic image data or the demosaic image data, and analyzes whether the data is mosaic image data in which a false signal would be emphasized if the point image restoration processing were performed, or demosaic image data in which a false signal would be emphasized if the point image restoration processing were performed.
- the image analysis unit 111 identifies an image having a high contrast portion as an image in which a false signal due to chromatic aberration is emphasized. Furthermore, an image having an edge portion with a high contrast is also identified as an image in which false signals due to chromatic aberration and demosaic processing are enhanced.
- The high-contrast portion can be represented by a contrast ratio: for example, in 8-bit data, a portion having a contrast ratio in the range of 130:1 to 170:1, preferably 140:1 to 160:1, and more preferably 145:1 to 155:1.
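A minimal sketch of the contrast-ratio test described above. The 130:1 to 170:1 window is taken directly from the text; the helper names and the max:min definition of contrast ratio are illustrative assumptions:

```python
def contrast_ratio(block):
    """Max:min contrast ratio of a local block of 8-bit pixel values."""
    lo, hi = min(block), max(block)
    return hi / max(lo, 1)  # guard against division by zero

def is_false_signal_prone(block, low=130, high=170):
    """Flag a block whose contrast ratio falls in the text's window."""
    return low <= contrast_ratio(block) <= high
```

An analysis unit along the lines of image analysis unit 111 could slide such a test over the mosaic or demosaic data to locate high-contrast (false-signal-prone) portions.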
- the image analysis unit 111 analyzes the mosaic image data or the demosaic image data, identifies the portions where the false signal due to chromatic aberration and the demosaic processing is emphasized and/or the degree of occurrence of the false signal due to chromatic aberration, and transmits the result as analysis information to the information acquisition unit 115.
- the information acquisition unit 115 generates control information based on the analysis information and the shooting information. Specifically, it specifies whether or not the false signal due to chromatic aberration is emphasized by referring to the acquired shooting information and the table information regarding false signals. If the false signal due to chromatic aberration is not emphasized according to the shooting information, the information acquisition unit 115 further considers the analysis information and determines whether the false signal due to chromatic aberration would be enhanced if the point image restoration processing were performed. If it is determined from the shooting information and the analysis information that the false signal due to chromatic aberration is not enhanced, control information for performing the point image restoration processing is generated and transmitted (the case of A-1 in FIG. 12); conversely, if it is determined that the false signal due to chromatic aberration is enhanced, control information for not performing the point image restoration processing is generated and transmitted (the case of B-1 in FIG. 12).
- if it is assumed from the photographing information that a false signal due to chromatic aberration would be emphasized, the information acquisition unit 115 may generate control information for not performing the point image restoration processing without considering the analysis information. Alternatively, even in that case, information on the area where the false signal due to chromatic aberration occurs may be included in the control information in consideration of the analysis information.
- since the information acquisition unit 115 acquires the analysis information, it can also separate more accurately the area (portion) where the point image restoration processing is performed from the area (portion) where it is not performed. That is, when a false signal due to chromatic aberration is emphasized, the information acquisition unit 115 identifies from the analysis information in which portion the false signal is emphasized, and generates control information in consideration of that portion. For example, it can perform control such that the point image restoration processing is not performed on the portion where the false signal due to chromatic aberration is emphasized while it is performed on the other portions (B-2 in FIG. 12), control such that weak point image restoration processing is performed on the portion where the false signal is emphasized while normal or strong point image restoration processing is performed on the other portions (B-3 in FIG. 12), or control that changes the strength of the point image restoration processing according to the degree of emphasis of the false signal due to chromatic aberration (B-4 in FIG. 12).
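- The region-dependent control of cases B-2 to B-4 can be sketched (an illustrative assumption, not the patent's implementation) as a per-pixel blend between the unrestored and restored images, driven by a strength map derived from the analysis information:

```python
import numpy as np

def apply_restoration_with_strength(original, restored, strength):
    """Blend a point-image-restored image with the original per region.

    strength is a float array in [0, 1] with the same spatial shape:
      0      -> no restoration where a false signal would be emphasized (B-2),
      ~0.5   -> 'weak' restoration for such portions (B-3),
      graded -> strength varied by the degree of emphasis (B-4).
    """
    s = np.clip(np.asarray(strength, dtype=np.float64), 0.0, 1.0)
    if original.ndim == 3:      # broadcast the map over color channels
        s = s[..., None]
    return original + s * (restored - original)
```

Here `restored` is assumed to be the output of an ordinary full-strength point image restoration; the blend merely decides, region by region, how much of it to keep.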
- FIG. 15 is a flowchart of the second embodiment of the image processing device 28. Portions identical to those in the flowchart of the first embodiment of the image processing device 28 are denoted by the same reference numerals. The flowchart of the second embodiment differs from that of the first embodiment in that the image analysis unit 111 performs image analysis (step S31) and the information acquisition unit 115 also acquires the analysis information (step S33).
- the image analysis unit 111 performs image analysis on the mosaic image data (step S31). Note that the image analysis unit 111 may perform image analysis on demosaic image data or may perform image analysis on luminance system image data.
- the image analysis unit 111 performs image analysis (step S31), generates image analysis information based on the result of the image analysis, and sends the image analysis information to the information acquisition unit 115. Then, image analysis information is acquired by the information acquisition unit 115 (step S33).
- FIG. 16 shows a third embodiment of the image processing device 28.
- the luminance signal Y is a luminance signal Y in a color space represented by the luminance signal Y and the color difference signals Cb and Cr.
- FIG. 17 shows a fourth embodiment of the image processing device 28.
- point image restoration processing is performed on the G color data, which is a specific example of the luminance system image data. Since the G color data contributes most when generating the value of the luminance signal Y (see Equations 1 and 2), performing the point image restoration processing on the G color data enables more accurate point image restoration processing.
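- Equations 1 and 2 referred to above are not reproduced in this excerpt; as a stand-in, the familiar BT.601 weighting (an assumption here, since the patent's own coefficients may differ) illustrates why the G component dominates the luminance signal Y:

```python
# BT.601 luminance weighting, shown only to illustrate that the G component
# carries the largest coefficient; the patent's Equations 1 and 2 may differ.
def luminance(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b
```

Because the G coefficient (0.587) exceeds the R and B coefficients combined, restoring the G plane improves Y more than restoring either of the other planes.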
- FIG. 18 is a diagram illustrating a modification of the image sensor 22.
- as the color filter array arranged on the light receiving surface of the image sensor 22, a color filter array ("X-Trans" (registered trademark)) is shown as a modified example of the color filter array described in FIG. 2. As shown in FIG. 18, various color filter arrangements can be employed for the image sensor 22.
- the color filter array of the image sensor 22 includes a basic array pattern P (the pattern indicated by a thick frame) composed of a square array pattern corresponding to 6 × 6 pixels, and the basic array pattern P is repeatedly arranged in the horizontal and vertical directions. That is, in this color filter array, filters of the respective colors red (R), green (G), and blue (B) (R filter, G filter, B filter) are arranged with a predetermined periodicity. Since the R, G, and B filters are arranged with a predetermined periodicity in this way, when performing image processing or the like on the RGB RAW data (mosaic image data) read from the image sensor 22, the processing can follow a repetitive pattern, unlike with a conventionally known random arrangement.
- the G filter corresponding to the color that contributes most to obtaining the luminance signal (the G color) is arranged, one or more per line, in each line of the color filter array in the horizontal, vertical, diagonally upper right (NE), and diagonally upper left (NW) directions within the basic array pattern.
- since the G filter corresponding to the luminance system pixel is arranged in each line in the horizontal, vertical, and diagonal (NE, NW) directions of the color filter array, the reproduction accuracy of the synchronization processing (demosaic processing) in the high-frequency region can be improved regardless of the direction of the high frequency.
- the color filter array shown in FIG. 18 includes the R filter and the B filter, corresponding to two or more colors other than the G color (in this embodiment, the R and B colors), with one or more of each arranged in every horizontal and vertical line of the color filter array.
- in the basic array pattern P of the color filter array shown in FIG. 18, the numbers of pixels corresponding to the R, G, and B filters are 8, 20, and 8, respectively. That is, the ratio of the numbers of R, G, and B pixels is 2:5:2, and the proportion of G pixels, which contribute most to obtaining the luminance signal, is larger than the proportions of the R and B pixels of the other colors.
- since the ratio of the number of G pixels to the numbers of R and B pixels differs in this way, and in particular the proportion of G pixels, which contribute most to obtaining the luminance signal, is larger than the proportions of R and B pixels, aliasing during the synchronization processing can be suppressed and high-frequency reproducibility can be improved.
- FIG. 19 shows a state in which the basic array pattern P shown in FIG. 18 is divided into four sets of 3 × 3 pixels.
- the basic array pattern P can also be understood as an arrangement in which an A array of 3 × 3 pixels surrounded by a solid frame and a B array of 3 × 3 pixels surrounded by a broken frame are arranged alternately in the horizontal and vertical directions.
- in both the A array and the B array, the G filters are arranged at the four corners and the center, that is, on both diagonal lines.
- in the A array, the R filters are arranged in the horizontal direction with the central G filter interposed between them, and the B filters are arranged in the vertical direction; in the B array, the B filters are arranged in the horizontal direction with the central G filter interposed between them, and the R filters are arranged in the vertical direction. That is, the A array and the B array have the positional relationship of the R filter and the B filter reversed, but are otherwise identical.
- because the A array and the B array alternate in the horizontal and vertical directions, the G filters at the four corners of the A and B arrays form a square array of G filters corresponding to 2 × 2 pixels.
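- The A/B structure described above can be checked mechanically. The sketch below (the absolute orientation of R and B within the A array is an assumption; only their swap between A and B is given by the description) builds the 6 × 6 pattern P and confirms the 8/20/8 pixel counts and the 2 × 2 G cluster at the block junction:

```python
import numpy as np

# A array: G at the four corners and center, R flanking the center G
# horizontally, B flanking it vertically; the B array swaps R and B.
A = np.array([["G", "B", "G"],
              ["R", "G", "R"],
              ["G", "B", "G"]])
B = np.array([["G", "R", "G"],
              ["B", "G", "B"],
              ["G", "R", "G"]])

# 6x6 basic array pattern P: A and B alternate horizontally and vertically.
P = np.block([[A, B], [B, A]])

counts = {c: int((P == c).sum()) for c in "RGB"}   # expect R:8, G:20, B:8
g_cluster = (P[2:4, 2:4] == "G").all()             # 2x2 G square at the junction
```

The counts reproduce the 2:5:2 ratio stated above, and the all-G 2 × 2 block shows the square G arrangement formed where the corner G filters of adjacent A and B arrays meet.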
- the object of the present invention is also achieved by a computer (or a CPU or MPU (Micro-Processing Unit)) reading program code (a program) that realizes the flow procedure shown in the above-described embodiments from a storage medium (non-transitory recording medium) storing the program code, and executing it.
- the present invention can also be provided as a computer program product that stores executable code for using the method according to the present invention.
- the program code itself read from the storage medium realizes the functions of the above-described embodiment. Therefore, the program code and a computer-readable storage medium storing / recording the program code also constitute one aspect of the present invention.
- as the storage medium for supplying the program code, for example, a floppy (registered trademark) disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM (Compact Disc Read Only Memory), a CD-R (Compact Disc Recordable), a magnetic tape, a nonvolatile memory card, a ROM (Read Only Memory), or the like can be used.
- the functions of the above-described embodiments are realized by the computer executing the read program.
- the execution of the program includes a case where an OS (Operating System) running on the computer performs part or all of the actual processing based on an instruction of the program.
- the functions of the above-described embodiments can also be realized by a function expansion board inserted into a computer or a function expansion unit connected to a computer.
- the program read from the storage medium is written in a memory provided in a function expansion board inserted into the computer or a function expansion unit connected to the computer.
- the CPU provided in the function expansion board or function expansion unit performs part or all of the actual processing.
- the functions of the above-described embodiment are also realized by processing by such a function expansion board or function expansion unit.
- each step of the flow of the above-described embodiment is not limited to that realized using software (computer), and may be realized using hardware (electronic circuit).
- the point image restoration processing performed by the point image restoration processing execution unit 110 in the above-described embodiments restores point spread (point image blur) according to specific photographing conditions (for example, aperture value, focal length, lens type, and the like).
- the image processing to which the present invention can be applied is not limited to the point image restoration processing in the above-described embodiment.
- the present invention can also be applied to point image restoration processing for image data captured and acquired by an optical system (such as a photographing lens) having an extended depth of field (focal depth) (EDoF: Extended Depth of Field (Focus)).
- in that case, point image restoration processing is performed using a restoration filter whose filter coefficients are set, based on the point spread function of the EDoF optical system (PSF, OTF (optical transfer function), MTF (modulation transfer function), PTF (phase transfer function), and the like), so as to enable good image restoration within the range of the extended depth of field (depth of focus).
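- The patent does not fix a formula for these filter coefficients; a common frequency-domain design consistent with the description (used here purely as an illustrative assumption) is a Wiener-type filter built from the OTF:

```python
import numpy as np

def wiener_restoration_filter(psf, shape, snr=100.0):
    """Frequency response of a Wiener-type restoration filter.

    psf   : 2-D point spread function of the (EDoF) optical system.
    shape : image shape; the PSF is zero-padded to it before the FFT.
    snr   : assumed signal-to-noise ratio; the 1/snr**2 term regularizes
            the inverse so deep nulls of the OTF are not amplified.
    """
    otf = np.fft.fft2(psf, s=shape)                       # H(f)
    return np.conj(otf) / (np.abs(otf) ** 2 + 1.0 / snr ** 2)
```

A blurred image would then be restored as `np.real(np.fft.ifft2(np.fft.fft2(blurred) * W))`, where `W` is the filter returned above; a spatial-domain kernel (such as the 7 × 7 deconvolution kernel discussed later) can be obtained by inverse-transforming and truncating `W`.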
- the point image restoration process is performed on the luminance signal (luminance signal Y) obtained from the image data (RGB data) after the demosaic process.
- FIG. 20 is a block diagram illustrating an embodiment of an imaging module 201 including an EDoF optical system.
- the imaging module (digital camera or the like) 201 of this example includes an EDoF optical system (lens unit) 210, an imaging element 212, an AD conversion unit 214, and a point image restoration processing block (image processing unit) 220.
- FIG. 21 is a diagram illustrating an example of the EDoF optical system 210.
- the EDoF optical system 210 of this example includes a photographic lens 210A having a single focal point and an optical filter 211 arranged at the pupil position.
- the optical filter 211 modulates the phase to give the EDoF optical system 210 (the photographing lens 210A) an extended depth of field (depth of focus) (EDoF).
- the photographing lens 210A and the optical filter 211 constitute a lens unit that modulates the phase and expands the depth of field.
- the EDoF optical system 210 includes other components as necessary.
- a diaphragm (not shown) is provided in the vicinity of the optical filter 211.
- the optical filter 211 may be one sheet or a combination of a plurality of sheets.
- the optical filter 211 is merely an example of an optical phase modulation unit, and the EDoF conversion of the EDoF optical system 210 (the photographing lens 210A) may be realized by other units.
- for example, the EDoF optical system 210 may be given EDoF by a photographing lens 210A designed to have the same function as the optical filter 211 of this example.
- EDoF conversion of the EDoF optical system 210 can be realized by various means for changing the wavefront of the image formed on the light receiving surface of the image sensor 212.
- specifically, an optical element whose thickness changes, an optical element whose refractive index changes (a refractive-index-distributed wavefront modulation lens or the like), an optical element whose thickness and refractive index change due to coding on the lens surface (a wavefront modulation hybrid lens, an optical element formed as a phase plane on the lens surface, or the like), or a liquid crystal element capable of modulating the phase distribution of light (a liquid crystal spatial phase modulation element or the like) can be adopted as the EDoF conversion means.
- the present invention can also be applied to a case where the wavefront can be changed by the photographing lens 210A itself without using a modulation element.
- the EDoF optical system 210 shown in FIG. 21 can be reduced in size because it can omit the focus adjustment mechanism that mechanically adjusts the focus, and can be suitably mounted on a mobile phone with a camera or a portable information terminal.
- the optical image after passing through the EDoF-converted EDoF optical system 210 is formed on the image sensor 212 shown in FIG. 20, and is converted into an electrical signal here.
- the image sensor 212 is composed of a plurality of pixels arranged in a matrix with a predetermined pattern arrangement (Bayer arrangement, G-stripe R/G full-checkered arrangement, X-Trans arrangement, honeycomb arrangement, or the like); each pixel includes a color filter (an RGB color filter in this example) and a photodiode.
- the optical image incident on the light receiving surface of the image sensor 212 via the EDoF optical system 210 is converted into a signal charge corresponding to the amount of incident light by each photodiode arranged on the light receiving surface.
- the R, G, and B signal charges accumulated in each photodiode are sequentially output as a voltage signal (image signal) for each pixel.
- the AD converter 214 converts an analog R / G / B image signal output from the image sensor 212 for each pixel into a digital RGB image signal.
- the digital image signal converted by the AD conversion unit 214 is applied to the point image restoration processing block 220.
- the point image restoration processing block 220 includes, for example, a black level adjustment unit 222, a white balance gain unit 223, a gamma processing unit 224, a demosaic processing unit 225, an RGB / YCrCb conversion unit 226, and a luminance signal Y point image restoration. And a processing unit 227.
- the black level adjustment unit 222 performs black level adjustment on the digital image signal output from the AD conversion unit 214.
- a known method can be adopted for the black level adjustment. For example, focusing on a certain effective photoelectric conversion element, the average of the dark current amount acquisition signals corresponding to each of a plurality of OB (optical black) photoelectric conversion elements included in the photoelectric conversion element row containing that effective photoelectric conversion element is obtained, and the black level is adjusted by subtracting this average from the signal corresponding to the effective photoelectric conversion element.
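- A minimal sketch of this per-row OB subtraction (the layout, with the OB pixels occupying the leading columns of each row, is an assumption made for the illustration):

```python
import numpy as np

def adjust_black_level(raw, ob_columns):
    """Subtract, for each row, the mean of that row's optical-black (OB)
    pixels from all pixels in the row, per the per-row scheme above.

    raw        : 2-D array of raw sensor values (one photoelectric
                 conversion element row per array row).
    ob_columns : number of leading columns that are OB (light-shielded)
                 pixels used to estimate the dark current amount.
    """
    ob_mean = raw[:, :ob_columns].mean(axis=1, keepdims=True)
    return raw - ob_mean
```

After this subtraction the OB columns average to zero, so the black level of the effective pixels is referenced to zero as described in the next step of the flow.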
- the white balance gain unit 223 performs gain adjustment according to the white balance gain of each RGB color signal included in the digital image signal in which the black level data is adjusted.
- the gamma processing unit 224 performs gamma correction that performs gradation correction such as halftone so that the R, G, and B image signals that have undergone white balance adjustment have desired gamma characteristics.
- the demosaic processing unit 225 performs demosaic processing on the R, G, and B image signals after the gamma correction. Specifically, the demosaic processing unit 225 performs color interpolation processing on the R, G, and B image signals to generate a set of image signals (R signal, G signal, B signal) for each light receiving pixel of the image sensor 212. That is, before the demosaic processing, the pixel signal from each light receiving pixel is one of the R, G, and B image signals, but after the demosaic processing, a set of three pixel signals (R, G, and B signals) corresponding to each light receiving pixel is output.
- the RGB / YCrCb conversion unit 226 converts the demosaic R, G, and B signals for each pixel into a luminance signal Y and color difference signals Cr and Cb, and outputs the luminance signal Y and the color difference signals Cr and Cb for each pixel. To do.
- the luminance signal Y point image restoration processing unit 227 performs point image restoration processing on the luminance signal Y from the RGB / YCrCb conversion unit 226 based on a restoration filter stored in advance.
- a restoration filter corresponding to the optical filter 211 is stored in a memory (not shown) (for example, a memory provided in the luminance signal Y point image restoration processing unit 227). The kernel size of the deconvolution kernel is not limited to 7 × 7.
- FIG. 22 is a flowchart showing an example of the point image restoration process in the point image restoration processing block 220 shown in FIG.
- the digital image signal from the AD conversion unit 214 is applied to one input of the black level adjustment unit 222, and the black level data is applied to the other input; the black level adjustment unit 222 subtracts the black level data from the digital image signal and outputs the digital image signal from which the black level data has been subtracted to the white balance gain unit 223 (step S1). As a result, the digital image signal no longer contains a black level component, and the digital image signal indicating the black level becomes zero.
- the image data after black level adjustment is sequentially processed by the white balance gain unit 223 and gamma processing unit 224 (steps S2 and S3).
- the R, G, and B signals subjected to gamma correction are demosaiced by the demosaic processing unit 225, and then converted to the luminance signal Y and the chroma signals Cr and Cb by the RGB / YCrCb conversion unit 226 (step S4).
- the luminance signal Y point image restoration processing unit 227 performs point image restoration processing in which the luminance signal Y is subjected to deconvolution processing corresponding to the phase modulation of the optical filter 211 of the EDoF optical system 210 (step S5). That is, the luminance signal Y point image restoration processing unit 227 performs deconvolution processing (deconvolution calculation processing) between the luminance signal corresponding to a predetermined unit pixel group centered on an arbitrary pixel to be processed (here, a luminance signal of 7 × 7 pixels) and a restoration filter stored in advance in a memory or the like (a 7 × 7 deconvolution kernel and its calculation coefficients).
- the luminance signal Y point image restoration processing unit 227 performs point image restoration processing to remove the image blur of the entire image by repeating the deconvolution processing for each pixel group of a predetermined unit so as to cover the entire area of the imaging surface.
- the restoration filter is determined according to the position of the center of the pixel group subjected to the deconvolution processing; a common restoration filter can be applied to adjacent pixel groups. Furthermore, in order to simplify the point image restoration processing, it is preferable to apply a common restoration filter to all the pixel groups.
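- Applying one common restoration filter to every pixel group amounts to sliding a single kernel over the luminance plane. A sketch (reflection padding at the borders is an assumption; the patent does not specify the edge handling, and the sliding-window filtering below is written as correlation, which coincides with convolution for the symmetric kernels typical of restoration filters):

```python
import numpy as np

def restore_luminance(y, kernel):
    """Apply a common restoration kernel (e.g. 7x7) to the luminance
    plane by direct sliding-window filtering, covering the whole image;
    border pixels are handled by reflection padding.
    """
    kh, kw = kernel.shape
    py, px = kh // 2, kw // 2
    padded = np.pad(y, ((py, py), (px, px)), mode="reflect")
    out = np.zeros(y.shape, dtype=np.float64)
    for i in range(kh):
        for j in range(kw):
            # accumulate each kernel tap over a shifted view of the image
            out += kernel[i, j] * padded[i:i + y.shape[0], j:j + y.shape[1]]
    return out
```

Repeating the 7 × 7 deconvolution for every pixel group, as the text describes, is exactly this operation applied over the entire imaging surface.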
- the point image (optical image) of the luminance signal after passing through the EDoF optical system 210 is formed on the image sensor 212 as a large point image (blurred image), but is restored to a small point image (high-resolution image) by the deconvolution processing in the luminance signal Y point image restoration processing unit 227.
- as described above, by applying the point image restoration processing to the luminance signal after the demosaic processing, it is not necessary to provide separate point image restoration processing parameters for R, G, and B, and the point image restoration processing can be speeded up. Furthermore, rather than grouping the R, G, and B image signals corresponding to R, G, and B pixels at scattered positions into one unit for deconvolution, the luminance signals of adjacent pixels are grouped into a predetermined unit and a common restoration filter is applied to that unit for the deconvolution processing, so the accuracy of the point image restoration processing is improved. As for the color difference signals Cr and Cb, owing to the visual characteristics of the human eye, image quality remains acceptable even if their resolution is not increased by point image restoration processing.
- the point image restoration processing according to each embodiment of the present invention can also be applied to the point image restoration processing of the EDoF system as described above.
- as an imaging apparatus including the image processing device 28 of the present invention, a digital camera has been described in FIG. 1, but the configuration of the imaging apparatus is not limited to this. As another imaging apparatus of the present invention, for example, a built-in or external PC camera, or a portable terminal device having a photographing function as described below, can be used.
- Examples of the portable terminal device that is an embodiment of the photographing apparatus of the present invention include a mobile phone, a smartphone, a PDA (Personal Digital Assistants), and a portable game machine.
- FIG. 23 shows an appearance of a smartphone 301 which is an embodiment of the photographing apparatus of the present invention.
- the smartphone 301 illustrated in FIG. 23 includes a flat housing 302, and a display input unit 320, in which a display panel 321 as a display unit and an operation panel 322 as an input unit are integrated, is provided on one surface of the housing 302.
- the housing 302 includes a speaker 331, a microphone 332, an operation unit 340, and a camera unit 341. Note that the configuration of the housing 302 is not limited to this, and, for example, a configuration in which the display unit and the input unit are independent, or a configuration having a folding structure or a slide mechanism may be employed.
- FIG. 24 is a block diagram showing a configuration of the smartphone 301 shown in FIG.
- the main components of the smartphone include a wireless communication unit 310, a display input unit 320, a call unit 330, an operation unit 340, a camera unit 341, a storage unit 350, and an external input / output unit. 360, a GPS (Global Positioning System) receiving unit 370, a motion sensor unit 380, a power supply unit 390, and a main control unit 400.
- a wireless communication function for performing mobile wireless communication via the base station device BS and the mobile communication network NW is provided as a main function of the smartphone 301.
- the image processing device 28 described above mainly takes a form belonging to the main control unit 400, but is not limited thereto.
- the wireless communication unit 310 performs wireless communication with the base station apparatus BS accommodated in the mobile communication network NW according to an instruction from the main control unit 400. Using such wireless communication, transmission / reception of various file data such as audio data and image data, e-mail data, and reception of Web data, streaming data, and the like are performed.
- under the control of the main control unit 400, the display input unit 320 displays images (still images and moving images), character information, and the like to visually convey information to the user, and detects user operations on the displayed information; it is a so-called touch panel, comprising the display panel 321 and the operation panel 322.
- the display panel 321 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a display device.
- the operation panel 322 is a device that is placed so that an image displayed on the display surface of the display panel 321 is visible, and detects one or more coordinates operated by a user's finger or a stylus. When the device is operated by the user's finger or a stylus, a detection signal generated in response to the operation is output to the main control unit 400.
- the main control unit 400 detects an operation position (coordinates) on the display panel 321 based on the received detection signal.
- the display panel 321 and the operation panel 322 of the smartphone 301, illustrated as an embodiment of the photographing apparatus of the present invention, integrally constitute the display input unit 320, with the operation panel 322 arranged so as to completely cover the display panel 321.
- the operation panel 322 may have a function of detecting a user operation even in an area outside the display panel 321.
- in other words, the operation panel 322 may include a detection area for the overlapping portion that overlaps the display panel 321 (hereinafter referred to as the display area) and a detection area for the outer edge portion that does not overlap the display panel 321 (hereinafter referred to as the non-display area).
- note that the operation panel 322 may include two sensitive regions, the outer edge portion and the inner portion, and the width of the outer edge portion is designed appropriately according to the size of the housing 302 and the like. Furthermore, as the position detection method employed in the operation panel 322, a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, a capacitance method, or the like can be adopted.
- the call unit 330 includes the speaker 331 and the microphone 332; it converts the user's voice input through the microphone 332 into voice data that can be processed by the main control unit 400 and outputs the voice data to the main control unit 400, and it decodes voice data received by the wireless communication unit 310 or the external input/output unit 360 and outputs it from the speaker 331.
- the speaker 331 can be mounted on the same surface as the display input unit 320 and the microphone 332 can be mounted on the side surface of the housing 302.
- the operation unit 340 is a hardware key using a key switch or the like, and receives an instruction from the user.
- for example, the operation unit 340 is mounted on the side surface of the housing 302 of the smartphone 301, and is a push-button switch that is turned on when pressed with a finger or the like and turned off by the restoring force of a spring or the like when the finger is released.
- the storage unit 350 includes a control program and control data of the main control unit 400, application software, address data that associates the name and telephone number of a communication partner, transmitted / received e-mail data, Web data downloaded by Web browsing, The downloaded content data is stored, and streaming data and the like are temporarily stored.
- the storage unit 350 includes an internal storage unit 351 built in the smartphone and an external storage unit 352 having a removable external memory slot.
- each of the internal storage unit 351 and the external storage unit 352 constituting the storage unit 350 is realized using a storage medium such as a flash memory type, hard disk type, or multimedia card micro type medium, a card type memory (for example, MicroSD (registered trademark) memory), a RAM (Random Access Memory), or a ROM (Read Only Memory).
- the external input/output unit 360 serves as an interface with all external devices connected to the smartphone 301, and connects directly or indirectly to other external devices by communication (for example, universal serial bus (USB), IEEE 1394, or the like) or via a network (for example, the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), Infrared Data Association (IrDA) (registered trademark), UWB (Ultra Wideband) (registered trademark), ZigBee (registered trademark), or the like).
- examples of external devices connected to the smartphone 301 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data port, a memory card or SIM (Subscriber Identity Module) / UIM (User Identity Module) card connected via a card socket, external audio/video equipment connected via an audio/video I/O (Input/Output) terminal, and external audio/video equipment connected wirelessly.
- the external input/output unit can transmit data received from such external devices to each component inside the smartphone 301, and can transmit data inside the smartphone 301 to the external devices.
- in accordance with instructions from the main control unit 400, the GPS receiving unit 370 receives GPS signals transmitted from GPS satellites ST1 to STn, executes positioning calculation processing based on the plurality of received GPS signals, and detects the position of the smartphone 301 in terms of latitude, longitude, and altitude. When the GPS receiving unit 370 can acquire position information from the wireless communication unit 310 or the external input/output unit 360 (for example, a wireless LAN), it can also detect the position using that position information.
- the motion sensor unit 380 includes, for example, a three-axis acceleration sensor, and detects the physical movement of the smartphone 301 in accordance with instructions from the main control unit 400. By detecting the physical movement of the smartphone 301, its moving direction and acceleration are detected, and the detection result is output to the main control unit 400.
- the power supply unit 390 supplies power stored in a battery (not shown) to each unit of the smartphone 301 in accordance with an instruction from the main control unit 400.
- the main control unit 400 includes a microprocessor, operates according to a control program and control data stored in the storage unit 350, and controls each unit of the smartphone 301 in an integrated manner.
- the main control unit 400 includes a mobile communication control function for controlling each unit of the communication system and an application processing function in order to perform voice communication and data communication through the wireless communication unit 310.
- the application processing function is realized by the main control unit 400 operating according to application software stored in the storage unit 350.
- Application processing functions include, for example, an infrared communication function that controls the external input/output unit 360 to perform data communication with a counterpart device, an e-mail function for sending and receiving e-mails, and a web browsing function for viewing web pages.
- the main control unit 400 also has an image processing function such as displaying video on the display input unit 320 based on image data (still image data or moving image data) such as received data or downloaded streaming data.
- the image processing function refers to a function in which the main control unit 400 decodes the image data, performs image processing on the decoding result, and displays an image on the display input unit 320.
- the main control unit 400 executes display control for the display panel 321 and operation detection control for detecting a user operation through the operation unit 340 and the operation panel 322.
- By executing the display control, the main control unit 400 displays icons for starting application software, software keys such as scroll bars, and windows for creating e-mails.
- the scroll bar refers to a software key for accepting an instruction to move the display portion of a large image that does not fit in the display area of the display panel 321.
- By executing the operation detection control, the main control unit 400 detects a user operation through the operation unit 340, accepts an operation on the icons or an input of a character string in the input field of the window through the operation panel 322, and accepts a request to scroll the display image through the scroll bar.
- The main control unit 400 also has a touch panel control function that determines whether an operation position on the operation panel 322 overlaps the display panel 321 (display area) or corresponds to the outer edge portion that does not overlap the display panel 321 (non-display area), and that controls the sensitive area of the operation panel 322 and the display position of the software keys.
- the main control unit 400 can also detect a gesture operation on the operation panel 322 and execute a preset function in accordance with the detected gesture operation.
- A gesture operation is not a conventional simple touch operation, but an operation of drawing a trajectory with a finger or the like, designating a plurality of positions at the same time, or a combination of these, such as drawing a trajectory from at least one of a plurality of positions.
- the camera unit 341 is a digital camera that performs electronic photography using an imaging element such as a CMOS (Complementary Metal Oxide Semiconductor) or a CCD (Charge-Coupled Device).
- The camera unit 341 converts image data obtained by imaging into compressed image data such as JPEG (Joint Photographic Experts Group) under the control of the main control unit 400, and records the data in the storage unit 350 or outputs it through the external input/output unit 360 or the wireless communication unit 310.
- the camera unit 341 is mounted on the same surface as the display input unit 320.
- The mounting position of the camera unit 341 is not limited to this; the camera unit 341 may be mounted on the back surface of the display input unit 320, or a plurality of camera units 341 may be mounted. When a plurality of camera units 341 are mounted, the camera unit 341 used for shooting can be switched so that one unit shoots alone, or a plurality of camera units 341 can be used for shooting simultaneously.
- the camera unit 341 can be used for various functions of the smartphone 301.
- an image acquired by the camera unit 341 can be displayed on the display panel 321, or the image of the camera unit 341 can be used as one of operation inputs of the operation panel 322.
- When the GPS receiving unit 370 detects a position, the position can also be detected with reference to an image from the camera unit 341.
- By referring to an image from the camera unit 341, the optical axis direction of the camera unit 341 of the smartphone 301 can be determined without using the triaxial acceleration sensor, or in combination with the triaxial acceleration sensor, and the current usage environment can also be determined.
- the image from the camera unit 341 can be used in the application software.
- Position information acquired by the GPS receiving unit 370, voice information acquired by the microphone 332 (which may be converted into text information by the main control unit or the like), posture information acquired by the motion sensor unit 380, and the like can be added to the image data of a still image or a moving image and recorded in the storage unit 350, or output through the external input/output unit 360 or the wireless communication unit 310.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
- Color Television Image Signal Generators (AREA)
Abstract
Description
FIG. 1 is a block diagram showing an embodiment of an imaging device 10 having an image processing device 28 (shown as an image processing unit in FIG. 1) according to the present invention.
FIG. 2 is a diagram showing the form of the imaging element 22, and in particular shows the color filter array arranged on the light-receiving surface of the imaging element 22. Each photodiode provided with a color filter is treated as a pixel, and the data output from each photodiode is treated as pixel data (a pixel signal).
FIG. 3 is a block diagram of principal parts showing the internal configuration of the first embodiment of the image processing device (image processing unit) 28 shown in FIG. 1.
[Formula 2] Y = 0.2R + 0.7G + 0.1B
The luminance-system image data acquired by the luminance-system image data acquisition unit 105 is sent to the point image restoration processing execution unit 110.
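As an illustrative sketch only (not part of the patent disclosure), luminance-system image data can be derived from demosaiced RGB data using the weights of Formula 2; the function name and array layout are assumptions for the example:

```python
import numpy as np

def luminance_from_rgb(rgb):
    """Derive luminance-system image data Y from demosaiced RGB data,
    using the weights of Formula 2 (Y = 0.2R + 0.7G + 0.1B).
    rgb: array whose last axis holds the R, G, B channels."""
    rgb = np.asarray(rgb, dtype=np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.2 * r + 0.7 * g + 0.1 * b

# A uniform gray patch keeps its value, since the weights sum to 1.
patch = np.full((2, 2, 3), 100.0)
print(luminance_from_rgb(patch))
```

Because G carries the largest weight (0.7), it dominates the luminance signal, which is the rationale given later for restoring the G color data in the fourth embodiment.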
g(x,y)=h(x,y)*f(x,y)
Here, * denotes convolution.
g(x,y)*R(x,y)=f(x,y)
This R(x, y) is called a restoration filter. As the restoration filter, a least-squares filter (Wiener filter) that minimizes the mean squared error between the original image and the restored image, a constrained deconvolution filter, a recursive filter, a homomorphic filter, or the like can be used. In the present invention, one or more restoration filters are stored in the point image restoration processing execution unit 110.
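As a rough sketch of one of the listed options (a Wiener-type filter, not the patent's specific implementation), the restoration filter R can be applied in the frequency domain; the `nsr` noise-to-signal ratio is an assumed regularization parameter:

```python
import numpy as np

def wiener_restore(g, psf, nsr=1e-3):
    """Apply a Wiener-type restoration filter R to the blurred image g,
    where g = h * f (convolution) and psf is the point spread function h,
    given at the same shape as g with its peak at index (0, 0).
    nsr is an assumed noise-to-signal ratio that regularizes the inverse."""
    G = np.fft.fft2(g)
    H = np.fft.fft2(psf)
    # Wiener filter: minimizes mean squared error between f and the restoration
    R = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(R * G))
```

Multiplying by R approximately inverts the blur, while the `nsr` term limits noise amplification at frequencies where H is small, which is why the Wiener filter is preferred over a plain inverse filter.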
FIG. 12 is a diagram showing examples of the control of the point image restoration processing execution unit 110 performed by the point image restoration processing control unit 120. A-1 to A-4 are examples of executing the point image restoration processing on the luminance-system image data when the control information indicates that the point image restoration processing is to be performed, while B-1 to B-4 are examples in which the control information indicates that the point image restoration processing is not to be performed. Reference numerals 50, 52, 54, 56, 68, and 78 denote portions where the point image restoration processing has been applied, and 58, 60, 62, 64, and 66 denote portions where the point image restoration processing is not performed. Reference numerals 70, 72, 74, and 76 denote portions (regions) where a point image restoration processing weaker than that executed in 50, 52, 54, 56, 68, and 78 has been applied. Reference numeral 80 shows a case where the strength of the point image restoration processing is changed stepwise; in 80, darker shading indicates that stronger point image restoration processing has been applied.
In A-1 of FIG. 12, control information indicating that the point image restoration processing is to be performed is sent from the information acquisition unit 115 to the point image restoration processing control unit 120, and the point image restoration processing control unit 120 causes the point image restoration processing execution unit 110 to perform the point image restoration processing on the entire surface of the luminance-system image data. In B-1 of FIG. 12, on the other hand, control information indicating that the point image restoration processing is not to be performed is sent from the information acquisition unit 115 to the point image restoration processing control unit 120, and the point image restoration processing control unit 120 prohibits the point image restoration processing by the point image restoration processing execution unit 110 on the luminance-system image data. In the first point image restoration processing control example, the point image restoration processing control unit 120 switches the control of the point image restoration processing execution unit 110 according to the control information, as in A-1 and B-1 described above.
In A-2 of FIG. 12, control information indicating that the point image restoration processing is to be performed is sent from the information acquisition unit 115 to the point image restoration processing control unit 120, and the point image restoration processing control unit 120 causes the point image restoration processing execution unit 110 to perform the point image restoration processing on the entire surface of the luminance-system image data. In B-2 of FIG. 12, on the other hand, control information indicating that the point image restoration processing is not to be performed in the portions where false signals due to chromatic aberration are emphasized (false signal emphasis regions) is sent from the information acquisition unit 115 to the point image restoration processing control unit 120. The point image restoration processing control unit 120 then prohibits the point image restoration processing execution unit 110 from performing the point image restoration processing in the false signal emphasis regions (see reference numerals 60, 62, 64, and 66 in B-2 of FIG. 12) and causes it to perform the processing in the other portions (regions other than the false signal emphasis regions; see reference numeral 68 in B-2 of FIG. 12). In the second point image restoration processing control example, the point image restoration processing control unit 120 switches the control of the point image restoration processing execution unit 110 according to the control information, as in A-2 and B-2 described above.
In A-3 of FIG. 12, control information indicating that the point image restoration processing is to be performed is sent from the information acquisition unit 115 to the point image restoration processing control unit 120, and the point image restoration processing control unit 120 causes the point image restoration processing execution unit 110 to perform the point image restoration processing on the entire surface of the luminance-system image data. In B-3 of FIG. 12, on the other hand, control information indicating that a weak point image restoration processing is to be performed in the portions where false signals due to chromatic aberration are emphasized (false signal emphasis regions) is sent from the information acquisition unit 115 to the point image restoration processing control unit 120. The point image restoration processing control unit 120 then causes the point image restoration processing execution unit 110 to perform the weak point image restoration processing in the false signal emphasis regions (see reference numerals 70, 72, 74, and 76 in B-3 of FIG. 12) and to perform a point image restoration processing stronger than the weak one in the other portions (regions other than the false signal emphasis regions; see reference numeral 78 in B-3 of FIG. 12). In the third point image restoration processing control example, the point image restoration processing control unit 120 switches the control of the point image restoration processing execution unit 110 according to the control information, as in A-3 and B-3.
In A-4 of FIG. 12, control information indicating that the point image restoration processing is to be performed is sent from the information acquisition unit 115 to the point image restoration processing control unit 120, and the point image restoration processing control unit 120 causes the point image restoration processing execution unit 110 to perform the point image restoration processing on the entire surface of the luminance-system image data. In B-4 of FIG. 12, on the other hand, control information indicating that the strength of the point image restoration processing is to be changed according to the degree to which false signals due to chromatic aberration are emphasized (false signal emphasis degree) is sent from the information acquisition unit 115 to the point image restoration processing control unit 120. The point image restoration processing control unit 120 then causes the point image restoration processing execution unit 110 to change the strength of the point image restoration processing according to that degree (B-4 of FIG. 12). In the fourth point image restoration processing control example, the point image restoration processing control unit 120 switches the control of the point image restoration processing execution unit 110 according to the control information, as in A-4 and B-4 described above. Note that the control of the point image restoration processing control unit 120 is not limited to the examples described above, and the point image restoration processing can be executed in consideration of the degree to which false signals due to chromatic aberration occur.
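As an illustrative sketch of the region-dependent control idea (all numeric values, the function name, and the linear falloff are assumptions for the example, not values from the patent), a per-pixel restoration-strength map could be weighted by normalized image height, since false signals due to chromatic aberration tend to be emphasized at high image heights:

```python
import numpy as np

def restoration_strength_map(shape, full_strength=1.0, threshold=0.7):
    """Sketch of strength control by image height: full restoration
    strength near the image center, weakened restoration toward the
    periphery where false signals would be emphasized."""
    h, w = shape
    y, x = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    # Normalized image height: 0 at the center, 1 at the corners.
    r = np.hypot((y - cy) / cy, (x - cx) / cx) / np.sqrt(2)
    # Inside the threshold: full strength; outside: linearly weakened.
    strength = np.where(r < threshold, full_strength,
                        full_strength * (1.0 - r))
    return np.clip(strength, 0.0, full_strength)
```

Multiplying the restoration correction term by such a map would realize a stepwise or continuous version of the fourth control example, where stronger processing is applied only where false signals are unlikely.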
FIG. 14 shows a second embodiment of the image processing device 28. Portions identical to those of the first embodiment of the image processing device 28 in FIG. 3 are given the same reference numerals and their description is omitted. Comparing the second embodiment with the first embodiment, the second embodiment differs in that an image analysis unit 111 is added and in that the information acquisition unit 115 also acquires analysis information to generate the control information.
FIG. 16 shows a third embodiment of the image processing device 28. Portions identical to those of the first embodiment of the image processing device 28 in FIG. 3 are given the same reference numerals and their description is omitted. Comparing the third embodiment with the first embodiment, whereas the first embodiment performs the point image restoration processing on the luminance-system image data, the third embodiment differs in that it performs the point image restoration processing on the luminance signal Y, which is a specific example of the luminance-system image data. By performing the point image restoration processing on the luminance signal Y, the point image restoration processing can be performed accurately. Here, the luminance signal Y is the luminance signal Y in the color space represented by the luminance signal Y and the color difference signals Cb and Cr.
FIG. 17 shows a fourth embodiment of the image processing device 28. Portions identical to those of the first embodiment of the image processing device 28 in FIG. 3 are given the same reference numerals and their description is omitted. Comparing the fourth embodiment with the first embodiment, whereas the first embodiment performs the point image restoration processing on the luminance-system image data, the fourth embodiment differs in that it performs the point image restoration processing on the G color data, which is a specific example of the luminance-system image data. Since the G color data contributes most to generating the value of the luminance signal Y (see Formulas 1 and 2), performing the point image restoration processing on the G color data makes a more accurate point image restoration processing possible.
FIG. 18 is a diagram showing a modification of the imaging element 22. In particular, as for the color filter array arranged on the light-receiving surface of the imaging element 22, a color filter array ("X-Trans" (registered trademark)) is shown as a modification of the color filter array described with reference to FIG. 2. In the present invention, various color filter arrays can be adopted for the imaging element 22, and FIG. 18 shows one such modification.
The point image restoration processing performed by the point image restoration processing execution unit 110 in the above embodiments is image processing that restores the original subject image by recovering and correcting point spread (point image blur) according to specific shooting conditions (for example, aperture value, focal length, lens type, and so on), but the image processing to which the present invention can be applied is not limited to the point image restoration processing in the above embodiments. For example, the point image restoration processing according to the present invention can also be applied to point image restoration processing on image data captured and acquired through an optical system (such as a shooting lens) having an extended depth of field (focus) (EDoF: Extended Depth of Field (Focus)). By performing point image restoration processing on the image data of a blurred image captured and acquired with the depth of field (depth of focus) extended by the EDoF optical system, high-resolution image data that is in focus over a wide range can be restored and generated. In this case, point image restoration processing is performed using a restoration filter based on the point spread function of the EDoF optical system (PSF, OTF (optical transfer function), MTF (modulation transfer function, magnitude transfer function), PTF (phase transfer function), and so on), the restoration filter having filter coefficients set so that good image restoration is possible within the range of the extended depth of field (depth of focus).
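As a rough sketch of how filter coefficients could be derived from a point spread function (the `nsr` and `ksize` parameters and the Wiener-type response are assumptions for illustration, not the patent's design procedure), the spatial coefficients of a restoration filter can be obtained by building a frequency response from the PSF's OTF and cropping the inverse transform around the origin:

```python
import numpy as np

def restoration_kernel_from_psf(psf, nsr=1e-3, ksize=5):
    """Derive spatial restoration-filter coefficients R(x, y) from a PSF:
    take the PSF's OTF, form a Wiener-type frequency response regularized
    by the assumed noise-to-signal ratio nsr, transform back, and crop a
    small ksize x ksize kernel centered on the origin."""
    H = np.fft.fft2(psf)                       # OTF of the optical system
    R = np.conj(H) / (np.abs(H) ** 2 + nsr)    # Wiener-type frequency response
    r = np.real(np.fft.ifft2(R))               # spatial filter coefficients
    r = np.fft.fftshift(r)                     # move the origin to the center
    cy, cx = psf.shape[0] // 2, psf.shape[1] // 2
    half = ksize // 2
    return r[cy - half:cy + half + 1, cx - half:cx + half + 1]
```

Such a small tap kernel could then be convolved with the captured image data, which matches the stored-filter usage described for the point image restoration processing execution unit 110.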
Claims (17)
- Demosaic processing means for performing demosaic processing on mosaic image data output from an imaging element to generate demosaic image data;
Luminance-system image data acquisition means for acquiring luminance-system image data, which is image data relating to luminance, based on the demosaic image data obtained by the demosaic processing means;
Point image restoration processing execution means for performing point image restoration processing on the luminance-system image data acquired by the luminance-system image data acquisition means;
Information acquisition means for acquiring control information relating to execution of the point image restoration processing, based on shooting information relating to shooting conditions of a subject; and
Point image restoration processing control means for controlling the processing operation of the point image restoration processing execution means based on the control information acquired by the information acquisition means;
An image processing device comprising the above. - The image processing device according to claim 1, wherein the shooting information includes at least one of lens information used for shooting, an aperture value at the time of shooting, a focal length at the time of shooting, and a subject distance at the time of shooting.
- Image analysis means for analyzing, based on the mosaic image data or the demosaic image data, whether the mosaic image data is data in which a false signal is emphasized when the point image restoration processing is performed, or whether the demosaic image data is data in which a false signal is emphasized when the point image restoration processing is performed,
wherein the information acquisition means acquires the control information based on the shooting information and the analysis information obtained by the image analysis means; the image processing device according to claim 1 or 2. - The image processing device according to claim 3, wherein the image analysis means analyzes whether the mosaic image data or the demosaic image data is data in which a false signal is emphasized when the point image restoration processing is performed, by obtaining the magnitude of contrast based on the mosaic image data or the demosaic image data.
- The point image restoration processing control means determines, based on the control information, whether a false signal will be emphasized if the point image restoration processing is performed;
when determining that the false signal will not be emphasized, the point image restoration processing control means causes the point image restoration processing execution means to perform the point image restoration processing on the luminance-system image data; and
when determining that the false signal will be emphasized, the point image restoration processing control means prohibits the point image restoration processing by the point image restoration processing execution means on the luminance-system image data; the image processing device according to any one of claims 1 to 4. - The point image restoration processing control means determines, based on the control information, whether a false signal will be emphasized if the point image restoration processing is performed;
when determining that the false signal will not be emphasized, the point image restoration processing control means causes the point image restoration processing execution means to perform the point image restoration processing on the luminance-system image data; and
when determining that the false signal will be emphasized, the point image restoration processing control means specifies a false signal emphasis region, which is a region where the false signal is emphasized, prohibits the point image restoration processing by the point image restoration processing execution means on the false signal emphasis region, and causes the point image restoration processing to be performed on regions other than the false signal emphasis region; the image processing device according to any one of claims 1 to 4. - The point image restoration processing control means determines, based on the control information, whether a false signal will be emphasized if the point image restoration processing is performed;
when determining that the false signal will not be emphasized, the point image restoration processing control means causes the point image restoration processing execution means to perform the point image restoration processing on the luminance-system image data; and
when determining that the false signal will be emphasized, the point image restoration processing control means specifies a false signal emphasis region, which is a region where the false signal is emphasized, causes the point image restoration processing execution means to perform the point image restoration processing on regions other than the false signal emphasis region, and causes the point image restoration processing execution means to perform, on the false signal emphasis region, a point image restoration processing weaker in effect in place of the point image restoration processing; the image processing device according to any one of claims 1 to 4. - The point image restoration processing control means determines, based on the control information, whether a false signal will be emphasized if the point image restoration processing is performed;
when determining that the false signal will not be emphasized, the point image restoration processing control means causes the point image restoration processing execution means to perform the point image restoration processing on the luminance-system image data; and
when determining that the false signal will be emphasized, the point image restoration processing control means specifies a false signal emphasis degree, which is the degree to which the false signal is emphasized, and causes the point image restoration processing execution means to perform the point image restoration processing while changing its strength in accordance with the false signal emphasis degree; the image processing device according to any one of claims 1 to 4. - The image processing device according to claim 6 or 7, wherein the false signal emphasis region specified by the point image restoration processing control means is a region where the image height is high.
- The image processing device according to claim 8, wherein the false signal emphasis degree specified by the point image restoration processing control means increases with increasing image height.
- The image processing device according to any one of claims 1 to 10, wherein the luminance-system image data is color data in the demosaic image data having the highest contribution rate for obtaining a luminance signal, or a luminance signal obtained based on the demosaic image data.
- The image processing device according to any one of claims 1 to 11, wherein the luminance-system image data is the value of the luminance signal Y in a color space represented by the luminance signal Y and the color difference signals Cb and Cr.
- The image processing device according to any one of claims 1 to 12, wherein the demosaic processing means includes processing of determining a correlation direction among a plurality of pixel signals constituting the mosaic image data, based on color data in the mosaic image data having the highest contribution rate for obtaining a luminance signal.
- The image processing device according to any one of claims 1 to 13, wherein the demosaic processing means includes edge detection processing in the mosaic image data based on color data in the mosaic image data having the highest contribution rate for obtaining a luminance signal.
- An imaging device including the image processing device according to any one of claims 1 to 14.
- A demosaic processing step of performing demosaic processing on mosaic image data output from an imaging element to generate demosaic image data;
A luminance-system image data acquisition step of acquiring luminance-system image data relating to luminance, based on the demosaic image data generated in the demosaic processing step;
A point image restoration processing execution step of performing point image restoration processing on the luminance-system image data acquired in the luminance-system image data acquisition step;
An information acquisition step of acquiring control information relating to execution of the point image restoration processing, based on shooting information relating to shooting conditions of a subject; and
A point image restoration processing control step of controlling the processing operation of the point image restoration processing execution step based on the control information obtained in the information acquisition step;
An image processing method including the above. - A demosaic processing step of performing demosaic processing on mosaic image data output from an imaging element to generate demosaic image data;
A luminance-system image data acquisition step of acquiring luminance-system image data relating to luminance, based on the demosaic image data generated in the demosaic processing step;
A point image restoration processing execution step of performing point image restoration processing on the luminance-system image data acquired in the luminance-system image data acquisition step;
An information acquisition step of acquiring control information relating to execution of the point image restoration processing, based on shooting information relating to shooting conditions of a subject; and
A point image restoration processing control step of controlling the processing operation of the point image restoration processing execution step based on the control information obtained in the information acquisition step;
A program for causing a computer to execute the above.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112013006596.7T DE112013006596B4 (de) | 2013-02-05 | 2013-04-26 | Bildverarbeitungsvorrichtung, Bildaufnahmevorrichtung, Bildverarbeitungsverfahren und Programm |
CN201380072322.0A CN104969545B (zh) | 2013-02-05 | 2013-04-26 | 图像处理装置、摄像装置、图像处理方法以及程序 |
JP2014560627A JP5830186B2 (ja) | 2013-02-05 | 2013-04-26 | 画像処理装置、撮像装置、画像処理方法及びプログラム |
US14/810,168 US9432643B2 (en) | 2013-02-05 | 2015-07-27 | Image processing device, image capture device, image processing method, and non-transitory computer-readable medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013020490 | 2013-02-05 | ||
JP2013-020490 | 2013-02-05 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/810,168 Continuation US9432643B2 (en) | 2013-02-05 | 2015-07-27 | Image processing device, image capture device, image processing method, and non-transitory computer-readable medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014122804A1 true WO2014122804A1 (ja) | 2014-08-14 |
Family
ID=51299415
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/062466 WO2014122804A1 (ja) | 2013-02-05 | 2013-04-26 | 画像処理装置、撮像装置、画像処理方法及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US9432643B2 (ja) |
JP (1) | JP5830186B2 (ja) |
CN (1) | CN104969545B (ja) |
DE (1) | DE112013006596B4 (ja) |
WO (1) | WO2014122804A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104581100A (zh) * | 2015-02-12 | 2015-04-29 | 张李静 | 色彩滤镜阵列和图像处理方法 |
KR20160024345A (ko) * | 2014-08-22 | 2016-03-04 | 에스케이하이닉스 주식회사 | 이미지 센서 및 이를 구비하는 전자장치 |
US10115757B2 (en) | 2014-08-22 | 2018-10-30 | SK Hynix Inc. | Image sensor and electronic device having the same |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9674431B2 (en) | 2013-02-01 | 2017-06-06 | Canon Kabushiki Kaisha | Image pickup apparatus, image processing apparatus, image processing method, and non-transitory computer-readable storage medium |
US9424629B2 (en) * | 2013-02-01 | 2016-08-23 | Canon Kabushiki Kaisha | Image pickup apparatus, image processing apparatus, image processing method, and non-transitory computer-readable storage medium |
US9870600B2 (en) * | 2015-01-06 | 2018-01-16 | The Regents Of The University Of California | Raw sensor image and video de-hazing and atmospheric light analysis methods and systems |
JP6663201B2 (ja) * | 2015-10-27 | 2020-03-11 | キヤノン株式会社 | 画像符号化装置及びその制御方法、並びに、コンピュータプログラム及び記憶媒体 |
US9866809B2 (en) * | 2015-11-19 | 2018-01-09 | Sony Corporation | Image processing system with aliasing detection mechanism and method of operation thereof |
US10429271B2 (en) * | 2016-07-01 | 2019-10-01 | Microsoft Technology Licensing, Llc | Camera testing using reverse projection |
CN106604001B (zh) * | 2016-11-29 | 2018-06-29 | 广东欧珀移动通信有限公司 | 图像处理方法、图像处理装置、成像装置及电子装置 |
CN106507068B (zh) | 2016-11-29 | 2018-05-04 | 广东欧珀移动通信有限公司 | 图像处理方法及装置、控制方法及装置、成像及电子装置 |
JP6832224B2 (ja) * | 2017-04-28 | 2021-02-24 | 株式会社デンソーテン | 付着物検出装置および付着物検出方法 |
CN111824164B (zh) * | 2019-04-11 | 2021-10-22 | 中能道通物流科技有限责任公司 | 周围信息采集显示方法 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001197356A (ja) * | 2000-01-13 | 2001-07-19 | Minolta Co Ltd | 画像復元装置および画像復元方法 |
JP2008042874A (ja) * | 2006-07-14 | 2008-02-21 | Eastman Kodak Co | 画像処理装置、画像復元方法およびプログラム |
JP2011193276A (ja) * | 2010-03-15 | 2011-09-29 | Canon Inc | 撮像装置、その制御方法及びプログラム |
JP2012005056A (ja) * | 2010-06-21 | 2012-01-05 | Canon Inc | 画像処理装置、画像処理方法及びプログラム |
Family Cites Families (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7253836B1 (en) * | 1998-06-30 | 2007-08-07 | Nikon Corporation | Digital camera, storage medium for image signal processing, carrier wave and electronic camera |
US20010008418A1 (en) | 2000-01-13 | 2001-07-19 | Minolta Co., Ltd. | Image processing apparatus and method |
US20020167602A1 (en) * | 2001-03-20 | 2002-11-14 | Truong-Thao Nguyen | System and method for asymmetrically demosaicing raw data images using color discontinuity equalization |
JP2002300461A (ja) | 2001-03-30 | 2002-10-11 | Minolta Co Ltd | 画像復元装置、画像復元方法、プログラム及び記録媒体 |
US6978050B2 (en) * | 2001-07-18 | 2005-12-20 | Hewlett-Packard Development Company, L.P. | Electronic image color plane reconstruction |
DE60141901D1 (de) * | 2001-08-31 | 2010-06-02 | St Microelectronics Srl | Störschutzfilter für Bayermusterbilddaten |
EP1641283B1 (en) * | 2003-06-12 | 2019-01-09 | Nikon Corporation | Image processing method, image processing program, image processor |
JP2005354610A (ja) * | 2004-06-14 | 2005-12-22 | Canon Inc | 画像処理装置、画像処理方法および画像処理プログラム |
US7525583B2 (en) * | 2005-02-11 | 2009-04-28 | Hewlett-Packard Development Company, L.P. | Decreasing aliasing in electronic images |
TWI260918B (en) * | 2005-05-30 | 2006-08-21 | Pixart Imaging Inc | Pixel compensation method for an image sensor |
US20070133902A1 (en) * | 2005-12-13 | 2007-06-14 | Portalplayer, Inc. | Method and circuit for integrated de-mosaicing and downscaling preferably with edge adaptive interpolation and color correlation to reduce aliasing artifacts |
JP4501855B2 (ja) * | 2005-12-22 | 2010-07-14 | ソニー株式会社 | 画像信号処理装置、撮像装置、および画像信号処理方法、並びにコンピュータ・プログラム |
US8036481B2 (en) | 2006-07-14 | 2011-10-11 | Eastman Kodak Company | Image processing apparatus and image restoration method and program |
US7907791B2 (en) * | 2006-11-27 | 2011-03-15 | Tessera International, Inc. | Processing of mosaic images |
CN101971632B (zh) * | 2008-01-08 | 2013-10-16 | 艾利森电话股份有限公司 | 自适应滤波 |
JP4986965B2 (ja) | 2008-09-30 | 2012-07-25 | キヤノン株式会社 | 画像処理方法、画像処理装置、画像処理プログラム及び撮像装置 |
JP2011135563A (ja) * | 2009-11-30 | 2011-07-07 | Canon Inc | 撮像装置および画像処理方法 |
JP5441652B2 (ja) | 2009-12-09 | 2014-03-12 | キヤノン株式会社 | 画像処理方法、画像処理装置、撮像装置および画像処理プログラム |
JP5546229B2 (ja) * | 2009-12-09 | 2014-07-09 | キヤノン株式会社 | 画像処理方法、画像処理装置、撮像装置および画像処理プログラム |
JP5363966B2 (ja) * | 2009-12-18 | 2013-12-11 | 富士フイルム株式会社 | 撮像装置 |
WO2011122283A1 (ja) | 2010-03-31 | 2011-10-06 | キヤノン株式会社 | 画像処理装置、およびそれを用いた撮像装置 |
CN102907082B (zh) * | 2010-05-21 | 2016-05-18 | 松下电器(美国)知识产权公司 | 摄像装置、图像处理装置、图像处理方法 |
JP5693089B2 (ja) * | 2010-08-20 | 2015-04-01 | キヤノン株式会社 | 画像処理装置、及び画像処理装置の制御方法 |
JP5660711B2 (ja) * | 2010-09-16 | 2015-01-28 | 富士フイルム株式会社 | 復元ゲインデータ生成方法 |
RU2551649C2 (ru) * | 2011-02-28 | 2015-05-27 | Фуджифилм Корпорэйшн | Устройство формирования цветного изображения |
EP2683167B1 (en) * | 2011-02-28 | 2018-05-02 | Fujifilm Corporation | Color imaging device |
JP5868076B2 (ja) * | 2011-08-31 | 2016-02-24 | キヤノン株式会社 | 画像処理装置及び画像処理方法 |
JP5818586B2 (ja) * | 2011-08-31 | 2015-11-18 | キヤノン株式会社 | 画像処理装置及び画像処理方法 |
JP5904281B2 (ja) * | 2012-08-10 | 2016-04-13 | 株式会社ニコン | 画像処理方法、画像処理装置、撮像装置および画像処理プログラム |
JP2014123173A (ja) * | 2012-12-20 | 2014-07-03 | Sony Corp | 画像処理装置、撮像装置及び画像処理方法 |
CN104580879B (zh) * | 2013-10-09 | 2018-01-12 | 佳能株式会社 | 图像处理设备、图像拾取设备以及图像处理方法 |
JP6327922B2 (ja) * | 2014-04-25 | 2018-05-23 | キヤノン株式会社 | 画像処理装置、画像処理方法、およびプログラム |
JP6071966B2 (ja) * | 2014-09-17 | 2017-02-01 | キヤノン株式会社 | 画像処理方法およびそれを用いた撮像装置、画像処理装置、画像処理プログラム |
JP6071974B2 (ja) * | 2014-10-21 | 2017-02-01 | キヤノン株式会社 | 画像処理方法、画像処理装置、撮像装置および画像処理プログラム |
-
2013
- 2013-04-26 CN CN201380072322.0A patent/CN104969545B/zh active Active
- 2013-04-26 WO PCT/JP2013/062466 patent/WO2014122804A1/ja active Application Filing
- 2013-04-26 DE DE112013006596.7T patent/DE112013006596B4/de not_active Expired - Fee Related
- 2013-04-26 JP JP2014560627A patent/JP5830186B2/ja active Active
-
2015
- 2015-07-27 US US14/810,168 patent/US9432643B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001197356A (ja) * | 2000-01-13 | 2001-07-19 | Minolta Co Ltd | 画像復元装置および画像復元方法 |
JP2008042874A (ja) * | 2006-07-14 | 2008-02-21 | Eastman Kodak Co | 画像処理装置、画像復元方法およびプログラム |
JP2011193276A (ja) * | 2010-03-15 | 2011-09-29 | Canon Inc | 撮像装置、その制御方法及びプログラム |
JP2012005056A (ja) * | 2010-06-21 | 2012-01-05 | Canon Inc | 画像処理装置、画像処理方法及びプログラム |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160024345A (ko) * | 2014-08-22 | 2016-03-04 | 에스케이하이닉스 주식회사 | 이미지 센서 및 이를 구비하는 전자장치 |
US10115757B2 (en) | 2014-08-22 | 2018-10-30 | SK Hynix Inc. | Image sensor and electronic device having the same |
TWI697111B (zh) * | 2014-08-22 | 2020-06-21 | 韓商愛思開海力士有限公司 | 影像感測器及具有該影像感測器的電子裝置 |
KR102394277B1 (ko) * | 2014-08-22 | 2022-05-06 | 에스케이하이닉스 주식회사 | 이미지 센서 및 이를 구비하는 전자장치 |
CN104581100A (zh) * | 2015-02-12 | 2015-04-29 | 张李静 | 色彩滤镜阵列和图像处理方法 |
Also Published As
Publication number | Publication date |
---|---|
DE112013006596T5 (de) | 2015-12-03 |
US9432643B2 (en) | 2016-08-30 |
JPWO2014122804A1 (ja) | 2017-01-26 |
DE112013006596B4 (de) | 2019-10-02 |
CN104969545A (zh) | 2015-10-07 |
CN104969545B (zh) | 2018-03-20 |
JP5830186B2 (ja) | 2015-12-09 |
US20150334359A1 (en) | 2015-11-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5830186B2 (ja) | 画像処理装置、撮像装置、画像処理方法及びプログラム | |
US9633417B2 (en) | Image processing device and image capture device performing restoration processing using a restoration filter based on a point spread function | |
JP5833794B2 (ja) | 撮像装置 | |
JP5759085B2 (ja) | 画像処理装置、撮像装置、画像処理方法及びプログラム並びに記録媒体 | |
JP5851650B2 (ja) | 復元フィルタ生成装置及び方法、画像処理装置、撮像装置、復元フィルタ生成プログラム並びに記録媒体 | |
JP5752866B2 (ja) | 画像処理装置、撮像装置、画像処理方法及びプログラム並びに記録媒体 | |
US9699427B2 (en) | Imaging device, imaging method, and image processing device | |
US9881362B2 (en) | Image processing device, image-capturing device, image processing method, and program | |
US9799105B2 (en) | Image processing device, imaging device, image processing method, and program for restoration processing based on a point spread function and a frame after a frame to be processed | |
US20160150161A1 (en) | Image processing device, image capture device, image processing method, and program | |
JPWO2015015935A1 (ja) | 撮像装置及び画像処理方法 | |
JP6042034B2 (ja) | 画像処理装置、撮像装置、画像処理方法及びプログラム | |
JP6342083B2 (ja) | 画像処理装置、画像処理方法、及びプログラム | |
US9584801B2 (en) | Image pickup device, calibration system, calibration method, and program | |
WO2014136321A1 (ja) | 復元フィルタ生成装置及び方法、画像処理装置及び方法、撮像装置、プログラム並びに記録媒体 | |
WO2014050191A1 (ja) | 画像処理装置、撮像装置、画像処理方法、及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13874518 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014560627 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112013006596 Country of ref document: DE Ref document number: 1120130065967 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13874518 Country of ref document: EP Kind code of ref document: A1 |