WO2013118364A1 - Image processing apparatus and image processing method - Google Patents
Image processing apparatus and image processing method
- Publication number
- WO2013118364A1 (PCT/JP2012/079622)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image processing
- image
- unit
- images
- layer
- Prior art date
Classifications
- G06T5/70
- G06T11/60—Editing figures and text; Combining figures or text
- G06T3/40—Scaling the whole image or part thereof
- G06T5/50—Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
- H04N23/81—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
- H04N23/951—Computational photography systems, e.g. light-field imaging systems, by using two or more images to influence resolution, frame rate or aspect ratio
- G06T2207/20016—Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
- G06T2207/20221—Image fusion; Image merging
- H04N25/68—Noise processing, e.g. detecting, correcting, reducing or removing noise, applied to defects
Definitions
- the present disclosure relates to an image processing apparatus and an image processing method.
- The number of pixels of a captured image obtained by an imaging device tends to increase, while the pixel size tends to decrease. A larger number of pixels can express finer detail of the subject, but a smaller pixel size increases noise, so image quality degradation becomes conspicuous. Such degradation can be corrected effectively by noise removal with a spatial filter. However, as the number of pixels of the image increases, the number of pixels the spatial filter must process also increases, so the hardware scale and cost of realizing the spatial filter increase.
- the present disclosure proposes a new and improved image processing apparatus and image processing method capable of obtaining a high-definition image while suppressing the processing load.
- An image processing apparatus is provided that includes an image processing unit that performs image processing on each of a plurality of images with different resolutions of the same subject, and an image combining unit that combines the image processing results produced by the image processing unit for each of the plurality of images.
- An image processing method is provided that includes performing image processing on each of a plurality of images having different resolutions of the same subject, and combining the image processing results for each of the plurality of images.
- According to the present disclosure, a high-definition image can be obtained while suppressing the processing load.
- FIG. 2 is a block diagram illustrating the configuration of the hierarchical image processing unit of the image processing apparatus according to the first embodiment. FIGS. 3 to 5 are explanatory diagrams showing specific examples of the image processing in each layer. FIGS. 6 to 8 are explanatory diagrams showing specific examples of the processing timing in each layer. FIG. 9 is a flowchart illustrating the operation of the image processing apparatus according to the first embodiment.
- In the following description and drawings, a plurality of constituent elements having substantially the same functional configuration may be distinguished by appending different letters to the same reference numeral. When such elements need not be distinguished individually, only the shared reference numeral is used.
- The image processing apparatus 10 includes: (A) an image processing unit (212, 222, ..., 232) that performs image processing on each of a plurality of images with different resolutions of the same subject; and (B) an image synthesis unit (214, 224) that synthesizes the image processing results produced by the image processing unit for each of the plurality of images.
- FIG. 1 is an explanatory diagram illustrating a configuration of an image processing apparatus 10 according to an embodiment of the present disclosure.
- the image processing apparatus 10 includes an imaging unit 12, a storage unit 14, a hierarchical image processing unit 20, and a display unit 30.
- the imaging unit 12 converts light emitted from the subject into an electrical image signal.
- The imaging unit 12 includes an imaging optical system, such as a photographing lens and a zoom lens, that collects light; an imaging element such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor; and a physical shutter and an electronic shutter that control exposure.
- the storage unit 14 stores a program for operating the image processing apparatus 10, an image obtained by the imaging unit 12, an image obtained by the hierarchical image processing unit 20, and the like.
- Such a storage unit 14 may be a storage medium such as a non-volatile memory, a magnetic disk, an optical disk, and an MO (Magneto Optical) disk.
- Examples of the non-volatile memory include flash memory, an SD card, a micro SD card, USB memory, EEPROM (Electrically Erasable Programmable Read-Only Memory), and EPROM (Erasable Programmable ROM).
- Examples of the magnetic disk include a hard disk and a removable disk-type magnetic disk.
- Examples of the optical disc include a CD (Compact Disc), a DVD (Digital Versatile Disc), and a BD (Blu-Ray Disc (registered trademark)).
- The hierarchical image processing unit 20 performs image processing on an image input from the imaging unit 12 or the storage unit 14. As will be described in detail later, the hierarchical image processing unit 20 performs image processing on each of a plurality of images with different resolutions of the same subject, and synthesizes the image processing results for each of the plurality of images. This configuration reduces hardware scale and cost compared with performing image processing only on a high-resolution image, and yields a higher-definition image than performing image processing only on a low-resolution image.
- the display unit 30 displays an image obtained by the image processing of the hierarchical image processing unit 20, for example.
- a display unit 30 may be a liquid crystal display (LCD) device or an OLED (Organic Light Emitting Diode) device.
- In the above example, the image processing apparatus 10 includes the imaging unit 12 as its imaging function; however, the image processing apparatus 10 need not include an imaging function.
- The image processing apparatus 10 may be an information processing apparatus such as a PC (Personal Computer), a home video processing apparatus (DVD recorder, VCR, etc.), a PDA (Personal Digital Assistant), a home game machine, or a home appliance.
- The image processing apparatus 10 may also be a portable information processing device such as an imaging device, a smartphone, a mobile phone, a PHS (Personal Handyphone System), a portable music player, a portable video processing device, or a portable game device.
- processing with a reduced image is effective.
- the image processing apparatus 10 according to each embodiment has been created by focusing on the above circumstances.
- The image processing apparatus 10 according to each embodiment can reduce hardware scale and cost compared with performing image processing only on a high-resolution image, and can obtain a higher-definition image than performing image processing only on a low-resolution image.
- the image processing apparatus 10 according to each embodiment will be described in detail sequentially.
- FIG. 2 is a block diagram showing the configuration of the hierarchical image processing unit 20-1 of the image processing apparatus 10 according to the first embodiment.
- The hierarchical image processing unit 20-1 includes image reduction units 202 and 203, a first layer processing unit 210 through a Kth layer processing unit 230, and processing result transmission units 228 and 238.
- the image reduction unit 202 reduces the number of pixels of the input original image, and generates a reduced image of the second layer having the next largest number of pixels after the original image. Note that the image reduction unit 202 may reduce the number of pixels of the original image by thinning out the pixels, or may reduce the number of pixels of the original image by averaging a plurality of pixels.
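The two reduction strategies mentioned above (thinning out pixels, or averaging a plurality of pixels) can be sketched as follows. This is an illustrative Python sketch and not part of the disclosure; a grayscale image is assumed to be a list of rows of pixel values, and the function names are placeholders.

```python
def reduce_by_thinning(image, factor=2):
    """Reduce pixel count by keeping every `factor`-th pixel
    in both directions (decimation)."""
    return [row[::factor] for row in image[::factor]]

def reduce_by_averaging(image):
    """Reduce pixel count by averaging each non-overlapping
    2x2 block into a single pixel."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h - 1, 2):
        out.append([
            (image[y][x] + image[y][x + 1] +
             image[y + 1][x] + image[y + 1][x + 1]) / 4.0
            for x in range(0, w - 1, 2)
        ])
    return out
```

Averaging suppresses aliasing and noise better than thinning, at the cost of a few additions per output pixel.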
- the image reduction unit 203 reduces the number of pixels of the input reduced image of the (K-1) th layer and generates a reduced image of the Kth layer having the smallest number of pixels.
- one or more image reduction units may be provided between the image reduction unit 202 and the image reduction unit 203. Each image reduction unit may reduce the input image at the same reduction rate. With this configuration, each image reduction unit can be realized by the same hardware, and thus manufacturing costs can be suppressed.
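When every reduction unit applies the same reduction rate, the resolution of each layer follows directly from the rate. A small illustrative helper (the function name and the integer-division convention are assumptions, not from the disclosure):

```python
def layer_resolutions(width, height, rate=2, layers=4):
    """Return (width, height) for each layer when identical reduction
    units of rate `rate` are cascaded; index 0 is the original image."""
    return [(width // rate ** k, height // rate ** k) for k in range(layers)]
```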
- the processing result transmission unit 228 transmits the parameter information obtained by the second layer processing unit 220 to the upper first layer processing unit 210 that uses the parameter information in image processing.
- the parameter information is a histogram indicating the luminance distribution obtained by the overall luminance calculation, and gradation conversion may be performed using the histogram in the upper layer.
- the processing result transmission unit 238 transmits the parameter information obtained by the K-th layer processing unit 230 to the upper K-1th layer processing unit that uses the parameter information during image processing.
- A processing result transmission unit may likewise be provided between any other pair of adjacent layer processing units.
- the Kth hierarchical processing unit 230 includes an image processing unit 232 and an upscale unit 236.
- the image processing unit 232 performs image processing on the reduced image of the Kth hierarchy input from the image reducing unit 203.
- The upscale unit 236 upscales (enlarges) the image resulting from the processing by the image processing unit 232 to the resolution of the (K-1)th layer. Upscaling may insert, between existing pixels, new pixels having the same value as an adjacent pixel, or new pixels whose value is a weighted average of neighboring pixels.
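Of the two upscaling options just described, the first (inserting pixels equal to an adjacent pixel, i.e. nearest-neighbor replication) can be sketched as below; the weighted-average variant would interpolate between neighbors instead. This is only an illustrative sketch, with assumed names and image representation.

```python
def upscale_nearest(image, factor=2):
    """Enlarge an image by replicating each pixel `factor` times
    horizontally and vertically (nearest-neighbor upscaling)."""
    out = []
    for row in image:
        wide = [v for v in row for _ in range(factor)]  # widen the row
        for _ in range(factor):                         # then repeat it
            out.append(wide[:])
    return out
```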
- the second layer processing unit 220 includes an image processing unit 222, an image composition unit 224, and an upscale unit 226.
- the image processing unit 222 performs image processing on the reduced image of the second hierarchy input from the image reducing unit 202.
- the image synthesis unit 224 synthesizes the processing result image from the image processing unit 222 and the processing result image in the third hierarchy input from the third hierarchy processing unit.
- The image composition may be a weighted average of the corresponding pixels of the two images.
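A pixelwise weighted average, as one possible realization of this composition, might look like the following sketch (the function name and the 50/50 default weight are assumptions for illustration):

```python
def blend(img_a, img_b, weight_a=0.5):
    """Pixelwise weighted average of two equal-sized images."""
    wb = 1.0 - weight_a
    return [[weight_a * a + wb * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]
```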
- the first layer processing unit 210 includes an image processing unit 212 and an image composition unit 214.
- the image processing unit 212 performs image processing on the input original image.
- The image composition unit 214 combines the processing result image from the image processing unit 212 with the upscaled processing result image input from the second layer processing unit 220. The image synthesis unit 214 then outputs the combined image as the output image to the display unit 30 or the storage unit 14.
- Although FIG. 2 illustrates an example in which the image processing result of each layer is upscaled one layer at a time, this embodiment is not limited to this example. For example, the image processing result of each layer may be upscaled directly to the first layer (the same resolution as the original image). However, when the result is upscaled one layer at a time as described with reference to FIG. 2, the same upscale unit (226, 236) can be arranged in each layer, which simplifies the manufacturing process and reduces manufacturing cost.
- the hierarchical image processing unit 20-1 performs image processing in a plurality of layers having different resolutions, and synthesizes the image processing results in the respective layers. Furthermore, the hierarchical image processing unit 20-1 according to the present embodiment can perform various image processing in each hierarchy. Hereinafter, a specific example of image processing performed in each layer will be described.
- Examples of noise reduction (NR) include edge-preserving NR and NL-means NR, which are types of luminance NR, and chroma NR, which removes color noise.
- Defect correction is processing for correcting defective pixels.
- Gradation (tone) conversion is processing for adjusting the relationship between input and output.
- Flicker detection is processing for detecting flicker within a frame, and flicker correction is processing for correcting the detected flicker.
- Camera shake correction is processing for correcting the influence of camera shake during imaging.
- Overall brightness calculation is processing for calculating the brightness distribution.
- GMV (Global Motion Vector) detection is processing for detecting a motion vector for each frame.
- the image processing units (212, 222,... 232) of each layer may perform the same image processing.
- the image processing units (212, 222,... 232) of each layer may perform the same edge preservation type NR as shown in FIG.
- the image processing units (212, 222,... 232) in each layer may perform different NRs. More specifically, the image processing units (212, 222,... 232) of each layer may perform NR suitable for the resolution of each layer. For example, luminance noise such as screen roughness is dominated by high frequency components, and color noise is dominated by low frequency components.
- Therefore, luminance NR such as edge-preserving NR or NL-means NR may be performed on the upper-layer side, and chroma NR on the lower-layer side. That is, as shown in FIG. 4, the image processing unit 212 in the first layer may perform edge-preserving NR, the image processing unit 222 in the second layer may perform NL-means NR, and the image processing unit 232 in the Kth layer may perform chroma NR. With this configuration, the processing load of chroma NR can be reduced while both luminance noise and color noise are corrected appropriately.
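As a minimal illustration of the edge-preserving idea behind the luminance NR mentioned above: the toy 1-D filter below weights each neighbor by both spatial distance and value similarity (in the spirit of a bilateral filter), so pixels across a strong edge contribute almost nothing and the edge survives the smoothing. This is only a sketch under assumed parameters, not the NR actually used in the disclosure.

```python
import math

def bilateral_1d(row, sigma_s=1.0, sigma_r=10.0, radius=2):
    """Toy 1-D edge-preserving smoother: neighbor weights combine a
    spatial Gaussian with a range (value-similarity) Gaussian."""
    out = []
    for i, center in enumerate(row):
        num = den = 0.0
        for j in range(max(0, i - radius), min(len(row), i + radius + 1)):
            w = math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2)
                         - ((row[j] - center) ** 2) / (2 * sigma_r ** 2))
            num += w * row[j]
            den += w
        out.append(num / den)
    return out
```

A plain Gaussian blur would smear the 0-to-100 step below; here the step remains sharp because dissimilar pixels receive near-zero weight.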
- the image processing units (212, 222, ... 232) of each layer may perform different image processing. More specifically, the image processing units (212, 222,... 232) of each layer may perform image processing suitable for the resolution of each layer. For example, global brightness calculation, flicker detection, GMV detection, chroma NR, gradation conversion, flicker correction, camera shake correction, and the like can be executed at a low resolution, but defect correction is difficult to execute properly unless the resolution is high.
- For example, the image processing unit 212 in the first layer may perform defect correction; the image processing unit 222 in the second layer may perform luminance NR; and the image processing unit 232 in the Kth layer may perform global luminance calculation, flicker detection, GMV detection, chroma NR, gradation conversion, flicker correction, and camera shake correction.
- the image processing units (212, 222,... 232) of each layer can perform image processing of various patterns.
- the timing at which the image processing units (212, 222,... 232) of each layer perform image processing is not particularly limited.
- an example of the timing at which the image processing units (212, 222,... 232) of each layer perform image processing will be described.
- FIG. 6 is an explanatory diagram showing a first timing example of image processing in each layer.
- The hierarchical image processing unit 20-1 may start image processing in order from the uppermost layer. That is, the image processing unit 212 in the first layer first starts image processing (Processing 1 and Processing 2); the image processing unit 222 in the second layer then starts image processing (Processing A and Processing B); and the image processing unit 232 in the third layer then starts image processing (Processing A, Processing B, and Processing C).
- FIG. 6 shows an example in which the end timing of image processing for each layer is aligned, the image processing for each layer may end at a different timing.
- FIG. 7 is an explanatory diagram showing a second timing example of image processing in each layer.
- The hierarchical image processing unit 20-1 may start the image processing of all layers at the same time. That is, the image processing unit 212 in the first layer (Processing 1 and Processing 2), the image processing unit 222 in the second layer (Processing A, Processing B, and Processing C), and the image processing unit 232 in the third layer (Processing A, Processing B, Processing C, Processing D, and Processing E) may start their image processing simultaneously.
- FIG. 7 shows an example in which the end timing of image processing for each layer is aligned, the image processing for each layer may end at a different timing.
- FIG. 8 is an explanatory diagram showing a third timing example of image processing in each layer.
- The hierarchical image processing unit 20-1 may start image processing in order from the lowest layer. That is, the image processing unit 232 in the third layer first starts image processing (Processing A, Processing B, Processing C, Processing D, and Processing E); the image processing unit 222 in the second layer then starts its image processing; and finally the image processing unit 212 in the first layer starts image processing (Processing 1).
- FIG. 8 shows an example in which the end timing of image processing for each layer is aligned, the image processing for each layer may end at a different timing.
- FIG. 9 is a flowchart showing the operation of the image processing apparatus 10 according to the first embodiment.
- First, the image reduction units 202 and 203 generate reduced images having different resolutions for the K layers (S320).
- The image processing unit of each layer performs image processing corresponding to the layer's resolution on the input reduced image (S330). The upscale unit of each layer then upscales the image processing result to the next higher resolution, the image combining unit of each layer combines the image processing result of its own layer with the upscaled result from the layer below (S340), and the synthesis result is output (S350).
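Steps S320 to S350 can be sketched end to end as follows. The per-layer processing is a do-nothing stand-in, the reduction rate is fixed at 2, and all names are illustrative; this is only a sketch of the data flow under those assumptions, not the disclosed implementation.

```python
def reduce_avg(image):
    """S320 helper: halve resolution by 2x2 block averaging."""
    h, w = len(image), len(image[0])
    return [[(image[y][x] + image[y][x + 1] +
              image[y + 1][x] + image[y + 1][x + 1]) / 4.0
             for x in range(0, w - 1, 2)]
            for y in range(0, h - 1, 2)]

def upscale2(image):
    """Double resolution by pixel replication."""
    out = []
    for row in image:
        wide = [v for v in row for _ in range(2)]
        out.append(wide)
        out.append(wide[:])
    return out

def blend(a, b, wa=0.5):
    """Pixelwise weighted average of two equal-sized images."""
    return [[wa * x + (1 - wa) * y for x, y in zip(ra, rb)]
            for ra, rb in zip(a, b)]

def per_layer_processing(image):
    """Stand-in for the per-layer image processing (S330); a real
    system would apply NR, tone conversion, etc. here."""
    return [row[:] for row in image]

def hierarchical_process(original, levels=3):
    # S320: generate reduced images for each layer.
    pyramid = [original]
    for _ in range(levels - 1):
        pyramid.append(reduce_avg(pyramid[-1]))
    # S330: image processing in every layer.
    processed = [per_layer_processing(im) for im in pyramid]
    # S340: upscale one layer at a time and combine with the layer above.
    result = processed[-1]
    for upper in reversed(processed[:-1]):
        result = blend(upper, upscale2(result))
    # S350: the synthesis result is the output image.
    return result
```

Note that the one-layer-at-a-time upscale-and-blend loop mirrors the cascade of upscale units 226 and 236 in FIG. 2.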
- When parameter information for image processing in an upper layer is obtained by the image processing unit of a lower layer, the parameter information is transmitted to the upper layer by the processing result transmission units (228, ..., 238).
- Second Embodiment. The first embodiment of the present disclosure has been described above; the second embodiment of the present disclosure is described next. In the first embodiment, a plurality of images with different resolutions are acquired by reducing the original image. In the second embodiment, by contrast, a plurality of images with different resolutions are obtained at the imaging stage by the imaging unit 12. Details are described below.
- FIG. 10 is a block diagram showing a configuration of the hierarchical image processing unit 20-2 of the image processing apparatus 10 according to the second embodiment.
- the hierarchical image processing unit 20-2 includes a first hierarchical processing unit 210 to a Kth hierarchical processing unit 230, and processing result transmission units 228 and 238.
- The processing result transmission units 228 and 238 are arranged between two consecutive layers, and transmit the parameter information obtained by the lower-layer image processing unit to the upper-layer image processing unit.
- the first layer processing unit 210 includes an image processing unit 212 and an image composition unit 214.
- The processing of the image processing unit 212 and the image synthesizing unit 214 is as described in the first embodiment, but this embodiment differs from the first in that an image acquired by the imaging unit 12 is input directly to the image processing unit 212. Specifically, the imaging unit 12 acquires a plurality of layer images by imaging at different exposure timings and different resolutions, and the first layer image, which has the highest resolution, is input to the image processing unit 212 of the first layer.
- the second layer processing unit 220 includes an image processing unit 222, an image composition unit 224, and an upscale unit 226. Similar to the first layer, the second layer image having the second highest resolution is input to the image processing unit 222 in the second layer.
- the Kth hierarchical processing unit 230 includes an image processing unit 232 and an upscale unit 236.
- the K-th layer image having the lowest resolution is input to the K-th layer image processing unit 232.
- FIG. 11 is an explanatory diagram showing an example of the exposure timing and the processing timing of each layer in the imaging unit 12.
- The imaging unit 12 sequentially exposes the vertical lines of the imaging element between t1 and t3, and outputs the signals of one quarter of the vertical lines between t2 and t3 as the third layer image.
- The image processing unit 232 of the third layer starts image processing at t3, when the third layer image is obtained.
- Similarly, the imaging unit 12 sequentially exposes the vertical lines of the imaging element between t2 and t5, and outputs the signals of one half of the vertical lines as the second layer image between t3 and t5. The image processing unit 222 of the second layer then starts image processing at t5, when the second layer image is obtained.
- Further, the imaging unit 12 sequentially exposes the vertical lines of the imaging element between t4 and t9, and outputs the signals of all the vertical lines as the first layer image between t5 and t9. The image processing unit 212 of the first layer then starts image processing at t9, when the first layer image is obtained.
- FIG. 11 shows an example in which the imaging unit 12 performs a rolling shutter operation in which vertical lines are sequentially exposed
- the imaging unit 12 may perform a global shutter operation in which all vertical lines are exposed simultaneously.
- In that case, the imaging unit 12 performs exposure for acquiring the third layer image from t1 to t2, exposure for acquiring the second layer image from t2 to t3, and exposure for acquiring the first layer image from t4 to t5; the same processing timing as in FIG. 11 can thereby be realized.
- FIG. 12 is an explanatory diagram showing another example of the exposure timing and the processing timing of each layer in the imaging unit 12.
- The imaging unit 12 sequentially exposes the vertical lines of the imaging element between t1 and t6, and outputs the signals of all the vertical lines as the first layer image between t2 and t6. The image processing unit 212 of the first layer then starts image processing at t6, when the first layer image is obtained.
- Similarly, the imaging unit 12 sequentially exposes the vertical lines of the imaging element between t5 and t8, and outputs the signals of one half of the vertical lines as the second layer image between t6 and t8. The image processing unit 222 of the second layer then starts image processing at t8, when the second layer image is obtained.
- Further, the imaging unit 12 sequentially exposes the vertical lines of the imaging element between t7 and t9, and outputs the signals of one quarter of the vertical lines as the third layer image between t8 and t9. The image processing unit 232 of the third layer then starts image processing at t9, when the third layer image is obtained.
- FIG. 12 shows an example in which the imaging unit 12 performs a rolling shutter operation in which vertical lines are sequentially exposed
- the imaging unit 12 may perform a global shutter operation in which all vertical lines are exposed simultaneously.
- In that case, the imaging unit 12 performs exposure for acquiring the first layer image from t1 to t2, exposure for acquiring the second layer image from t5 to t6, and exposure for acquiring the third layer image from t7 to t8; the same processing timing as in FIG. 12 can thereby be realized.
- FIG. 13 is a flowchart showing the operation of the image processing apparatus 10 according to the second embodiment.
- the imaging unit 12 acquires a plurality of hierarchical images by imaging at different exposure timings and different resolutions (S420).
- The first layer image acquired by the imaging unit 12 is input to the first layer processing unit 210, the second layer image is input to the second layer processing unit 220, and the Kth layer image is input to the Kth layer processing unit 230.
- The image processing unit of each layer performs image processing corresponding to the layer's resolution on the input layer image (S430). The upscale unit of each layer then upscales the image processing result to the next higher resolution, the image composition unit of each layer combines the image processing result of its own layer with the upscaled result from the layer below (S440), and the synthesis result is output (S450).
- When parameter information for image processing in an upper layer is obtained by the image processing unit of a lower layer, the parameter information is transmitted to the upper layer by the processing result transmission units (228, ..., 238).
- As described above, image processing is performed in a plurality of layers having different resolutions, and the image processing results of the respective layers are combined. This reduces hardware scale and cost compared with performing image processing only on a high-resolution image, and yields a higher-definition image than performing image processing only on a low-resolution image.
- According to the first embodiment of the present disclosure, a plurality of images having different resolutions are acquired by reducing one original image. According to the second embodiment of the present disclosure, a plurality of images having different resolutions are acquired at the imaging stage by the imaging unit 12.
- each step in the processing of the image processing apparatus 10 described in this specification does not necessarily have to be processed in time series in the order described in the flowcharts.
- each step in the processing of the image processing apparatus 10 may be processed in an order different from that described in the flowcharts, or may be processed in parallel.
- as shown in the block diagrams of FIGS. 1, 2, and 10, it is possible to create a computer program for causing hardware such as a CPU, ROM, and RAM incorporated in the image processing apparatus 10 to perform the same functions as the components of the image processing apparatus 10 described above.
- a storage medium storing the computer program is also provided.
- at least a part of the blocks shown in the block diagrams of FIGS. 1, 2, and 10 can be configured by hardware.
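The layered flow summarized above — reduce, process each layer, upscale, and combine (steps S430–S450) — can be sketched roughly as follows. This is a minimal illustration, not the patent's implementation: the 2x2-average reduction, the nearest-neighbor upscaling, the identity `process_layer` placeholder, and the fixed blend weight are all assumptions introduced here.

```python
import numpy as np

def downscale(img):
    """Halve resolution by 2x2 averaging (illustrative stand-in)."""
    h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2
    x = img[:h, :w]
    return (x[0::2, 0::2] + x[1::2, 0::2] + x[0::2, 1::2] + x[1::2, 1::2]) / 4.0

def upscale(img):
    """Double resolution by nearest-neighbor repetition (illustrative stand-in)."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def process_layer(img, k):
    """Placeholder for resolution-appropriate image processing at layer k,
    e.g. noise reduction tuned to this layer's resolution."""
    return img

def hierarchical_process(original, num_layers=3, blend=0.5):
    """Process an image at several resolutions and merge the results upward.
    Assumes dimensions divisible by 2**(num_layers - 1)."""
    # Build the layer images by successively reducing the original image.
    layers = [original]
    for _ in range(num_layers - 1):
        layers.append(downscale(layers[-1]))
    # Process the lowest-resolution layer first, then upscale and combine
    # with each higher layer's own processing result.
    result = process_layer(layers[-1], num_layers - 1)
    for k in range(num_layers - 2, -1, -1):
        up = upscale(result)[: layers[k].shape[0], : layers[k].shape[1]]
        result = (1 - blend) * process_layer(layers[k], k) + blend * up
    return result
```

With `process_layer` replaced by a real per-layer filter, the loop mirrors the per-layer processing, upscaling, and synthesis units described above.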
- (1) An image processing apparatus including: an image processing unit that performs image processing on each of a plurality of images of the same subject having different resolutions; and an image synthesis unit that synthesizes the image processing results produced by the image processing unit for each of the plurality of images.
- (2) The image processing apparatus according to (1), wherein the image processing unit performs the same type of image processing on each of the plurality of images.
- (3) The image processing apparatus according to (1), wherein the image processing unit performs a different type of image processing on each of the plurality of images.
- (4) The image processing apparatus according to any one of (1) to (3), further including an upscaling unit that upscales the image processing result produced by the image processing unit for a lower-resolution image to the resolution of a higher-resolution image, wherein the image synthesis unit synthesizes the image processing result for the higher-resolution image with the image processing result for the lower-resolution image upscaled by the upscaling unit.
- (5) The image processing apparatus according to any one of (1) to (4), further including an image conversion unit that performs image conversion for obtaining the plurality of images having different resolutions from an input image.
- (6) The image processing apparatus according to any one of (1) to (4), further including an imaging unit that obtains the plurality of images by imaging at different exposure timings and different resolutions.
- (7) The image processing apparatus according to (6), wherein the image processing unit starts image processing of the plurality of images in the order in which they are obtained by the imaging unit.
- (8) The image processing apparatus according to (6) or (7), wherein the imaging unit acquires the plurality of images in order starting from the higher-resolution side.
- (9) The image processing apparatus according to (6) or (7), wherein the imaging unit acquires the plurality of images in order starting from the lower-resolution side.
- (10) The image processing apparatus according to (3), wherein the image processing unit performs at least luminance noise reduction and chroma noise reduction as image processing, and performs the luminance noise reduction on an image on a higher-resolution side than the image on which the chroma noise reduction is performed.
- (11) The image processing apparatus according to (10), wherein the image processing unit further performs defect correction as image processing, and performs the defect correction on an image on a higher-resolution side than the image on which the luminance noise reduction is performed.
- (12) An image processing method including: performing image processing on each of a plurality of images of the same subject having different resolutions; and combining the image processing results for each of the plurality of images.
Abstract
Description
1. Overview of the image processing apparatus
2. First embodiment
2-1. Configuration of the image processing apparatus according to the first embodiment
2-2. Operation of the image processing apparatus according to the first embodiment
3. Second embodiment
4. Conclusion
The technology according to the present disclosure can be implemented in various forms, as described in detail in "2. First embodiment" through "3. Second embodiment" by way of example. The image processing apparatus 10 according to each embodiment includes:
A. an image processing unit (212, 222, ..., 232) that performs image processing on each of a plurality of images of the same subject having different resolutions; and
B. an image synthesis unit (214, 224) that synthesizes the image processing results produced by the image processing units for each of the plurality of images.
Here, the background of the embodiments of the present disclosure will be described. In recent years, as image sensors (imaging elements) have gained more pixels, the number of pixels in a captured image obtained by an imaging apparatus has increased and the pixel size has tended to shrink. While a larger pixel count allows finer detail of the subject to be expressed, a smaller pixel size increases noise, making image-quality degradation more noticeable. Such degradation can be corrected effectively by noise removal using a spatial filter. However, as the number of pixels in an image increases, so does the number of pixels the spatial filter must process, which increases the hardware scale and cost required to implement the spatial filter.
<2-1. Configuration of the image processing apparatus according to the first embodiment>
FIG. 2 is a block diagram showing the configuration of the hierarchical image processing unit 20-1 of the image processing apparatus 10 according to the first embodiment. As shown in FIG. 2, the hierarchical image processing unit 20-1 includes image reduction units 202 and 203, a first layer processing unit 210 through a Kth layer processing unit 230, and processing result transmission units 228 and 238.
- First processing example
As a first processing example, the image processing units (212, 222, ..., 232) of the respective layers may perform the same image processing. For example, each layer's image processing unit may perform the same edge-preserving NR, as shown in FIG. 3.
As a second processing example, the image processing units (212, 222, ..., 232) of the respective layers may perform different NR. More specifically, each layer's image processing unit may perform NR suited to the resolution of its layer. For example, luminance noise such as on-screen graininess is dominated by high-frequency components, whereas color noise is dominated by low-frequency components.
As a third processing example, the image processing units (212, 222, ..., 232) of the respective layers may perform different types of image processing. More specifically, each layer's image processing unit may perform image processing suited to the resolution of its layer. For example, global luminance calculation, flicker detection, GMV detection, chroma NR, tone conversion, flicker correction, and camera-shake correction can be performed at low resolution, whereas defect correction is difficult to perform properly unless it is done at high resolution.
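As an illustration of resolution-appropriate processing, low-frequency chroma noise can be suppressed on a reduced image while high-frequency luminance noise is handled at full resolution. The YCbCr-style channel split, the box filter standing in for a real NR filter, and the single half-resolution chroma layer are illustrative assumptions introduced here, not the configuration of the apparatus 10.

```python
import numpy as np

def half(ch):
    """2x2-average downscale (illustrative; assumes even dimensions)."""
    return (ch[0::2, 0::2] + ch[1::2, 0::2] + ch[0::2, 1::2] + ch[1::2, 1::2]) / 4.0

def double(ch):
    """Nearest-neighbor upscale back to full resolution (illustrative)."""
    return ch.repeat(2, axis=0).repeat(2, axis=1)

def box_blur(ch, r=1):
    """Box filter standing in for a real noise-reduction filter."""
    pad = np.pad(ch, r, mode='edge')
    acc = np.zeros_like(ch, dtype=float)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            acc += pad[dy:dy + ch.shape[0], dx:dx + ch.shape[1]]
    return acc / (2 * r + 1) ** 2

def layered_nr(y, cb, cr):
    """Luminance NR at full resolution; chroma NR on a half-resolution image."""
    y_nr = box_blur(y)                    # luminance noise: high-frequency, full res
    cb_nr = double(box_blur(half(cb)))    # chroma noise: low-frequency, half res
    cr_nr = double(box_blur(half(cr)))
    return y_nr, cb_nr, cr_nr
```

Running the chroma filter on a quarter of the pixels is what yields the hardware savings the text describes, at little cost in chroma fidelity because color noise has little high-frequency content.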
As described above, the image processing units (212, 222, ..., 232) of the respective layers can perform image processing in a variety of patterns. The timing at which each layer's image processing unit performs image processing is not particularly limited. An example of this timing is described below.
The configuration of the image processing apparatus 10 according to the first embodiment of the present disclosure has been described above. Next, the operation of the image processing apparatus 10 according to the first embodiment will be described.
The first embodiment of the present disclosure has been described above. Next, the second embodiment of the present disclosure will be described. Whereas the first embodiment obtains a plurality of images having different resolutions by reducing one original image, the second embodiment can obtain a plurality of images having different resolutions at the imaging stage, by the imaging unit 12. This is described in detail below.
FIG. 10 is a block diagram showing the configuration of the hierarchical image processing unit 20-2 of the image processing apparatus 10 according to the second embodiment. As shown in FIG. 10, the hierarchical image processing unit 20-2 includes the first layer processing unit 210 through the Kth layer processing unit 230, and the processing result transmission units 228 and 238.
Here, the exposure timing in the imaging unit 12 for acquiring a plurality of images having different resolutions, and the processing timing of each layer, are described with reference to FIGS. 11 and 12.
The configuration of the image processing apparatus 10 according to the second embodiment of the present disclosure has been described above. Next, the operation of the image processing apparatus 10 according to the second embodiment will be described.
As described above, according to the embodiments of the present disclosure, image processing is performed in a plurality of layers having different resolutions, and the image processing results of the layers are synthesized. This makes it possible to keep the hardware scale and cost lower than when image processing is performed only on a high-resolution image, while obtaining a higher-definition image than when image processing is performed only on a low-resolution image.
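A back-of-the-envelope calculation illustrates the hardware-scale point: because each lower layer has a quarter of the pixels of the layer above it, the total pixel count of the resolution pyramid converges to at most 4/3 of the full-resolution count, so adding low-resolution layers is cheap compared with doing all processing at full resolution. The image size and layer count below are arbitrary examples, not values from the disclosure.

```python
def pyramid_pixels(width, height, num_layers):
    """Total pixels across a pyramid where each layer halves both dimensions."""
    total, w, h = 0, width, height
    for _ in range(num_layers):
        total += w * h
        w, h = w // 2, h // 2
    return total

full = 4000 * 3000                    # a 12-megapixel top layer
pyr = pyramid_pixels(4000, 3000, 4)   # four layers: 12M + 3M + 0.75M + 0.1875M
print(pyr / full)                     # 1.328125 -- only ~33% more pixels in total
```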
(1)
An image processing apparatus including:
an image processing unit that performs image processing on each of a plurality of images of the same subject having different resolutions; and
an image synthesis unit that synthesizes the image processing results produced by the image processing unit for each of the plurality of images.
(2)
The image processing apparatus according to (1), wherein the image processing unit performs the same type of image processing on each of the plurality of images.
(3)
The image processing apparatus according to (1), wherein the image processing unit performs a different type of image processing on each of the plurality of images.
(4)
The image processing apparatus according to any one of (1) to (3), further including an upscaling unit that upscales the image processing result produced by the image processing unit for a lower-resolution image to the resolution of a higher-resolution image,
wherein the image synthesis unit synthesizes the image processing result produced by the image processing unit for the higher-resolution image with the image processing result for the lower-resolution image upscaled by the upscaling unit.
(5)
The image processing apparatus according to any one of (1) to (4), further including an image conversion unit that performs image conversion for obtaining the plurality of images having different resolutions from an input image.
(6)
The image processing apparatus according to any one of (1) to (4), further including an imaging unit that obtains the plurality of images by imaging at different exposure timings and different resolutions.
(7)
The image processing apparatus according to (6), wherein the image processing unit starts image processing of the plurality of images in the order in which they are obtained by imaging by the imaging unit.
(8)
The image processing apparatus according to (6) or (7), wherein the imaging unit acquires the plurality of images in order starting from the higher-resolution side.
(9)
The image processing apparatus according to (6) or (7), wherein the imaging unit acquires the plurality of images in order starting from the lower-resolution side.
(10)
The image processing apparatus according to (3), wherein the image processing unit performs at least luminance noise reduction and chroma noise reduction as image processing, and
performs the luminance noise reduction on an image on a higher-resolution side than the image on which the chroma noise reduction is performed.
(11)
The image processing apparatus according to (10), wherein the image processing unit further performs defect correction as image processing, and
performs the defect correction on an image on a higher-resolution side than the image on which the luminance noise reduction is performed.
(12)
An image processing method including:
performing image processing on each of a plurality of images of the same subject having different resolutions; and
synthesizing the image processing results for each of the plurality of images.
12 imaging unit
14 storage unit
20 hierarchical image processing unit
30 display unit
202, 203 image reduction unit
210 first layer processing unit
212 image processing unit
214 image synthesis unit
220 second layer processing unit
222 image processing unit
224 image synthesis unit
226 upscaling unit
228 processing result transmission unit
230 Kth layer processing unit
232 image processing unit
236 upscaling unit
238 processing result transmission unit
Claims (12)
- An image processing apparatus comprising: an image processing unit that performs image processing on each of a plurality of images of the same subject having different resolutions; and an image synthesis unit that synthesizes the image processing results produced by the image processing unit for each of the plurality of images.
- The image processing apparatus according to claim 1, wherein the image processing unit performs the same type of image processing on each of the plurality of images.
- The image processing apparatus according to claim 1, wherein the image processing unit performs a different type of image processing on each of the plurality of images.
- The image processing apparatus according to claim 1, further comprising an upscaling unit that upscales the image processing result produced by the image processing unit for a lower-resolution image to the resolution of a higher-resolution image, wherein the image synthesis unit synthesizes the image processing result produced by the image processing unit for the higher-resolution image with the image processing result for the lower-resolution image upscaled by the upscaling unit.
- The image processing apparatus according to claim 1, further comprising an image conversion unit that performs image conversion for obtaining the plurality of images having different resolutions from an input image.
- The image processing apparatus according to claim 1, further comprising an imaging unit that obtains the plurality of images by imaging at different exposure timings and different resolutions.
- The image processing apparatus according to claim 6, wherein the image processing unit starts image processing of the plurality of images in the order in which they are obtained by imaging by the imaging unit.
- The image processing apparatus according to claim 6 or 7, wherein the imaging unit acquires the plurality of images in order starting from the higher-resolution side.
- The image processing apparatus according to claim 6 or 7, wherein the imaging unit acquires the plurality of images in order starting from the lower-resolution side.
- The image processing apparatus according to claim 3, wherein the image processing unit performs at least luminance noise reduction and chroma noise reduction as image processing, and performs the luminance noise reduction on an image on a higher-resolution side than the image on which the chroma noise reduction is performed.
- The image processing apparatus according to claim 10, wherein the image processing unit further performs defect correction as image processing, and performs the defect correction on an image on a higher-resolution side than the image on which the luminance noise reduction is performed.
- An image processing method comprising: performing image processing on each of a plurality of images of the same subject having different resolutions; and synthesizing the image processing results for each of the plurality of images.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12868308.3A EP2814238A1 (en) | 2012-02-09 | 2012-11-15 | Image processing device and image processing method |
CN201280068849.1A CN104106259B (zh) | 2012-02-09 | 2012-11-15 | Image processing device and image processing method |
CA 2854280 CA2854280A1 (en) | 2012-02-09 | 2012-11-15 | Image processing apparatus and image processing method |
RU2014131903A RU2014131903A (ru) | 2012-02-09 | 2012-11-15 | Image processing apparatus and image processing method |
US14/373,995 US9240065B2 (en) | 2012-02-09 | 2012-11-15 | Image processing apparatus and image processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-025904 | 2012-02-09 | ||
JP2012025904 | 2012-02-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013118364A1 (ja) | 2013-08-15 |
Family
ID=48947151
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/079622 WO2013118364A1 (ja) | Image processing apparatus and image processing method | 2012-02-09 | 2012-11-15 |
Country Status (7)
Country | Link |
---|---|
US (1) | US9240065B2 (ja) |
EP (1) | EP2814238A1 (ja) |
JP (1) | JPWO2013118364A1 (ja) |
CN (1) | CN104106259B (ja) |
CA (1) | CA2854280A1 (ja) |
RU (1) | RU2014131903A (ja) |
WO (1) | WO2013118364A1 (ja) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5982751B2 (ja) | 2011-08-04 | 2016-08-31 | Sony Corporation | Image processing apparatus, image processing method, and program |
CN108429883A (zh) * | 2018-05-30 | 2018-08-21 | OPPO (Chongqing) Intelligent Technology Co., Ltd. | Photographing apparatus, electronic device, and image acquisition method |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006310999A (ja) * | 2005-04-27 | 2006-11-09 | Sony Corp | Image processing apparatus and method, and program |
JP2007243917A (ja) * | 2006-02-10 | 2007-09-20 | Nikon Corp | Imaging apparatus and image processing program |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4305598B2 (ja) | 2000-06-05 | 2009-07-29 | FUJIFILM Corporation | Camera aperture control method and apparatus, and camera |
US7460178B2 (en) * | 2004-05-14 | 2008-12-02 | Sony Corporation | Image processing apparatus and image processing method |
JP2005326690A (ja) * | 2004-05-14 | 2005-11-24 | Sony Corp | Image processing apparatus and image processing method |
US8477210B2 (en) * | 2008-11-21 | 2013-07-02 | Mitsubishi Electric Corporation | Image processing device and image processing method |
US8687859B2 (en) * | 2009-10-14 | 2014-04-01 | Carestream Health, Inc. | Method for identifying a tooth region |
- 2012
- 2012-11-15 EP EP12868308.3A patent/EP2814238A1/en not_active Withdrawn
- 2012-11-15 RU RU2014131903A patent/RU2014131903A/ru unknown
- 2012-11-15 CN CN201280068849.1A patent/CN104106259B/zh active Active
- 2012-11-15 JP JP2013557368A patent/JPWO2013118364A1/ja active Pending
- 2012-11-15 US US14/373,995 patent/US9240065B2/en active Active
- 2012-11-15 CA CA 2854280 patent/CA2854280A1/en not_active Abandoned
- 2012-11-15 WO PCT/JP2012/079622 patent/WO2013118364A1/ja active Application Filing
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015111237A1 (ja) * | 2014-01-24 | 2015-07-30 | Toshiba Corporation | Image processing system, image processing method, and image processing program |
JP2015138519A (ja) * | 2014-01-24 | 2015-07-30 | Toshiba Corporation | Image processing system, image processing method, and image processing program |
WO2018061508A1 (ja) * | 2016-09-28 | 2018-04-05 | Sony Corporation | Imaging element, image processing apparatus, image processing method, and program |
Also Published As
Publication number | Publication date |
---|---|
CN104106259A (zh) | 2014-10-15 |
RU2014131903A (ru) | 2016-02-20 |
US9240065B2 (en) | 2016-01-19 |
EP2814238A1 (en) | 2014-12-17 |
JPWO2013118364A1 (ja) | 2015-05-11 |
US20140341483A1 (en) | 2014-11-20 |
CA2854280A1 (en) | 2013-08-15 |
CN104106259B (zh) | 2018-01-30 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 12868308; Country of ref document: EP; Kind code of ref document: A1
| WWE | WIPO information: entry into national phase | Ref document number: 2012868308; Country of ref document: EP
| ENP | Entry into the national phase | Ref document number: 2854280; Country of ref document: CA
| ENP | Entry into the national phase | Ref document number: 2013557368; Country of ref document: JP; Kind code of ref document: A
| WWE | WIPO information: entry into national phase | Ref document number: 14373995; Country of ref document: US
| ENP | Entry into the national phase | Ref document number: 2014131903; Country of ref document: RU; Kind code of ref document: A
| NENP | Non-entry into the national phase | Ref country code: DE