WO2017038090A1 - Image processing device, image processing method, and recording medium whereon image processing program is stored - Google Patents


Info

Publication number
WO2017038090A1
WO2017038090A1 · PCT/JP2016/003969 · JP2016003969W
Authority
WO
WIPO (PCT)
Prior art keywords
image
gradation
luminance
resolution
processing apparatus
Prior art date
Application number
PCT/JP2016/003969
Other languages
French (fr)
Japanese (ja)
Inventor
融 濡木
友宏 松木
松本 道弘
Original Assignee
日本電気航空宇宙システム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気航空宇宙システム株式会社
Priority to JP2017537550A (granted as JP6744317B2)
Publication of WO2017038090A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction

Definitions

  • the present invention relates to a technique for generating high-resolution image data using a plurality of image data having different resolutions.
  • A technique is known for obtaining a high-resolution multispectral image from an image having a high resolution and an image having a low resolution.
  • Specifically, processing is known that creates a high-resolution multispectral image using a panchromatic image, which is a high-resolution image, and a multispectral image, which is a low-resolution image.
  • Techniques related to such pan-sharpening processing or tone mapping processing are disclosed in the following patent documents.
  • Patent Document 1 (International Publication No. 2015/037189) discloses a technique related to noise removal in an image generated by pan-sharpening processing.
  • the technique disclosed in Patent Document 1 performs multiple resolution decomposition of a high-resolution image to a resolution corresponding to the low-resolution image. Such a technique reduces noise by correcting the decomposed image using a low-resolution image.
  • Patent Document 2 (Japanese Patent Application Laid-Open No. 2014-174817) discloses a technique that, when tone mapping processing is performed on a plurality of high-gradation images partially overlapping each other, determines the coefficients used for luminance correction of each image based on the luminance value distribution of each image.
  • Patent Document 3 (Japanese Patent Laid-Open No. 2013-109759) discloses a technique related to pan-sharpening processing using dictionary learning.
  • The technique disclosed in Patent Document 3 extracts feature vectors from a panchromatic image and a multispectral image, and decomposes each feature vector into a vector having no missing values and a vector having missing values.
  • the technique disclosed in Patent Literature 3 predicts a missing value (color information in a high-resolution image) based on a dictionary learned using a vector having no missing value.
  • Patent Document 4 (Japanese Patent Application Laid-Open No. 2011-1000042) discloses a technique for executing pan-sharpening processing using a panchromatic image captured from a satellite and a color (multispectral) image that is geometrically transformed so as to be superposed on the panchromatic image.
  • Patent Document 5 (Japanese Patent Laid-Open No. 2015-033582) discloses a technique for executing pan-sharpening processing in an X-ray computed tomography apparatus, using sparse spectral data detected by photon-counting detection and dense panchromatic data detected by an energy integrator.
  • Patent Document 6 (U.S. Pat. No. 8,737,733) discloses a technique that, in order to improve the color of a pan-sharpened image, converts luminance values using a statistical model so that the luminance value of the panchromatic image and the luminance value of the multispectral image become close to each other.
  • Non-Patent Document 1 discloses a technique for reducing information loss by decomposing a high gradation image into a base layer and a detail layer and performing range compression in accordance with the depicted scene.
  • Tone mapping processing that locally controls the range compression rate so as to reduce local information loss in image data has a high calculation cost. When such processing is applied to a high-resolution panchromatic image or pan-sharpened image, the processing load is therefore high.
  • When tone mapping and pan-sharpening are executed, it is required to generate a high-resolution image with appropriate gradation based on the gradation (for example, the luminance gradation) of the original multispectral image or panchromatic image. None of the above patent documents provides a technology that can directly solve these problems.
  • the main object of the present invention is to provide a technology capable of efficiently generating a high-resolution image having an appropriate gradation using a plurality of images having different resolutions.
  • To achieve this object, an image processing apparatus according to one aspect of the present invention has the following arrangement. That is, the image processing apparatus includes: a gradation adjustment image generation unit that generates a first gradation-converted image by converting the gradation of a first image having one or more pieces of color information; and a composite image generation unit that generates a composite image having one or more pieces of color information and a resolution at least higher than that of the first image, by adjusting the first image using feature information relating to the gradation of a second gradation-converted image, generated based on the first image and a second image having a higher resolution than the first image, and using the gradation of the first gradation-converted image.
  • An image processing method according to one aspect of the present invention generates a first gradation-converted image by converting the gradations of pixels included in a first image having one or more pieces of color information, and generates a composite image that has one or more pieces of color information and a resolution at least higher than that of the first image, by adjusting the first image using the first gradation-converted image and feature information relating to the gradation of a second gradation-converted image generated based on the first image and a second image having a higher resolution than the first image.
  • The object can also be achieved by a computer program that realizes, by means of a computer, the image processing apparatus having the above configuration or the image processing method, and by a computer-readable recording medium in which the computer program is stored.
  • FIG. 1 is a block diagram illustrating a functional configuration of an image processing apparatus according to the first embodiment of the present invention.
  • FIG. 2 is an explanatory diagram schematically showing an image having the same resolution as the panchromatic image.
  • FIG. 3 is a flowchart showing a specific example of the operation of the image processing apparatus according to the first embodiment of the present invention.
  • FIG. 4 is a block diagram illustrating a functional configuration of an image processing apparatus according to the second embodiment of the present invention.
  • FIG. 5 is a block diagram illustrating a functional configuration of an image processing apparatus according to a modification of the second embodiment of the present invention.
  • FIG. 6 is a diagram illustrating a configuration of a hardware device capable of realizing the image processing apparatus according to each embodiment of the present invention.
  • FIG. 7 is a diagram schematically illustrating an example of a cumulative histogram when the luminance of the multispectral image before and after tone mapping and the luminance of the panchromatic image are normalized.
  • A pan-sharpening process that creates a high-resolution multispectral image from a high-resolution panchromatic image and a low-resolution multispectral image is known.
  • pan-sharpening processing is used, for example, as processing for improving the visibility of a satellite image captured by an artificial satellite.
  • the panchromatic image is, for example, image data composed of a single color (monochrome) with a high spatial resolution.
  • the multispectral image is image data including a plurality of pieces of spectral information (for example, color information in the visible light region), although the spatial resolution is lower than that of the panchromatic image.
  • the content captured in the panchromatic image and the content captured in the multispectral image may be the same or substantially the same. In addition, as long as it is possible to determine the correspondence between the content captured in one image and the content captured in the other image, the content captured in each image may not be the same.
  • As a method for capturing (or generating) these images (for example, imaging using a multispectral sensor and a panchromatic sensor), a well-known technique can be employed.
  • the image data may have a gradation higher than the gradation that can be output by the display system (for example, graphic board, display, printer, etc.).
  • satellite images captured by artificial satellites often have gradations higher than the gradations that can be output by the display system.
  • When such image data is displayed as-is, the display range is compressed (gradations are crushed) and information loss occurs.
  • a tone mapping process is known that reduces the loss of information by converting (compressing) the range in the gradation direction.
  • However, a tone mapping process that performs different processing for each local area in an image, generally called local tone mapping, has a high calculation cost. When it is applied to a high-resolution image, the processing load is therefore high.
  • In general, there is a correlation between the luminance (luminance distribution) of a panchromatic image and the luminance (luminance distribution) of a multispectral image. That is, a portion having high luminance in the panchromatic image is assumed to also have high luminance in the multispectral image, and vice versa.
  • When tone mapping processing is performed on a multispectral image, the luminance distribution of the multispectral image is compressed, for example, to a certain range centered on a specific average luminance. That is, the shape of the luminance distribution differs between the original multispectral image and the tone-mapped multispectral image.
  • As a result, the correlation between the luminance of the multispectral image and the luminance of the panchromatic image is lost.
  • In pan-sharpening processing, processing that matches the luminance of the panchromatic image to the luminance of the multispectral image (luminance matching) may be executed.
  • In that case, since the correlation between the luminance of the tone-mapped multispectral image and the luminance of the panchromatic image has been lost, appropriate luminance matching cannot be performed.
  • Consequently, a pan-sharpened image having an appropriate gradation may not be generated.
  • The technique according to the present invention, exemplified using the following embodiments, can reduce the amount of data handled in the tone mapping processing by, for example, incorporating the tone mapping processing into the pan-sharpening processing.
  • the technology according to the present invention can realize high-precision luminance matching processing in, for example, pan-sharpening processing.
  • the technology according to the present invention can efficiently create a pan-sharpened image subjected to tone mapping processing.
  • the image processing apparatus described below may be configured using a single apparatus (physical or virtual apparatus). Such an image processing apparatus may be realized using a plurality of apparatuses (physical or virtual apparatuses). That is, in this case, the image processing apparatus is realized as a system including a plurality of apparatuses. A plurality of devices constituting such an image processing device may be communicably connected via a communication network (communication line) that is wired, wireless, or an arbitrary combination thereof.
  • the communication network may be a physical communication network or a virtual communication network.
  • FIG. 1 is a block diagram illustrating a functional configuration of the image processing apparatus 100.
  • The direction of each arrow in FIG. 1 indicates an example of the transfer direction of image data and does not limit this embodiment.
  • the image processing apparatus 100 is communicably connected to an input apparatus 200 that receives a low-resolution multispectral image and a high-resolution panchromatic image.
  • the image processing apparatus 100 is communicably connected to an output apparatus 300 that outputs a pan-sharpened image generated by the image processing apparatus 100.
  • a communication method for connecting the image processing apparatus 100, the input apparatus 200, and the output apparatus 300 may be selected as appropriate.
  • The image processing apparatus 100 includes an enlargement processing unit 101, a luminance extraction unit 102, a tone mapping processing unit 103, an enlargement processing unit 104, a local feature extraction unit 105, an output luminance calculation unit 106, and an image composition unit 107. These components of the image processing apparatus 100 may be communicably connected using an arbitrary communication method.
  • the enlargement processing unit 101 receives a multispectral image (sometimes referred to as a “first image”) from the input device 200 described later.
  • a multispectral image is, for example, a color image including a plurality of pieces of spectral information (color information) as described above.
  • the multispectral image is an image having a lower spatial resolution than a panchromatic image (sometimes referred to as a “second image”) that the image processing apparatus 100 separately receives from the input device 200.
  • the enlargement processing unit 101 converts the resolution of the multispectral image received from the input device 200. Specifically, the enlargement processing unit 101 up-samples, for example, the resolution of the multispectral image so as to be the same resolution as the panchromatic image received from the input device 200. As a method for converting the resolution of an image (upsampling method), a known method may be employed.
  • the luminance extraction unit 102 extracts gradation information related to the image from the input multispectral image.
  • the luminance extraction unit 102 may extract gradation information related to the luminance of the input multispectral image.
  • the gradation information regarding the luminance of the image may be simply referred to as “luminance information”.
  • the luminance extraction unit 102 may extract the luminance (luminance value) for each pixel constituting the multispectral image, for example, and use the set as luminance information.
  • the luminance information may be, for example, a luminance image generated by extracting only the luminance of each pixel included in the input multispectral image.
  • the luminance range of the multispectral image (for example, the number of quantization bits) and the luminance range of the panchromatic image may be the same or different.
  • the luminance extraction unit 102 may extract other information that can express the light and shade of each pixel constituting the image, other than “luminance”, as the gradation information regarding the image.
  • The tone mapping processing unit 103 compresses the luminance range (range compression processing) by executing tone mapping on the low-(spatial-)resolution luminance information extracted from the multispectral image by the luminance extraction unit 102. For example, the tone mapping processing unit 103 may execute the range compression processing by converting (compressing) the luminance values included in the luminance information extracted by the luminance extraction unit 102 so that they fall within a certain luminance range. For example, the tone mapping processing unit 103 may perform the range compression processing on the luminance image extracted from the multispectral image by the luminance extraction unit 102.
  • the luminance information (luminance image) on which the range compression processing has been performed may be referred to as a “first gradation converted image”.
  • the enlargement processing unit 104 converts the resolution of the luminance information (the first gradation conversion image) that has undergone range compression in the tone mapping processing unit 103. Similar to the enlargement processing unit 101, the enlargement processing unit 104 may upsample the luminance information so as to have the same resolution as the panchromatic image received from the input device 200.
  • the local feature extraction unit 105 generates local feature information based on information (luminance change information) representing a local luminance change in the panchromatic image input from the input device 200.
  • the luminance change information may be a value indicating the degree of change in luminance.
  • the local feature extraction unit 105 performs matching (luminance matching) between the luminance of the panchromatic image input from the input device 200 and the luminance of the multispectral image, for example.
  • the local feature extraction unit 105 converts local luminance change information in the panchromatic image into a luminance range of the multispectral image based on the result of the luminance matching.
  • Such a converted panchromatic image may be referred to as a “second gradation converted image”.
  • the local feature extraction unit 105 generates local feature information based on the second tone conversion image.
  • The output luminance calculation unit 106 generates high-resolution luminance information using the local features generated by the local feature extraction unit 105 and the (range-compressed) luminance information upsampled by the enlargement processing unit 104.
  • For example, the output luminance calculation unit 106 may calculate the high-resolution luminance information by multiplying the local features generated by the local feature extraction unit 105 by the luminance information upsampled by the enlargement processing unit 104.
  • the high-resolution luminance information generated by the output luminance calculation unit 106 includes local features of the panchromatic image and global features in which the luminance range is compressed.
  • the image composition unit 107 generates a pan-sharpened image based on the multispectral image upsampled by the enlargement processing unit 101 and the high-resolution luminance information generated by the output luminance calculation unit 106.
  • For example, the image synthesis unit 107 may synthesize the pan-sharpened image by replacing the luminance information of the multispectral image upsampled by the enlargement processing unit 101 with the high-resolution luminance information generated by the output luminance calculation unit 106.
  • the image composition unit 107 may set the color component (color information) of the pan-sharpened image using the color information of the multispectral image upsampled by the enlargement processing unit 101.
  • the pan-sharpened image may be referred to as a “composite image”.
  • the specific method for executing the pan sharpening process is not limited to the above.
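  • As one concrete illustration of the luminance-substitution synthesis described above — the specification leaves the exact method open, so the function name and the simple per-band scaling below are assumptions, not the patent's definitive procedure — a sketch in Python:

```python
import numpy as np

def synthesize_pansharpened(up_ms, new_lum):
    """Replace the luminance of the up-sampled multispectral image with the
    high-resolution output luminance while keeping its colour information:
    each band is scaled by new_lum / old_lum (a common luminance-substitution
    scheme; the channel-mean luminance proxy here is an assumption)."""
    old_lum = up_ms.mean(axis=-1)                       # simple luminance proxy
    ratio = new_lum / np.maximum(old_lum, 1e-12)        # avoid division by zero
    return up_ms * ratio[..., None]                     # scale all bands alike
```

Scaling every band by the same ratio preserves the band ratios (the colour) of the up-sampled multispectral image while imposing the new luminance.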
  • the input device 200 includes a multispectral image input unit 201 and a panchromatic image input unit 202.
  • the input device 200 may be connected to an image generation device (not shown) that captures a multispectral image and a panchromatic image.
  • the input device 200 may include such an image generation device.
  • the input device 200 supplies (provides) the multispectral image and panchromatic image received as input to the image processing device 100.
  • the multispectral image input unit 201 receives, as an input, a multispectral image captured using, for example, a multispectral sensor, and supplies the multispectral image to the image processing apparatus 100.
  • a multispectral image may be sent to the enlargement processing unit 101 and the luminance extraction unit 102 in the image processing apparatus 100, for example.
  • the panchromatic image input unit 202 receives, for example, a panchromatic image captured using a panchromatic sensor or the like, and supplies the panchromatic image to the image processing apparatus 100. Such a panchromatic image may be sent to the local feature extraction unit 105 in the image processing apparatus 100, for example.
  • the output device 300 receives the pan-sharpened image generated by the image processing device 100 and outputs it to a display system (display device or the like) not shown.
  • the output device 300 may output the received pan-sharpened image to a display system such as a graphic processing device, various display devices, or a printer in an information processing device such as a computer.
  • the output device 300 may include such a display system.
  • FIG. 3 is a flowchart illustrating the operation of the image processing apparatus 100 according to this embodiment.
  • the flowchart illustrated in FIG. 3 is one specific example, and the order of each process can be appropriately changed within a range that does not affect the process result.
  • each process of the flowchart illustrated in FIG. 3 may be executed in parallel as long as the process result is not affected.
  • the low-resolution multispectral image and the high-resolution panchromatic image received by the input device 200 are provided to the image processing device 100 (step S3001).
  • a multispectral image may be represented as “M” and a panchromatic image may be represented as “P”.
  • the image processing apparatus 100 converts the resolution of the provided multispectral image (step S3002).
  • the enlargement processing unit 101 may upsample the multispectral image to the same resolution as the panchromatic image by using, for example, a bilinear method.
  • the enlargement processing unit 101 may appropriately use a method other than the bilinear method (for example, a bicubic interpolation method), and the upsampling method is not limited to a specific method.
  • Hereinafter, the up-sampled multispectral image may be represented as "U(M)". Note that the processing in step S3002 and the processing in steps S3003 to S3007 described below may be executed in parallel.
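  • The bilinear upsampling of step S3002 can be sketched as follows (a minimal single-band illustration; the function name is hypothetical, and a production system would typically use a library resampler):

```python
import numpy as np

def upsample_bilinear(img, out_h, out_w):
    """Upsample a 2-D band to (out_h, out_w) by bilinear interpolation."""
    in_h, in_w = img.shape
    # Sampling positions in the input grid (endpoints aligned).
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = ys - y0; wx = xs - x0
    # Blend the four neighbouring pixels of each output sample.
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy)[:, None] + bot * wy[:, None]
```

A multispectral image would be upsampled by applying this per band; swapping in bicubic interpolation, as the text notes, is equally valid.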
  • For example, the luminance extraction unit 102 may extract the luminance information "I_M" by performing color space conversion processing on the multispectral image "M".
  • The color space conversion may be, for example, processing that converts the color space of the multispectral image "M" into a color space such as HLS (Hue, Lightness, Saturation), YCbCr, or HCS (Hyperspherical Color Space).
  • the method for converting the color space is not limited to a specific method.
  • the luminance extraction unit 102 may extract the luminance information using a known method other than the color space conversion described above.
  • The luminance information "I_M" includes the luminance value of each pixel included in the multispectral image "M" and has a resolution equal to that of "M".
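  • As a minimal sketch of this luminance extraction, the following uses the ITU-R BT.601 RGB→YCbCr luma weights — one concrete choice among the color spaces listed above; the function name and the assumption of a 3-band RGB-ordered input are illustrative only:

```python
import numpy as np

def extract_luminance(ms_image):
    """Return the luma (Y) plane of an H x W x 3 RGB-ordered multispectral
    image using the BT.601 weights; HLS or HCS conversions could be
    substituted without changing the rest of the pipeline."""
    r, g, b = ms_image[..., 0], ms_image[..., 1], ms_image[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b
```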
  • The tone mapping processing unit 103 separates (decomposes) the luminance information "I_M" of the multispectral image "M" extracted by the luminance extraction unit 102 into a global feature component and a local feature component.
  • The global feature component represents, for example, a spatially smooth luminance change in the image, while the local feature component represents, for example, a local luminance change around a certain target pixel.
  • As the global feature component, information representing the spatial illumination light included in the image may be used; as the local feature component, information representing the reflectance of objects included in the image may be used.
  • Light incident on the human visual system is represented by the product of the illumination light and the reflectance of objects, and human perception (sight) is known to be sensitive to the reflectance of objects. Accordingly, by compressing the illumination light component (range compression), it is possible to effectively compress the luminance range of the entire image while maintaining the reflectance component that affects visual perception.
  • Specifically, the tone mapping processing unit 103 takes the luminance information "I_M" of the multispectral image as the input "I" of the following Expression (1) and separates "I_M" into the global feature component "B(I)" and the local feature component "T(I)":

    x_ij(I) = x_ij(B(I)) × x_ij(T(I))  ... (1)

  • Here, "x_ij(I)" is the pixel value (a real number) of the pixel in row "i", column "j" of the image (or luminance information) "I"; the pixel value may be the luminance value of that pixel. The image "I" may be an arbitrary image (or luminance information), "B(I)" represents the global feature component of "I", and "T(I)" represents the local feature component of "I".
  • The tone mapping processing unit 103 may obtain the global feature component "B(I_M)" of "I_M" by, for example, applying an edge-preserving smoothing filter to the luminance information "I_M". Assuming that the illumination light component included in an image changes more gently than the reflectance component from objects, the illumination light component can be extracted by using an edge-preserving smoothing filter.
  • The edge-preserving smoothing filter may be any filter that preserves the edge information included in an image and can average (smooth) the image. For example, a k-nearest-neighbor averaging filter, a bilateral filter, a median filter, or a WLS filter (Weighted Least Squares filter) may be employed.
  • The tone mapping processing unit 103 may obtain the local feature component "T(I_M)" using, for example, the following Expression (2):

    x_ij(T(I)) = x_ij(I) / x_ij(B(I))  ... (2)

  • Here, "I" represents an arbitrary image (or luminance information); replacing "I" in Expression (2) with "I_M" yields the local feature component "T(I_M)".
  • That is, the local feature component "T(I_M)" is obtained as the ratio (luminance ratio) between the luminance of each pixel included in the luminance information "I_M" and the global feature component, and represents the degree of change of the luminance value relative to the global feature component "B(I_M)" at each pixel.
  • It is considered that the global feature component "B(I_M)" represents the illumination light component in the multispectral image, and the local feature component "T(I_M)" represents the reflectance from objects.
  • Next, the tone mapping processing unit 103 compresses the range (pixel value range) by applying a smooth range compression function "f" to each element (pixel) "x_ij(B(I_M))" of the global feature component "B(I_M)". The tone mapping processing unit 103 then multiplies the range-compressed pixel value of each element by the pixel value of each element "x_ij(T(I_M))" of the local feature component previously separated by Expression (1):

    x_ij(R(I_M)) = f(x_ij(B(I_M))) × x_ij(T(I_M))  ... (3)

  • As described above, the tone mapping processing unit 103 obtains the tone-mapped luminance information "R(I_M)" (the first gradation-converted image) of the multispectral image.
  • As the compression function "f", for example, a function represented by the following Expression (4) is used.
  • The compression function illustrated in Expression (4) takes human visual characteristics into account and can compress visually equivalently regardless of the magnitude of the luminance value of each pixel.
  • The tone mapping processing unit 103 calculates a compressed luminance value by substituting the luminance value of each pixel of the multispectral image into Expression (4).
  • The tone mapping processing unit 103 may adopt a function other than Expression (4) as the compression function "f". That is, an appropriate compression function "f" may be selected according to the purpose of the tone mapping.
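  • The tone mapping of step S3004 can be sketched end to end as below. Two substitutions are assumed: a plain box filter stands in for the edge-preserving smoother that estimates "B(I)", and a logarithmic curve stands in for the compression function of Expression (4), which is not reproduced in this text; the structure otherwise follows the decompose/compress/recombine steps described above.

```python
import numpy as np

def tone_map(lum, ksize=3, alpha=0.5):
    """Split positive luminance into a global (base) and local (detail)
    component, range-compress only the base, then recombine them."""
    # Global feature B(I): smoothed luminance (illumination-light estimate).
    pad = ksize // 2
    padded = np.pad(lum, pad, mode='edge')
    base = np.zeros_like(lum, dtype=float)
    for dy in range(ksize):
        for dx in range(ksize):
            base += padded[dy:dy + lum.shape[0], dx:dx + lum.shape[1]]
    base /= ksize * ksize
    # Local feature T(I): per-pixel ratio I / B(I), as in the ratio form above.
    detail = lum / np.maximum(base, 1e-12)
    # Compress only the base with a log curve (stand-in for Expression (4)),
    # normalised so the maximum input luminance maps to 1.
    compressed = np.log1p(alpha * base) / np.log1p(alpha * lum.max())
    # Recombine: compressed base times preserved detail.
    return compressed * detail
```

Because the detail ratio is left untouched, local contrast (the "reflectance" component) survives while the overall dynamic range shrinks.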
  • The image processing apparatus 100 converts the resolution of the range-compressed luminance information "R(I_M)" of the multispectral image produced by the tone mapping processing unit 103 (step S3005). Specifically, the enlargement processing unit 104 upsamples the luminance information "R(I_M)" to the same resolution as the panchromatic image "P". The specific upsampling method may be the same as that of the enlargement processing unit 101. Hereinafter, the luminance information after upsampling is expressed as "U∘R(I_M)".
  • Next, the image processing apparatus 100 generates local feature information (a feature amount) in a high-resolution image by using the luminance information "I_M" extracted by the luminance extraction unit 102 and the panchromatic image "P" (step S3006). Note that the processing in step S3006 described below may be executed in parallel with the processing in steps S3004 and S3005 described above.
  • the local feature extraction unit 105 generates local feature information by executing the following processing, for example.
  • First, the local feature extraction unit 105 performs luminance matching (histogram matching) between the luminance histogram of the panchromatic image "P" and that of the luminance information "I_M" of the multispectral image. The local feature extraction unit 105 thereby generates a panchromatic image "P~" (the "second gradation-converted image") in which the pixel value (luminance value) of each pixel has been converted in accordance with the luminance range of "I_M".
  • Since this luminance matching uses the luminance information "I_M" before tone mapping, whose correlation with the panchromatic luminance is preserved, the local feature extraction unit 105 can execute the luminance matching with high accuracy.
  • Next, the local feature extraction unit 105 calculates the local feature information (local feature amount) "ρ" included in the panchromatic image "P~".
  • The local feature information "ρ" may represent, for example, the degree of local luminance change in the panchromatic image "P~" (for example, the degree of luminance change at a target pixel).
  • For example, when the result of smoothing the panchromatic image "P~" is expressed as "S(P~)", the luminance ratio between the luminance of a pixel in "P~" and the luminance of the corresponding pixel in "S(P~)" may be used as the local feature information "ρ".
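  • Step S3006 can be sketched as follows: a quantile-based histogram matching produces "P~", and the ratio of "P~" to a smoothed "S(P~)" gives "ρ". The function names are hypothetical, and a plain box filter again stands in for whichever smoother "S" an implementation would actually use:

```python
import numpy as np

def histogram_match(pan, ref_lum):
    """Remap panchromatic pixel values so their cumulative distribution
    matches that of the multispectral luminance (luminance matching)."""
    pan_flat = pan.ravel()
    order = np.argsort(pan_flat)                 # pixel ranks in P
    quantiles = np.linspace(0, 1, pan_flat.size)
    ref_sorted = np.sort(ref_lum.ravel())
    ref_q = np.linspace(0, 1, ref_sorted.size)
    matched = np.empty_like(pan_flat, dtype=float)
    # Each rank in P receives the value at the same quantile of I_M.
    matched[order] = np.interp(quantiles, ref_q, ref_sorted)
    return matched.reshape(pan.shape)

def local_feature(pan_tilde, ksize=3):
    """rho: ratio of each pixel of P~ to its smoothed version S(P~)."""
    pad = ksize // 2
    padded = np.pad(pan_tilde, pad, mode='edge')
    smooth = np.zeros_like(pan_tilde, dtype=float)
    for dy in range(ksize):
        for dx in range(ksize):
            smooth += padded[dy:dy + pan_tilde.shape[0],
                             dx:dx + pan_tilde.shape[1]]
    smooth /= ksize * ksize
    return pan_tilde / np.maximum(smooth, 1e-12)
```

In flat regions "ρ" is close to 1; at edges and fine detail it deviates from 1, which is exactly the high-resolution information the multispectral image lacks.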
  • FIG. 2 is a diagram schematically showing an image “I” having a resolution equal to that of the panchromatic image “P”.
  • the image “I” may be, for example, an image obtained by up-sampling the luminance information “I M ” of the multispectral image to the resolution of the panchromatic image.
  • an image “I” having a resolution equal to that of the panchromatic image “P” is obtained by replacing each pixel of luminance information “I M ” of the low-resolution multispectral image with “C L (I) ⁇ C P (I)”. It is considered that the image is subdivided into "".
• "C P (I)" is the pixel spacing ratio between the image "I" and the panchromatic image "P" in the pixel direction (horizontal direction). "C P (I)" is calculated, for example, as "(total number of pixels in the pixel direction of the panchromatic image "P") / (total number of pixels in the pixel direction of the image "I" before being subdivided)".
• "C L (I)" is the pixel spacing ratio between the image "I" and the panchromatic image "P" in the line direction (vertical direction). "C L (I)" is calculated, for example, as "(total number of pixels in the line direction of the panchromatic image "P") / (total number of pixels in the line direction of the image "I" before being subdivided)".
• In the following, the pixel value (luminance value) of each pixel of the subdivided image "I" is expressed as "y mn ij (I)", where "(m, n)" identifies a pixel of the image before subdivision and "(i, j)" identifies a position within that subdivided pixel.
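The subdivision described above (each low-resolution pixel replicated C L (I) × C P (I) times) amounts to nearest-neighbour up-sampling. A minimal sketch, assuming the spacing ratios are integers:

```python
import numpy as np

def upsample_nearest(im, total_lines, total_pixels):
    """Subdivide each pixel of a low-resolution image into
    C_L x C_P copies so the result matches the panchromatic grid.
    Integer spacing ratios are assumed for this sketch."""
    c_l = total_lines // im.shape[0]    # line-direction ratio C_L(I)
    c_p = total_pixels // im.shape[1]   # pixel-direction ratio C_P(I)
    return np.repeat(np.repeat(im, c_l, axis=0), c_p, axis=1)
```

Each value of the original image then appears as a C L (I) × C P (I) block in the result, which is exactly the "y mn ij (I)" indexing used in the text: (m, n) selects the block, (i, j) a position inside it.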
• Next, based on the result of step S3005 and the result of step S3006, the image processing apparatus 100 (in particular, the output luminance calculation unit 106) generates high-resolution luminance information in which the local information is reflected (output luminance adjustment information) (step S3007).
• Specifically, the output luminance calculation unit 106 may use, as the output luminance adjustment information, the result of multiplying the high-resolution luminance information "y mn ij (U o R (I M ))" obtained by the enlargement processing unit 104 by the local feature amount "ψ mn ij " obtained by the local feature extraction unit 105. That is, the output luminance adjustment information is expressed in the form "ψ mn ij * y mn ij (U o R (I M ))".
  • the global feature of “U ⁇ R (I M )” is subjected to range compression by the tone mapping processing unit 103 with respect to the luminance “I M ” of the multispectral image “M”. Is the up-sampled luminance information.
• Accordingly, luminance information (output luminance adjustment information) is obtained that includes both the local features derived from the panchromatic image "P" and the global features, with a compressed luminance range, derived from the multispectral image "M".
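Putting the pieces together, the output luminance adjustment information "ψ mn ij * y mn ij (U o R (I M ))" might be computed as below. This is a hedged sketch: the log-curve `tone_map` is a stand-in for R(.) (the embodiment does not fix a particular tone-mapping operator), and nearest-neighbour repetition stands in for the up-sampling U(.).

```python
import numpy as np

def tone_map(lum, out_max=255.0, lum_max=1023.0):
    """R(.): a log-curve stand-in for range compression over a fixed
    input range; the embodiment does not prescribe this operator."""
    return out_max * np.log1p(lum) / np.log1p(lum_max)

def output_luminance(psi, lum_lowres, hi_shape):
    """psi_mn_ij * y_mn_ij(U o R(I_M)): tone-map the low-resolution
    luminance first, up-sample the result, then modulate it by the
    high-resolution local feature psi."""
    r = tone_map(lum_lowres)                  # R(I_M), still low resolution
    c_l = hi_shape[0] // r.shape[0]           # line-direction spacing ratio
    c_p = hi_shape[1] // r.shape[1]           # pixel-direction spacing ratio
    u = np.repeat(np.repeat(r, c_l, axis=0), c_p, axis=1)   # U o R(I_M)
    return psi * u
```

Note that the tone mapping is applied only to the small low-resolution array, which is the source of the cost saving discussed later in the text.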
• Next, the image processing apparatus 100 (in particular, the image composition unit 107) generates a pan-sharpened image based on the output luminance adjustment information and the up-sampled multispectral image ("U(M)") generated by the enlargement processing unit 101 (step S3008).
  • the image composition unit 107 performs color space conversion on the multispectral image “U (M)” upsampled by the enlargement processing unit 101, and separates luminance information and color information.
  • the conversion of the color space may be the same as the process in step S3003, for example.
• Next, the image composition unit 107 replaces the luminance information separated from the image "U(M)" with the output luminance adjustment information "ψ mn ij * y mn ij (U o R (I M ))".
  • the image combining unit 107 combines the replaced luminance information and the color information separated from U (M), and returns (converts) the combined image to the original color space again.
  • the image composition unit 107 obtains a pan-sharpened image on which tone mapping processing has been performed.
  • the image composition unit 107 provides the generated pan-sharpened image to the output device 300.
  • the output device 300 outputs the pan-sharpened image supplied from the image processing device 100 (step S3009).
• Next, it will be explained that the luminance information (output luminance adjustment information) "ψ mn ij * y mn ij (U o R (I M ))" generated by the output luminance calculation unit 106 approximates, with high accuracy, the result of applying tone mapping to a pan-sharpened image generated using a well-known method.
• In the following, the luminance information of a pan-sharpened image generated using a well-known method is labeled "I Q ", and the pixel value (luminance value) of each pixel of the result of applying tone mapping to "I Q " is labeled "y mn ij (R (I Q ))".
  • Equation (7) represents the relationship between the luminance information “I Q ” of the pan-sharpened image generated using a known method and the luminance information “I M ” of the multispectral image.
  • “U (I M )” represents the result of up-sampling the luminance information “I M ” of the multispectral image.
• "y mn ij (U (I M ))" represents the pixel value (luminance value) of a specific pixel in the result of up-sampling the luminance information "I M " of the multispectral image.
  • “ ⁇ mn ij ” represents a local feature amount.
• In the present embodiment, the global feature component "B (I Q )" of the luminance information "I Q " of the pan-sharpened image can be approximated by the global feature component "B o U (I M )" of the result of up-sampling the luminance information "I M " of the multispectral image (Equation (9)). The reason will be described below.
• The luminance information "I Q " is calculated by multiplying the luminance value of "U (I M )" by the local luminance change information "ψ" ("ψ mn ij "), which represents luminance changes at a scale (range) smaller than one pixel of the original multispectral image M.
  • the local luminance change information “ ⁇ ” is a local feature amount obtained from the high-resolution panchromatic image P.
  • the local luminance change information “ ⁇ ” represents, for example, a change in luminance value in a range smaller than the range represented by one pixel constituting the (original) multispectral image M before upsampling.
  • the effect (influence) that the “ ⁇ ” representing the luminance change in a narrow range has on the global characteristics of the pan-sharpened image is sufficiently small (for example, it may be ignored).
• As a specific example, assume that the luminance component obtained by applying a smoothing filter is used as the global feature.
• In this case, the effect of the local luminance change information "ψ" is equalized by the smoothing, so that its effect on the global feature is reduced.
• Consequently, the global feature component "B (I Q )" obtained from the luminance information of the pan-sharpened image can be approximated by the global feature component "B o U (I M )" of the image "U (I M )" obtained by up-sampling the multispectral image.
• In addition, the luminance range of an image does not change significantly before and after up-sampling. That is, whether the range compression by tone mapping is performed before or after the multispectral image "M" is up-sampled, there is often no significant difference in the result. From the above, when the result of up-sampling the tone-mapped luminance information "I M " of the multispectral image "M" is expressed as "U o R (I M )", the approximation shown in Equation (11) is considered to hold in this embodiment.
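For a pointwise tone curve and nearest-neighbour up-sampling, the two orders in the approximation of Equation (11) in fact agree exactly, which makes the approximation plausible; real tone-mapping operators that depend on local statistics would make the equality only approximate. A small numeric check under those stated assumptions:

```python
import numpy as np

def tone_map(lum, out_max=255.0, lum_max=1023.0):
    # A pointwise log-curve stand-in for R(.) over a fixed input range.
    return out_max * np.log1p(lum) / np.log1p(lum_max)

def upsample(im, f=4):
    # U(.): nearest-neighbour subdivision by a factor f in both directions.
    return np.repeat(np.repeat(im, f, axis=0), f, axis=1)

rng = np.random.default_rng(1)
I_M = rng.uniform(0.0, 1023.0, (16, 16))
a = upsample(tone_map(I_M))   # U o R(I_M): tone-map first, then up-sample
b = tone_map(upsample(I_M))   # R o U(I_M): up-sample first, then tone-map
assert np.array_equal(a, b)   # identical for a pointwise curve
```

The economic difference between the two orders is the array size the tone mapping touches: 16×16 values in the first case versus 64×64 in the second, which illustrates the cost argument made in the text.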
• As described above, the image processing apparatus 100 of the present embodiment uses the result of performing tone mapping on the luminance information "I M " of the low-resolution multispectral image "M" ("R (I M )") to approximate the luminance information "y mn ij (R (I Q ))" obtained by performing tone mapping on a high-resolution pan-sharpened image.
• The image processing apparatus 100 of the present embodiment can thus avoid executing the tone mapping process on a high-resolution image (or its luminance information), which reduces the calculation cost.
• Moreover, the image processing apparatus 100 extracts the local feature information (local feature amount) using the multispectral image "M" that has not been tone-mapped and the panchromatic image "P" (step S3006). Since the multispectral image "M" is not tone-mapped, the luminance distributions of the multispectral image "M" and the panchromatic image "P" do not differ significantly. That is, the image processing apparatus 100 can execute appropriate luminance matching using the luminance information of the panchromatic image "P" and the luminance information "I M " of the multispectral image "M".
• As a result, the image processing apparatus 100 generates a pan-sharpened image having an appropriate gradation, in which the per-pixel gradation change (contrast) derived from the high-gradation panchromatic image P is appropriately reflected.
  • FIG. 7 is a diagram schematically illustrating an example of a cumulative histogram when the luminances of the multispectral image and the panchromatic image are normalized.
• The solid line labeled A in FIG. 7 represents the normalized cumulative frequency of luminance for the multispectral image.
• The dotted line labeled B in FIG. 7 represents the normalized cumulative frequency of luminance for the tone-mapped multispectral image.
• The one-dot chain line labeled C in FIG. 7 represents the normalized cumulative frequency of luminance for the panchromatic image.
• G1 in FIG. 7 schematically represents the luminance ratio between specific luminances when luminance matching is performed on the panchromatic image with respect to the multispectral image.
• G2 in FIG. 7 schematically represents the luminance ratio between specific luminances when luminance matching is performed on the panchromatic image with respect to the tone-mapped multispectral image.
• By the tone mapping process, the luminance of the multispectral image is compressed to a predetermined luminance range centered on a specific average luminance.
  • the cumulative frequency of the luminance (normalized) of the tone-mapped multispectral image changes more rapidly than the cumulative frequency of the original multispectral image (normalized).
  • the luminance ratio represented by G2 is smaller than the luminance ratio represented by G1. This is considered to indicate that the local contrast in the image (for example, the degree of change in luminance at a certain pixel) is small.
  • the image processing apparatus 100 performs luminance matching using a multispectral image that has not been tone-mapped and a panchromatic image. As a result, the image processing apparatus 100 can generate a pan-sharpened image in which local brightness contrast derived from the panchromatic image is appropriately reflected.
  • the image processing apparatus 100 reduces the amount of data handled in the tone mapping process by executing the tone mapping process on the low-resolution multispectral image “M”. As a result, the image processing apparatus 100 creates a pan-sharpened image having an appropriate gradation subjected to the tone mapping process at high speed.
• As described above, the image processing apparatus 100 executes the pan-sharpening process using local features, which are extracted based on the result of luminance matching between the panchromatic image and the multispectral image, and global features, which are derived from the multispectral image.
  • the image processing apparatus 100 uses the original (non-tone mapped) multispectral image in luminance matching for obtaining local features.
  • the image processing apparatus 100 uses luminance information of a low-resolution multispectral image when compressing the luminance range of the global feature (global luminance change) by tone mapping. As a result, the image processing apparatus 100 can reduce the calculation cost (computation amount) as compared with the case where the tone mapping process is executed after the high-resolution pan-sharpened image is created.
• Thus, the image processing apparatus 100 can efficiently generate a high-resolution image (for example, a pan-sharpened image) from a plurality of images (for example, a panchromatic image and a multispectral image).
  • the image processing apparatus 100 does not necessarily need to use the luminance ratio between the pixel value of a specific pixel and the pixel value of the smoothed image as exemplified in the above formula (6) as the local feature amount.
• The image processing apparatus 100 may select any process capable of representing the luminance value of a pan-sharpened image in the form of a product of local features derived from a high-resolution panchromatic image and global features derived from a low-resolution multispectral image.
• For example, pan-sharpening processes are known in which the multispectral image is appropriately color-separated (decomposed into a hue component and a luminance component), the luminance component is replaced with the luminance component of the panchromatic image, and then color synthesis (combining the hue component and the luminance component) is performed.
  • the image processing apparatus 100 may employ such pan-sharpening processing.
  • the image processing apparatus 100 may use the brightness of the up-sampled multispectral image instead of the brightness of the smoothed image as the denominator of Expression (6).
• In this case, when the luminance of the up-sampled multispectral image is multiplied by the local feature "ψ", the luminance information of the (luminance-matched) panchromatic image is used as-is as the luminance information of the pan-sharpened image.
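This remark can be checked directly: if the formula for "ψ" uses the up-sampled multispectral luminance U(I M ) as its denominator, multiplying "ψ" back by U(I M ) simply returns the luminance-matched panchromatic luminance "P~". A small NumPy check with illustrative values only:

```python
import numpy as np

def upsample(im, f=2):
    # Nearest-neighbour up-sampling standing in for U(.).
    return np.repeat(np.repeat(im, f, axis=0), f, axis=1)

rng = np.random.default_rng(2)
I_M = rng.uniform(1.0, 255.0, (4, 4))     # low-resolution luminance
P_t = rng.uniform(1.0, 255.0, (8, 8))     # luminance-matched panchromatic "P~"
U_IM = upsample(I_M)
psi = P_t / U_IM                          # "psi" with U(I_M) as denominator
recovered = psi * U_IM                    # luminance entering the composition
assert np.allclose(recovered, P_t)        # panchromatic luminance used as-is
```

The cancellation holds for any positive-valued images, so the choice of denominator decides whether the composed luminance follows the smoothed structure or the panchromatic image directly.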
  • the image processing apparatus 100 may use a color space (for example, HCS method) in which the value of each band (spectral band) in the multispectral image is proportional to the luminance.
• The image processing apparatus 100 may convert the color space of the multispectral image provided from the input apparatus 200 into a color space as described above. In this case, the color separation and composition processing in the image composition unit 107 can be simplified.
  • FIG. 4 is a block diagram illustrating a functional configuration of an image processing apparatus 400 according to the second embodiment of the present invention.
• The image processing apparatus 400 includes a gradation adjustment image generation unit (gradation adjustment image generation means) 401 and a composite image generation unit (composite image generation means) 402.
  • the gradation adjustment image generation unit 401 and the composite image generation unit 402 may be communicably connected using an arbitrary communication method.
  • components constituting the image processing apparatus 400 will be described.
  • the gradation adjustment image generation unit 401 generates a first gradation conversion image by converting the gradation of the first image having one or more color information.
  • the first image may be, for example, a multispectral image.
  • the gradation adjustment image generation unit 401 may convert the luminance gradation of each pixel included in the first image (for example, a multispectral image).
  • the gradation adjusted image generation unit 401 may perform range compression on the gradation of the first image, for example, by executing tone mapping processing on the first image.
• The gradation adjustment image generation unit 401 may convert the gradation of the first image and may further convert the resolution of the first image whose gradation has been converted.
• The gradation adjustment image generation unit 401 as described above may correspond to, for example, the luminance extraction unit 102, the tone mapping processing unit 103, the enlargement processing unit 104, and the like in the first embodiment, but is not limited to these.
• The composite image generation unit 402 adjusts the first image using feature information, which is generated based on the first image and a second image having a higher resolution than the first image and relates to the gradation of the second image, and the gradation of the first gradation-converted image. Based on the result, the composite image generation unit 402 generates a composite image that has one or more pieces of color information and a resolution higher than that of the first image.
  • the second image may be, for example, a panchromatic image having a higher resolution than the multispectral image.
  • the feature information may be information representing a local feature in the second image generated based on the first image and the second image, for example.
  • the feature information may correspond to, for example, local feature information (local feature amount) in the first embodiment.
  • the composite image may be the pan-sharpened image in the first embodiment.
• The composite image generation unit 402 may extract, as the feature information, information representing local features in a second gradation-converted image, which is obtained by converting the gradation (for example, the luminance gradation) of the second image in accordance with the gradation of the first image.
  • the feature information may be information having the same resolution as that of the second image, for example.
  • the composite image generation unit 402 may convert the resolution of the first image, for example.
• The composite image generation unit 402 may generate the composite image by adjusting, for example, the gradation of the resolution-converted first image using the feature information and the first gradation-converted image.
• The composite image generation unit 402 may correspond to, for example, the local feature extraction unit 105, the output luminance calculation unit 106, and the image synthesis unit 107 in the first embodiment, but is not limited thereto.
  • the image processing apparatus 400 configured as described above generates, for example, a first tone conversion image by executing tone conversion processing such as tone mapping processing on the first image.
• Since the first image may be a low-resolution multispectral image, the calculation cost required for this gradation conversion process is relatively low.
• Further, the image processing apparatus 400 adjusts the first image by using the feature information relating to the gradation of the second image and the gradation of the first gradation-converted image, thereby generating the composite image.
  • the second image may be a panchromatic image, for example.
• The image processing apparatus 400 can generate a composite image having an appropriate gradation based on the gradation (for example, the luminance gradation) of the first image (for example, a multispectral image) and the second image (for example, a panchromatic image). This is because the image processing apparatus 400 generates the composite image based on, for example, feature information obtained from the high-resolution second image and the original first image before tone conversion such as tone mapping is performed.
  • the image processing apparatus 400 adjusts the first image based on the feature information and the gradation of the first gradation conversion image.
• Since the image processing apparatus 400 does not need to execute gradation conversion processing directly targeting a high-resolution image, the calculation cost required for generating the composite image can be reduced. That is, the image processing apparatus 400 can efficiently generate the composite image.
• That is, the image processing apparatus 400 can efficiently generate a high-resolution image having an appropriate gradation (for example, the above composite image) using a plurality of images with different resolutions (for example, the first image and the second image).
  • the image processing apparatus 400 may further include a feature information generation unit (feature information generation means) 403 as illustrated in FIG.
  • the gradation adjustment image generation unit 401 and the composite image generation unit 402 may be the same as those in the second embodiment.
• The feature information generation unit 403 extracts the gradations of the first and second images and converts the gradation of the second image in accordance with the gradation range of the first image, thereby generating a second gradation-converted image.
  • the feature information generation unit 403 extracts the degree of local gradation change in the generated second gradation conversion image as the feature information.
  • the feature information generation unit 403 may correspond to, for example, the local feature extraction unit 105 in the first embodiment.
  • the composite image generation unit 402 may generate the composite image using the feature information generated by the feature information generation unit 403.
• The feature information generation unit 403 in the present modification can generate an appropriate second gradation-converted image by converting the gradation of the second image in accordance with the gradation range of the first image, for example, by executing a luminance matching process or the like. This is because the first image has not been subjected to gradation conversion processing such as tone mapping, and therefore the correlation between the gradation of the first image and the gradation of the second image is maintained.
  • the feature information generation unit 403 in the present modification extracts the degree of local gradation change in the second gradation conversion image as the feature information.
• By using the feature information representing local features of the high-resolution second gradation-converted image together with the first gradation-converted image whose gradation has been converted (compressed) by tone mapping or the like, the composite image generation unit 402 can generate high-resolution luminance information (for example, the output luminance adjustment information). Then, the composite image generation unit 402 generates the composite image based on the high-resolution luminance information and the first image.
  • the image processing apparatus 400 according to the present modification can generate a composite image having an appropriate gradation based on the gradation (for example, the luminance gradation) of the multispectral image and the panchromatic image.
• Since the image processing apparatus 400 in the present modification includes the same configuration as that of the second embodiment, the same effects as those of the image processing apparatus 400 in the second embodiment can also be obtained.
  • image processing apparatuses (100, 400) described in the above embodiments are collectively referred to simply as “image processing apparatus”.
  • Each component of the image processing apparatus is simply referred to as “component of the image processing apparatus”.
• The image processing apparatus described in each of the above embodiments may be configured by one or a plurality of dedicated hardware devices.
• In this case, each component shown in the above figures may be realized using partially or fully integrated hardware (an integrated circuit on which processing logic is mounted, a storage device, and the like).
• When realized by dedicated hardware, the components of the image processing apparatus may be implemented by, for example, circuitry capable of providing each function (for example, an integrated circuit such as an SoC (System on a Chip)).
• The data held by the components of the image processing apparatus may be stored in, for example, a RAM (Random Access Memory) area or flash memory area integrated in the SoC, or a storage device (such as a semiconductor storage device) connected to the SoC.
  • a well-known communication bus may be employed as a communication line that connects each component of the image processing apparatus. Further, the communication line connecting each component is not limited to bus connection, and each component may be connected by peer-to-peer.
  • each hardware device may be communicably connected by any communication means (wired, wireless, or a combination thereof).
  • the above-described image processing apparatus or its constituent elements may be configured by general-purpose hardware as illustrated in FIG. 6 and various software programs (computer programs) executed by the hardware.
  • the image processing apparatus may be configured by an arbitrary number of general-purpose hardware devices and software programs. That is, an individual hardware device may be assigned to each component constituting the image processing apparatus, and a plurality of components may be realized using one hardware device.
  • the arithmetic device 601 in FIG. 6 is an arithmetic processing device such as a general-purpose CPU (Central Processing Unit) or a microprocessor.
  • the arithmetic device 601 may read various software programs stored in a nonvolatile storage device 603, which will be described later, into the storage device 602, and execute processing according to the software programs.
  • the components of the image processing apparatus in each of the above embodiments may be realized as a software program executed by the arithmetic device 601.
  • the storage device 602 is a memory device such as a RAM that can be referred to from the arithmetic device 601, and stores software programs, various data, and the like. Note that the storage device 602 may be a volatile memory device.
  • the non-volatile storage device 603 is a non-volatile storage device such as a magnetic disk drive or a semiconductor storage device using flash memory.
  • the nonvolatile storage device 603 can store various software programs, data, and the like.
  • the network interface 606 is an interface device connected to a communication network.
• As the network interface 606, for example, a wired or wireless LAN (Local Area Network) connection interface device, a SAN (Storage Area Network) connection interface device, or the like may be employed.
  • the drive device 604 is, for example, a device that processes reading and writing of data with respect to a recording medium 605 described later.
  • the recording medium 605 is an arbitrary recording medium capable of recording data, such as an optical disk, a magneto-optical disk, and a semiconductor flash memory.
  • the input / output interface 607 is a device that controls input / output with an external device.
• The image processing apparatus (or its constituent elements) according to the present invention described using the above embodiments as examples may be realized by supplying, to the hardware apparatus illustrated in FIG. 6, for example, a software program capable of realizing the functions described in the above embodiments. More specifically, for example, the present invention may be realized by causing the arithmetic device 601 to execute a software program supplied to such an apparatus. In this case, an operating system running on the hardware device, database management software, network software, middleware such as a virtual environment platform, or the like may execute part of each process.
  • the software program may be recorded on the recording medium 605.
  • the software program may be configured to be stored in the nonvolatile storage device 603 through the drive device 604 as appropriate at the shipping stage or the operation stage of the image processing apparatus.
• As a method of supplying the various software programs to the hardware, a method of installing them in the apparatus using an appropriate jig in the manufacturing stage before shipment or in the maintenance stage after shipment may be adopted.
• As a method of supplying the various software programs, a currently general procedure may be adopted, such as downloading them from the outside via a communication line such as the Internet.
  • the present invention can be understood to be constituted by a code constituting the software program or a computer-readable recording medium on which the code is recorded.
  • the recording medium is not limited to a medium independent of the hardware device, but includes a recording medium in which a software program transmitted via a LAN or the Internet is downloaded and stored or temporarily stored.
  • the above-described image processing apparatus may be configured by a virtual environment obtained by virtualizing the hardware device illustrated in FIG. 6 and various software programs (computer programs) executed in the virtual environment.
  • the components of the hardware device illustrated in FIG. 6 are provided as virtual devices in the virtual environment.
  • the present invention can be realized with the same configuration as when the hardware device illustrated in FIG. 6 is configured as a physical device.

Abstract

The purpose of the present invention is to efficiently generate a pansharpened image which has an appropriate tonal range and whereupon tone mapping processing has been carried out. Provided is an image processing device, comprising a tonal range-adjusted image generating unit which generates a first tonal range-converted image by converting the tonal range of a first image having one or more instances of color information. The image processing device further comprises a composite image generating means which generates a composite image which is an image which has one or more instances of color information and a resolution which is at least higher than the first image, by adjusting the first image using characteristic information generated on the basis of the first image and a second image which has a higher resolution than the first image and relating to the tonal range of the second image, and the tonal range of the first tonal range-converted image.

Description

Image processing apparatus, image processing method, and recording medium storing image processing program
The present invention relates to a technique for generating high-resolution image data using a plurality of image data having different resolutions.
A technique for obtaining a high-resolution multispectral image from an image having a high resolution and an image having a low resolution is known. For example, as a technique capable of improving the visibility of high-resolution images (satellite images and the like), a process (pan-sharpening process) is known that creates a high-resolution multispectral image using a panchromatic image, which is a high-resolution image, and a multispectral image, which is a low-resolution image.
In general, when high-gradation image data (for example, satellite images) is output to a low-gradation display system (for example, a graphics board, display, or printer), the display range is compressed. As a result, gradation information may be lost in the displayed result. In response, a technique is known that reduces this information loss in the display result by applying tone mapping processing to the high-gradation image data in advance to perform range conversion (for example, range compression).
Techniques related to the above-described pan-sharpening process or tone mapping process are disclosed in the following patent documents.
Patent Document 1 (International Publication No. WO 2015/037189) discloses a technique related to noise removal in an image generated by pan-sharpening processing. The technique disclosed in Patent Document 1 performs multi-resolution decomposition of a high-resolution image down to a resolution corresponding to a low-resolution image, and reduces noise by correcting the decomposed image using the low-resolution image.
Patent Document 2 (Japanese Patent Application Laid-Open No. 2014-174817) discloses a technique for determining, when tone mapping a plurality of high-gradation images that partially overlap one another, the coefficients used for luminance correction of each image based on the distribution of luminance values of each image.
 特許文献3(特開2013-109759号公報)は、辞書学習を用いたパンシャープン処理に関する技術を開示する。特許文献3に開示された技術は、パンクロマチック画像とマルチスペクトル画像とから特徴量ベクトルを抽出し、当該特徴量ベクトルを、欠落値を有さないベクトルと、欠落値を有するベクトルとに分解する。特許文献3に開示された技術は、欠落値を有さないベクトルを用いて学習した辞書に基づいて、欠落値(高解像度画像における色情報)を予測する。 Patent Document 3 (Japanese Patent Laid-Open No. 2013-109759) discloses a technique related to pan-sharpening processing using dictionary learning. The technique disclosed in Patent Document 3 extracts feature vectors from a panchromatic image and a multispectral image, and decomposes each feature vector into a vector having no missing values and a vector having missing values. The technique disclosed in Patent Document 3 predicts the missing values (color information in the high-resolution image) based on a dictionary learned using the vectors having no missing values.
 特許文献4(特開2011-100426号公報)は、衛星から撮像したパンクロマチック画像と、当該パンクロマチック画像に対して重ね合わせ可能に幾何変換したカラー(マルチスペクトル)画像とを用いて、パンシャープン処理を実行する技術を開示する。 Patent Document 4 (Japanese Patent Laid-Open No. 2011-100426) discloses a technique for executing pan-sharpening processing using a panchromatic image captured from a satellite and a color (multispectral) image geometrically transformed so that it can be superposed on the panchromatic image.
 特許文献5(特開2015-033582号公報)は、X線コンピュータ断層撮像装置において、光子計数検出が検出した疎スペクトルデータと、エネルギー積分器が検出した密パンクロマティックデータとを用いてパンシャープン処理を実行する技術を開示する。 Patent Document 5 (Japanese Patent Laid-Open No. 2015-033582) discloses a technique for executing pan-sharpening processing in an X-ray computed tomography apparatus, using sparse spectral data detected by photon-counting detection and dense panchromatic data detected by an energy integrator.
 特許文献6(米国特許第8737733号明細書)は、パンシャープン画像の色味を改善するため、パンクロマチック画像の輝度値と、マルチスペクトル画像の輝度値とが近い分布になるように、統計モデルを用いて変換する技術を開示する。 Patent Document 6 (U.S. Patent No. 8,737,733) discloses a technique that, in order to improve the color of a pan-sharpened image, uses a statistical model to convert the luminance values of a panchromatic image and a multispectral image so that their distributions become close to each other.
 また、非特許文献1は、高階調画像をベースレイヤとディテールレイヤに分解し、描写されたシーンに合わせてレンジ圧縮を行うことで、情報の損失を軽減する技術を開示する。 Also, Non-Patent Document 1 discloses a technique for reducing information loss by decomposing a high gradation image into a base layer and a detail layer and performing range compression in accordance with the depicted scene.
国際公開第2015/037189号International Publication No. 2015/037189 特開2014-174817号公報JP 2014-174817 A 特開2013-109759号公報JP 2013-109759 A 特開2011-100426号公報JP 2011-100426 A 特開2015-033582号公報JP 2015-033582 A 米国特許第8737733号明細書U.S. Pat. No. 8,737,733
 画像データにおける局所的な情報損失を低減するように、レンジ圧縮率を局所的に制御するトーンマッピング処理は、計算コストが高い。即ち、このような処理を高解像度であるパンクロマチック画像やパンシャープン画像に適用すると、処理負荷が高いという問題がある。また、トーンマッピング処理とパンシャープン処理とを実行する場合、元のマルチスペクトル画像あるいはパンクロマチック画像の階調(例えば輝度の階調)に基づいた、適切な階調を有する高解像度画像を生成することが求められる。上記各特許文献は、いずれもこのような課題を直接的に解決可能な技術ではない。 Tone mapping processing that locally controls the range compression rate so as to reduce local information loss in image data has a high calculation cost. That is, when such processing is applied to a high-resolution panchromatic image or pan-sharpened image, there is a problem that the processing load is high. In addition, when tone mapping processing and pan-sharpening processing are both executed, it is required to generate a high-resolution image having an appropriate gradation based on the gradation (for example, the luminance gradation) of the original multispectral image or panchromatic image. None of the above patent documents discloses a technique that can directly solve these problems.
 本発明は、上記のような事情を鑑みてなされたものである。即ち、本発明は、解像度が異なる複数の画像を用いて、適切な階調を有する高解像度の画像を効率的に生成可能な技術を提供することを、主たる目的の一つとする。 The present invention has been made in view of the above circumstances. That is, the main object of the present invention is to provide a technology capable of efficiently generating a high-resolution image having an appropriate gradation using a plurality of images having different resolutions.
 上記の目的を達成すべく、本発明の一態様に係る画像処理装置は、以下の構成を備える。即ち、本発明の一態様に係る画像処理装置は、1以上の色情報を有する第1の画像の階調を変換することにより第1の階調変換画像を生成する階調調整画像生成部と、上記第1の画像と、上記第1の画像より解像度が高い第2の画像と、に基づいて生成された、上記第2の画像の階調に関する特徴情報と、上記第1の階調変換画像の階調とを用いて、上記第1の画像を調整することにより、1以上の色情報を有し、少なくとも上記第1の画像よりも解像度が高い画像である合成画像を生成する合成画像生成部と、を備える。 In order to achieve the above object, an image processing apparatus according to an aspect of the present invention has the following configuration. That is, an image processing apparatus according to an aspect of the present invention includes: a gradation-adjusted image generation unit that generates a first gradation-converted image by converting the gradation of a first image having one or more pieces of color information; and a composite image generation unit that generates a composite image, which has one or more pieces of color information and has a resolution at least higher than that of the first image, by adjusting the first image using feature information relating to the gradation of a second image, generated based on the first image and the second image having a higher resolution than the first image, and the gradation of the first gradation-converted image.
 また、本発明の一態様に係る画像処理方法は、1以上の色情報を有する第1の画像に含まれる画素の階調を変換することにより第1の階調変換画像を生成し、上記第1の画像と、上記第1の画像より解像度が高い第2の画像と、に基づいて生成された、上記第2の画像の階調に関する特徴情報と、上記第1の階調変換画像及び上記第1の画像と、を用いて、1以上の色情報を有し、少なくとも上記第1の画像よりも解像度が高い画像である合成画像を生成する。 An image processing method according to an aspect of the present invention generates a first gradation-converted image by converting the gradation of pixels included in a first image having one or more pieces of color information, and generates a composite image, which has one or more pieces of color information and has a resolution at least higher than that of the first image, by using feature information relating to the gradation of a second image, generated based on the first image and the second image having a higher resolution than the first image, together with the first gradation-converted image and the first image.
 また、同目的は、上記構成を有する画像処理装置、あるいは、画像処理方法を、コンピュータによって実現するコンピュータ・プログラム、及び、そのコンピュータ・プログラムが格納されている、コンピュータ読み取り可能な記録媒体等によっても達成される。 The object is also achieved by a computer program that realizes, with a computer, the image processing apparatus or image processing method having the above configuration, and by a computer-readable recording medium in which the computer program is stored.
 本発明によれば、解像度が異なる複数の画像を用いて、適切な階調を有する高解像度の画像を効率的に生成することが可能である。 According to the present invention, it is possible to efficiently generate a high-resolution image having an appropriate gradation using a plurality of images having different resolutions.
図1は、本発明の第1の実施形態における画像処理装置の機能的な構成を例示するブロック図である。FIG. 1 is a block diagram illustrating a functional configuration of an image processing apparatus according to the first embodiment of the present invention. 図2は、パンクロマチック画像と同じ解像度を有する画像を模式的に表す説明図である。FIG. 2 is an explanatory diagram schematically showing an image having the same resolution as the panchromatic image. 図3は、本発明の第1の実施形態における画像処理装置の動作の具体例を示すフローチャートである。FIG. 3 is a flowchart showing a specific example of the operation of the image processing apparatus according to the first embodiment of the present invention. 図4は、本発明の第2の実施形態に係る画像処理装置の機能的な構成を例示するブロック図である。FIG. 4 is a block diagram illustrating a functional configuration of an image processing apparatus according to the second embodiment of the present invention. 図5は、本発明の第2の実施形態の変形例に係る画像処理装置の機能的な構成を例示するブロック図である。FIG. 5 is a block diagram illustrating a functional configuration of an image processing apparatus according to a modification of the second embodiment of the present invention. 図6は、本発明の各実施形態における画像処理装置を実現可能なハードウェア装置の構成を例示する図面である。FIG. 6 is a diagram illustrating a configuration of a hardware device capable of realizing the image processing apparatus according to each embodiment of the present invention. 図7は、トーンマッピング前後のマルチスペクトル画像の輝度、及び、パンクロマチック画像の輝度を正規化した場合の累積ヒストグラムの一例を、模式的に表す図である。FIG. 7 is a diagram schematically illustrating an example of a cumulative histogram when the luminance of the multispectral image before and after tone mapping and the luminance of the panchromatic image are normalized.
 本発明の実施形態に関する説明に先立って、本発明に関する技術的な背景等についてより詳細に説明する。以下においては、画像データの例として、衛星から撮像した衛星画像データを用いて説明するが、本発明はこれには限定されない。 Prior to the description of the embodiment of the present invention, the technical background related to the present invention will be described in more detail. In the following, description will be made using satellite image data taken from a satellite as an example of image data, but the present invention is not limited to this.
 人間が画像(画像データ)を視認した結果に基づいて、画像に描写された対象の認識や分類を行う技術分野(例えば画像判読等)がある。係る技術分野においては、描写内容が物理的に正しい画像よりも、表示された際の視認性が高い画像を生成することが求められる場合がある。 There is a technical field (for example, image interpretation) that recognizes and classifies an object depicted in an image based on the result of a human viewing an image (image data). In such a technical field, it may be required to generate an image with high visibility when displayed, rather than an image whose description is physically correct.
 例えば、画像における空間方向及び波長方向の視認性の向上(改善)を目的とした技術として、高解像度であるパンクロマチック画像と低解像度であるマルチスペクトル画像から高解像度マルチスペクトル画像を作成するパンシャープン処理がある。係るパンシャープン処理は、例えば、人工衛星において撮像された衛星画像の視認性を改善する処理として用いられる。 For example, as a technique aimed at improving the visibility of an image in the spatial and wavelength directions, there is pan-sharpening processing, which creates a high-resolution multispectral image from a high-resolution panchromatic image and a low-resolution multispectral image. Such pan-sharpening processing is used, for example, as processing for improving the visibility of a satellite image captured by an artificial satellite.
 ここで、パンクロマチック画像は、例えば、空間解像度が高いものの、単色(モノクロ)で構成された画像データである。また、マルチスペクトル画像は、空間解像度がパンクロマチック画像に比べて低いものの、複数のスペクトル情報(例えば、可視光領域における色情報)を含む画像データである。パンクロマチック画像に撮像された内容と、マルチスペクトル画像に撮像された内容とは、同一か略同一であってもよい。また、一方の画像に撮像された内容と、他方の画像に撮像された内容との対応関係を判定可能であれば、それぞれの画像に撮像される内容は同一でなくてもよい。なお、これらの画像を撮像(あるいは生成)する方法(例えば、マルチスペクトルセンサ、及び、パンクロマチックセンサを用いた撮像等)としては、周知技術を採用可能である。 Here, the panchromatic image is, for example, image data composed of a single color (monochrome) with a high spatial resolution. The multispectral image is image data including a plurality of pieces of spectral information (for example, color information in the visible light region), although the spatial resolution is lower than that of the panchromatic image. The content captured in the panchromatic image and the content captured in the multispectral image may be the same or substantially the same. In addition, as long as it is possible to determine the correspondence between the content captured in one image and the content captured in the other image, the content captured in each image may not be the same. As a method for capturing (or generating) these images (for example, imaging using a multispectral sensor and a panchromatic sensor), a well-known technique can be employed.
 一方、画像における階調について考慮すると、画像データは、表示系(例えば、グラフィックボード、ディスプレイ、プリンタ等)が出力可能な階調よりも高い階調を有する場合がある。例えば、一般的に、人工衛星において撮像された衛星画像は、表示系が出力可能な階調よりも高い階調を有することが多い。係る高階調データを低階調の表示系に出力する際、表示レンジが圧縮される(階調が潰される)ことから、情報の損失が発生する。係る高階調データの表示に関連して、階調方向のレンジを変換(圧縮)することで、情報の損失を軽減するトーンマッピング処理が知られている。 On the other hand, when considering the gradation in the image, the image data may have a gradation higher than the gradation that can be output by the display system (for example, graphic board, display, printer, etc.). For example, generally, satellite images captured by artificial satellites often have gradations higher than the gradations that can be output by the display system. When such high gradation data is output to a low gradation display system, the display range is compressed (gradation is crushed), and information loss occurs. In relation to the display of such high gradation data, a tone mapping process is known that reduces the loss of information by converting (compressing) the range in the gradation direction.
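To make the range conversion concrete (as a hedged sketch only, not the method of any document cited here), the following Python fragment maps hypothetical 11-bit luminance values into an 8-bit display range with a simple global logarithmic tone curve; the curve shape and its strength constant are arbitrary illustrative choices:

```python
import numpy as np

def global_tone_map(image, in_peak=2047, out_peak=255):
    """Compress a high-gradation image into a low-gradation display
    range with a simple logarithmic tone curve (global tone mapping)."""
    x = image.astype(np.float64) / in_peak        # normalize to [0, 1]
    y = np.log1p(100.0 * x) / np.log1p(100.0)     # log curve, still in [0, 1]
    return np.round(y * out_peak).astype(np.uint8)

# Hypothetical 11-bit test data covering dark and bright regions.
raw = np.array([[0, 100, 500], [1000, 1500, 2047]], dtype=np.uint16)
display = global_tone_map(raw)
```

A local tone mapping process would additionally vary such a curve per neighbourhood, which is precisely what makes it costly on high-resolution images, as discussed below.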
 しかしながら、一般的に局所的トーンマッピング(Local Tone Mapping)処理と称されるような、画像における局所毎に異なる処理を行うトーンマッピング処理は、計算コストが高い。係るトーンマッピング処理を、高解像度であるパンクロマチック画像やパンシャープン画像に適用すると、処理負荷が高い。 However, a tone mapping process that performs different processing for each local area in an image, which is generally called a local tone mapping process, has a high calculation cost. When such tone mapping processing is applied to a high-resolution panchromatic image or pan-sharpened image, the processing load is high.
 これに対して、例えば、低解像度であるマルチスペクトル画像にトーンマッピング処理を行ってから、パンシャープン処理を行うことにより、計算コストを低減することが考えられる。 On the other hand, it is conceivable to reduce the calculation cost by, for example, performing tone mapping processing on the low-resolution multispectral image first and then performing pan-sharpening processing.
 しかしながら、このような方法を採用した場合、低解像度のマルチスペクトル画像に対するトーンマッピング処理によって、画像中の輝度分布が変化してしまうことから、パンシャープン処理の結果が不適切になる場合がある。 However, when such a method is adopted, the tone mapping processing on the low-resolution multispectral image changes the luminance distribution in the image, so the result of the pan-sharpening processing may be inappropriate.
 一般的に、パンクロマチック画像の輝度(輝度分布)と、マルチスペクトル画像の輝度(輝度分布)との間には、相関関係がある。即ち、パンクロマチック画像において高輝度である部分は、マルチスペクトル画像においても高輝度であり、その逆も成り立つことが想定されている。これに対して、マルチスペクトル画像にトーンマッピング処理を実行した場合、マルチスペクトル画像の輝度分布は、例えば、ある特定の輝度平均を中心とした一定範囲に圧縮される。即ち、元のマルチスペクトル画像と、トーンマッピング済みのマルチスペクトル画像とは、輝度分布の形状が異なる。結果として、マルチスペクトル画像にトーンマッピング処理を実行した場合、マルチスペクトル画像の輝度と、パンクロマチック画像の輝度との間の相関関係が失われる。 Generally, there is a correlation between the luminance (luminance distribution) of a panchromatic image and the luminance (luminance distribution) of a multispectral image. That is, it is assumed that a portion having high luminance in the panchromatic image has high luminance in the multispectral image and vice versa. On the other hand, when tone mapping processing is performed on a multispectral image, the luminance distribution of the multispectral image is compressed, for example, to a certain range centered on a specific luminance average. That is, the shape of the luminance distribution is different between the original multispectral image and the tone-mapped multispectral image. As a result, when a tone mapping process is performed on a multispectral image, the correlation between the luminance of the multispectral image and the luminance of the panchromatic image is lost.
 ある種のパンシャープン処理においては、パンクロマチック画像とマルチスペクトル画像の輝度とを対応付ける(輝度マッチング)処理が実行される場合がある。上記より、トーンマッピング済みのマルチスペクトル画像の輝度と、パンクロマチック画像の輝度と間では相関関係が失われていることから、適切な輝度マッチングを行うことができない。この結果、予めトーンマッピングを実行した低解像度のマルチスペクトル画像と、パンクロマチック画像とを用いた場合、適切な階調のパンシャープン画像が生成されない可能性がある。 In a certain type of pan-sharpening processing, there is a case where processing for associating the panchromatic image with the luminance of the multispectral image (luminance matching) is executed. As described above, since the correlation is lost between the luminance of the tone-mapped multispectral image and the luminance of the panchromatic image, appropriate luminance matching cannot be performed. As a result, when a low-resolution multispectral image on which tone mapping has been performed in advance and a panchromatic image are used, there is a possibility that a pan-sharpened image having an appropriate gradation may not be generated.
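Such luminance matching is often realized with histogram matching; the sketch below is one hedged illustration (the patent does not prescribe this particular algorithm), remapping panchromatic luminance values so that their cumulative distribution follows that of the multispectral luminance. Both inputs are assumed to be flattened 1-D arrays of luminance values:

```python
import numpy as np

def histogram_match(source, reference):
    """Map the values of `source` so that their distribution
    approximates that of `reference` (classic histogram matching)."""
    s_values, s_idx, s_counts = np.unique(source, return_inverse=True,
                                          return_counts=True)
    r_values, r_counts = np.unique(reference, return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size      # source cumulative histogram
    r_cdf = np.cumsum(r_counts) / reference.size   # reference cumulative histogram
    matched = np.interp(s_cdf, r_cdf, r_values)    # invert the reference CDF
    return matched[s_idx]

pan = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])    # panchromatic luminance
ms = np.array([0.1, 0.3, 0.3, 0.5, 0.5, 0.9])     # multispectral luminance
pan_matched = histogram_match(pan, ms)
```

After matching, the panchromatic luminance occupies the same range as the multispectral luminance, which is the precondition the text describes for a meaningful comparison of the two.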
 これに対して、以下の各実施形態を用いて例示する本発明に係る技術は、例えば、トーンマッピング処理をパンシャープン処理に組み込むことにより、トーンマッピング処理において扱われるデータ量を削減可能である。更に、本発明に係る技術は、例えば、パンシャープン処理において高精度な輝度マッチング処理を実現可能である。本発明に係る技術は、トーンマッピング処理を施したパンシャープン画像を効率よく作成可能である。以下、本発明について各実施形態の記載を用いて詳細に説明する。 In contrast, the technology according to the present invention, exemplified using the following embodiments, can reduce the amount of data handled in the tone mapping processing by, for example, incorporating the tone mapping processing into the pan-sharpening processing. Furthermore, the technology according to the present invention can realize, for example, highly accurate luminance matching processing in the pan-sharpening processing. The technology according to the present invention can thus efficiently create a pan-sharpened image subjected to tone mapping processing. Hereinafter, the present invention will be described in detail using the description of each embodiment.
 <第1の実施形態>
 以下、本発明を実施可能な形態について図面を参照して詳細に説明する。以下の各実施形態に記載されている構成は例示であり、本発明の技術範囲はそれらには限定されない。
<First Embodiment>
Hereinafter, embodiments in which the present invention can be implemented will be described in detail with reference to the drawings. The configurations described in the following embodiments are examples, and the technical scope of the present invention is not limited to them.
 なお、以下において説明する画像処理装置は、単体の装置(物理的あるいは仮想的な装置)を用いて構成されてもよい。また、係る画像処理装置は、複数の装置(物理的あるいは仮想的な装置)を用いて実現されてもよい。即ち、この場合、係る画像処理装置は、複数の装置により構成されるシステムとして実現される。係る画像処理装置を構成する複数の装置の間は、有線、無線、又はそれらを任意に組み合わせた通信ネットワーク(通信回線)により通信可能に接続されてもよい。なお、係る通信ネットワークは、物理的な通信ネットワークであってもよく、仮想的な通信ネットワークであってもよい。 Note that the image processing apparatus described below may be configured using a single apparatus (physical or virtual apparatus). Such an image processing apparatus may be realized using a plurality of apparatuses (physical or virtual apparatuses). That is, in this case, the image processing apparatus is realized as a system including a plurality of apparatuses. A plurality of devices constituting such an image processing device may be communicably connected via a communication network (communication line) that is wired, wireless, or an arbitrary combination thereof. The communication network may be a physical communication network or a virtual communication network.
  [構成]
 次に、図1を参照して本実施形態における画像処理装置100の構成について説明する。図1は、画像処理装置100の機能的な構成を例示するブロック図である。図1中の矢印の方向は、画像データの転送方向の一例を示す表示であり、本実施形態を限定するものではない。
[Configuration]
Next, the configuration of the image processing apparatus 100 according to the present embodiment will be described with reference to FIG. 1. FIG. 1 is a block diagram illustrating the functional configuration of the image processing apparatus 100. The direction of the arrows in FIG. 1 indicates an example of the transfer direction of image data and does not limit the present embodiment.
 本実施形態における画像処理装置100は、低解像度のマルチスペクトル画像、及び、高解像度のパンクロマチック画像を受け付ける入力装置200と通信可能に接続される。また、画像処理装置100は、当該画像処理装置100により生成されたパンシャープン画像を出力する出力装置300と通信可能に接続される。画像処理装置100、入力装置200、及び、出力装置300の間を接続する通信方法は適宜選択されてよい。 The image processing apparatus 100 according to the present embodiment is communicably connected to an input apparatus 200 that receives a low-resolution multispectral image and a high-resolution panchromatic image. The image processing apparatus 100 is communicably connected to an output apparatus 300 that outputs a pan-sharpened image generated by the image processing apparatus 100. A communication method for connecting the image processing apparatus 100, the input apparatus 200, and the output apparatus 300 may be selected as appropriate.
 次に、画像処理装置100、入力装置200、及び、出力装置300を構成する要素について説明する。 Next, elements constituting the image processing apparatus 100, the input apparatus 200, and the output apparatus 300 will be described.
 まず、画像処理装置100は、拡大処理部101、輝度抽出部102、トーンマッピング処理部103、拡大処理部104、局所的特徴抽出部105、出力輝度算出部106、及び、画像合成部107を含む。画像処理装置100を構成するこれらの構成要素の間は、任意の通信方法を用いて通信可能に接続されていてよい。 First, the image processing apparatus 100 includes an enlargement processing unit 101, a luminance extraction unit 102, a tone mapping processing unit 103, an enlargement processing unit 104, a local feature extraction unit 105, an output luminance calculation unit 106, and an image composition unit 107. These components constituting the image processing apparatus 100 may be communicably connected to one another using an arbitrary communication method.
 拡大処理部101は、後述する入力装置200からマルチスペクトル画像(「第1の画像」と称する場合がある)を受け付ける。係るマルチスペクトル画像は、例えば、上記したような複数のスペクトル情報(色情報)を含むカラー画像である。マルチスペクトル画像は、画像処理装置100が別途入力装置200から受け付けるパンクロマチック画像(「第2の画像」と称する場合がある)よりも空間解像度が低い画像である。 The enlargement processing unit 101 receives a multispectral image (sometimes referred to as a “first image”) from the input device 200 described later. Such a multispectral image is, for example, a color image including a plurality of pieces of spectral information (color information) as described above. The multispectral image is an image having a lower spatial resolution than a panchromatic image (sometimes referred to as a “second image”) that the image processing apparatus 100 separately receives from the input device 200.
 拡大処理部101は、入力装置200から受け付けたマルチスペクトル画像の解像度を変換する。具体的には、拡大処理部101は、例えば、マルチスペクトル画像の解像度を、入力装置200から受け付けるパンクロマチック画像と同じ解像度になるようアップサンプリングする。画像の解像度を変換する方法(アップサンプリングする方法)としては、周知の方法が採用されてもよい。 The enlargement processing unit 101 converts the resolution of the multispectral image received from the input device 200. Specifically, the enlargement processing unit 101 up-samples, for example, the resolution of the multispectral image so as to be the same resolution as the panchromatic image received from the input device 200. As a method for converting the resolution of an image (upsampling method), a known method may be employed.
 輝度抽出部102は、入力されたマルチスペクトル画像から、当該画像に関する階調情報を抽出する。輝度抽出部102は、例えば、入力されたマルチスペクトル画像の輝度に関する階調情報を抽出してよい。以下、画像の輝度に関する階調情報を、単に「輝度情報」と称する場合がある。輝度抽出部102は、例えば、マルチスペクトル画像を構成する画素毎に輝度(輝度値)を抽出し、それらの集合を輝度情報としてもよい。この場合、係る輝度情報は、例えば、入力されたマルチスペクトル画像に含まれる画素毎に、当該画素の輝度のみを抽出して生成した輝度画像であってもよい。マルチスペクトル画像の輝度のレンジ(例えば量子化ビット数)と、パンクロマチック画像の輝度のレンジとは同じであってもよく、異なっていてもよい。輝度抽出部102は、画像に関する階調情報として、「輝度」以外に、当該画像を構成する画素毎の濃淡を表現可能な他の情報を抽出してもよい。 The luminance extraction unit 102 extracts gradation information related to the image from the input multispectral image. For example, the luminance extraction unit 102 may extract gradation information related to the luminance of the input multispectral image. Hereinafter, the gradation information regarding the luminance of the image may be simply referred to as “luminance information”. The luminance extraction unit 102 may extract the luminance (luminance value) for each pixel constituting the multispectral image, for example, and use the set as luminance information. In this case, the luminance information may be, for example, a luminance image generated by extracting only the luminance of each pixel included in the input multispectral image. The luminance range of the multispectral image (for example, the number of quantization bits) and the luminance range of the panchromatic image may be the same or different. The luminance extraction unit 102 may extract other information that can express the light and shade of each pixel constituting the image, other than “luminance”, as the gradation information regarding the image.
 トーンマッピング処理部103は、輝度抽出部102においてマルチスペクトル画像から抽出された、低解像度な(空間解像度が低い)輝度情報に対してトーンマッピングを実行することにより、輝度のレンジを圧縮する(レンジ圧縮処理)。トーンマッピング処理部103は、例えば、輝度抽出部102において抽出された輝度情報に含まれる輝度値を、ある輝度レンジに収まるように変換(圧縮)することにより、レンジ圧縮処理を実行してもよい。トーンマッピング処理部103は、例えば、輝度抽出部102がマルチスペクトル画像から抽出した輝度画像に対して係るレンジ圧縮処理を実行してもよい。以下、レンジ圧縮処理が実行された輝度情報(輝度画像)を「第1の階調変換画像」と称する場合がある。 The tone mapping processing unit 103 compresses the luminance range (range compression processing) by executing tone mapping on the low-resolution (low spatial resolution) luminance information extracted from the multispectral image by the luminance extraction unit 102. The tone mapping processing unit 103 may execute the range compression processing by, for example, converting (compressing) the luminance values included in the luminance information extracted by the luminance extraction unit 102 so that they fall within a certain luminance range. The tone mapping processing unit 103 may, for example, execute the range compression processing on the luminance image extracted from the multispectral image by the luminance extraction unit 102. Hereinafter, the luminance information (luminance image) on which the range compression processing has been executed may be referred to as a "first gradation-converted image".
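The embodiment leaves the concrete mapping of the tone mapping processing unit 103 open; as one minimal sketch (assuming normalized luminance and an arbitrarily chosen target sub-range), a linear range compression could look like:

```python
import numpy as np

def compress_range(luminance, lo=0.05, hi=0.95):
    """Squeeze luminance values into the sub-range [lo, hi] of [0, 1]
    so that later processing has headroom at both ends."""
    l_min, l_max = luminance.min(), luminance.max()
    norm = (luminance - l_min) / (l_max - l_min)   # linear stretch to [0, 1]
    return lo + norm * (hi - lo)                   # compress into [lo, hi]

# Hypothetical low-resolution luminance values from an 11-bit sensor.
lum = np.array([[120.0, 800.0], [1500.0, 2000.0]])
compressed = compress_range(lum)
```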
 拡大処理部104は、トーンマッピング処理部103においてレンジ圧縮された輝度情報(上記第1の階調変換画像)の解像度を変換する。拡大処理部104は、上記拡大処理部101と同様、入力装置200から受け付けるパンクロマチック画像と同じ解像度になるように、輝度情報をアップサンプリングしてもよい。 The enlargement processing unit 104 converts the resolution of the luminance information (the first gradation conversion image) that has undergone range compression in the tone mapping processing unit 103. Similar to the enlargement processing unit 101, the enlargement processing unit 104 may upsample the luminance information so as to have the same resolution as the panchromatic image received from the input device 200.
 局所的特徴抽出部105は、入力装置200から入力されたパンクロマチック画像における局所的な輝度の変化を表す情報(輝度変化情報)に基づいて、局所的特徴情報を生成する。係る輝度変化情報は、例えば、輝度の変化の程度を表す値であってもよい。具体的には、局所的特徴抽出部105は、例えば、入力装置200から入力されたパンクロマチック画像の輝度と、マルチスペクトル画像の輝度とのマッチング(輝度マッチング)を行う。そして、局所的特徴抽出部105は、係る輝度マッチングの結果に基づいて、パンクロマチック画像における局所的な輝度変化情報をマルチスペクトル画像の輝度レンジに変換する。係る変換後のパンクロマチック画像を、「第2の階調変換画像」と称する場合がある。局所的特徴抽出部105は、第2の階調変換画像に基づいて、局所的特徴情報を生成する。 The local feature extraction unit 105 generates local feature information based on information (luminance change information) representing a local luminance change in the panchromatic image input from the input device 200. For example, the luminance change information may be a value indicating the degree of change in luminance. Specifically, the local feature extraction unit 105 performs matching (luminance matching) between the luminance of the panchromatic image input from the input device 200 and the luminance of the multispectral image, for example. Then, the local feature extraction unit 105 converts local luminance change information in the panchromatic image into a luminance range of the multispectral image based on the result of the luminance matching. Such a converted panchromatic image may be referred to as a “second gradation converted image”. The local feature extraction unit 105 generates local feature information based on the second tone conversion image.
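The patent does not fix how the local luminance-change information is computed; one hedged, Retinex-style sketch is to take the ratio of each panchromatic pixel to its smoothed neighbourhood (the box filter and the sample values below are illustrative assumptions, not the claimed method):

```python
import numpy as np

def box_blur(img, k=3):
    """Crude local mean with edge padding (stand-in for a proper smoothing filter)."""
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

# Hypothetical panchromatic luminance, assumed already matched to the
# multispectral luminance range by the luminance matching step.
pan = np.array([[0.2, 0.8, 0.2],
                [0.8, 0.2, 0.8],
                [0.2, 0.8, 0.2]])
# Local feature: how much each pixel deviates from its smoothed neighbourhood.
local_feature = pan / np.maximum(box_blur(pan), 1e-6)
```

Pixels brighter than their surroundings get a ratio above 1, darker ones below 1, so the array carries only the local detail of the panchromatic image.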
 出力輝度算出部106は、局所的特徴抽出部105において生成された局所的特徴と、拡大処理部104においてアップサンプリングされた輝度情報(レンジ圧縮済み)とを用いて、高解像度の輝度情報を生成する。出力輝度算出部106は、例えば、局所的特徴抽出部105において生成された局所的特徴と、拡大処理部104においてアップサンプリングされた輝度情報とを乗算することにより、係る高解像度の輝度情報を算出してもよい。出力輝度算出部106により生成される高解像度の輝度情報は、パンクロマチック画像の局所的特徴と、輝度レンジが圧縮された大域的特徴とを含む。 The output luminance calculation unit 106 generates high-resolution luminance information using the local feature information generated by the local feature extraction unit 105 and the luminance information (already range-compressed) upsampled by the enlargement processing unit 104. The output luminance calculation unit 106 may calculate the high-resolution luminance information by, for example, multiplying the local feature information generated by the local feature extraction unit 105 by the luminance information upsampled by the enlargement processing unit 104. The high-resolution luminance information generated by the output luminance calculation unit 106 includes the local features of the panchromatic image and the global features whose luminance range has been compressed.
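Under the multiplication option described above, the combination step itself is a per-pixel product; a minimal sketch with hypothetical values (both inputs assumed to be already at panchromatic resolution):

```python
import numpy as np

# Hypothetical inputs: local contrast ratios derived from the panchromatic
# image, and the upsampled, range-compressed multispectral luminance.
local_feature = np.array([[0.8, 1.2],
                          [1.0, 1.5]])
global_lum = np.array([[0.4, 0.4],
                       [0.6, 0.6]])

# The output luminance combines the panchromatic image's local detail
# with the compressed global brightness by element-wise multiplication.
out_lum = local_feature * global_lum
```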
 画像合成部107は、拡大処理部101においてアップサンプリングされたマルチスペクトル画像と、出力輝度算出部106において生成された高解像度の輝度情報とに基づいてパンシャープン画像を生成する。画像合成部107は、例えば、拡大処理部101においてアップサンプリングされたマルチスペクトル画像の輝度情報を、出力輝度算出部106において生成された高解像度の輝度情報に置き換えることにより、パンシャープン画像を合成してもよい。この場合、画像合成部107は、拡大処理部101においてアップサンプリングされたマルチスペクトル画像の色情報を用いて、係るパンシャープン画像の色味の成分(色情報)を設定してもよい。以下、パンシャープン画像を「合成画像」と称する場合がある。なお、パンシャープン処理を実行する具体的な方法は、上記に限定されない。 The image composition unit 107 generates a pan-sharpened image based on the multispectral image upsampled by the enlargement processing unit 101 and the high-resolution luminance information generated by the output luminance calculation unit 106. The image composition unit 107 may synthesize the pan-sharpened image by, for example, replacing the luminance information of the multispectral image upsampled by the enlargement processing unit 101 with the high-resolution luminance information generated by the output luminance calculation unit 106. In this case, the image composition unit 107 may set the color components (color information) of the pan-sharpened image using the color information of the multispectral image upsampled by the enlargement processing unit 101. Hereinafter, the pan-sharpened image may be referred to as a "composite image". The specific method for executing the pan-sharpening processing is not limited to the above.
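One hedged way to realize the luminance replacement (not the only pan-sharpening variant the text allows) is to rescale each color band by the ratio of the new luminance to the old one, which keeps the band ratios, i.e. the multispectral color information, intact:

```python
import numpy as np

# Hypothetical upsampled multispectral bands (H, W, 3); one pixel for brevity.
ms_up = np.array([[[0.2, 0.4, 0.6]]])
old_lum = ms_up.mean(axis=2)          # simple stand-in luminance of the MS image
new_lum = np.array([[0.8]])           # high-resolution output luminance

# Scale every band so the pixel takes on the new luminance while the
# ratios between bands (the MS color information) are preserved.
pansharpened = ms_up * (new_lum / np.maximum(old_lum, 1e-6))[..., None]
```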
 入力装置200は、マルチスペクトル画像入力部201と、パンクロマチック画像入力部202と、を含む。係る入力装置200は、マルチスペクトル画像及びパンクロマチック画像を撮像する画像生成装置(不図示)と接続されてもよい。また、入力装置200は、係る画像生成装置を含んでもよい。入力装置200は、入力として受け付けたマルチスペクトル画像及びパンクロマチック画像を、画像処理装置100に供給(提供)する。 The input device 200 includes a multispectral image input unit 201 and a panchromatic image input unit 202. The input device 200 may be connected to an image generation device (not shown) that captures a multispectral image and a panchromatic image. The input device 200 may include such an image generation device. The input device 200 supplies (provides) the multispectral image and panchromatic image received as input to the image processing device 100.
 マルチスペクトル画像入力部201は、例えば、マルチスペクトルセンサ等を用いて撮像されたマルチスペクトル画像を入力として受け付け、係るマルチスペクトル画像を画像処理装置100に供給する。係るマルチスペクトル画像は、例えば、画像処理装置100における拡大処理部101、及び、輝度抽出部102に送られてもよい。 The multispectral image input unit 201 receives, as an input, a multispectral image captured using, for example, a multispectral sensor, and supplies the multispectral image to the image processing apparatus 100. Such a multispectral image may be sent to the enlargement processing unit 101 and the luminance extraction unit 102 in the image processing apparatus 100, for example.
 パンクロマチック画像入力部202は、例えば、パンクロマチックセンサ等を用いて撮像されたパンクロマチック画像を入力として受け付け、係るパンクロマチック画像を画像処理装置100に供給する。係るパンクロマチック画像は、例えば、画像処理装置100における局所的特徴抽出部105に送られてもよい。 The panchromatic image input unit 202 receives, for example, a panchromatic image captured using a panchromatic sensor or the like, and supplies the panchromatic image to the image processing apparatus 100. Such a panchromatic image may be sent to the local feature extraction unit 105 in the image processing apparatus 100, for example.
 出力装置300は、画像処理装置100において生成されたパンシャープン画像を受け付け、図示しない表示系(表示装置等)に出力する。出力装置300は、例えば、受け付けたパンシャープン画像を、コンピュータ等の情報処理装置におけるグラフィック処理デバイス、各種ディスプレイ装置、プリンタ等の表示系に出力してもよい。出力装置300は、係る表示系を含んでもよい。 The output device 300 receives the pan-sharpened image generated by the image processing device 100 and outputs it to a display system (display device or the like) not shown. The output device 300 may output the received pan-sharpened image to a display system such as a graphic processing device, various display devices, or a printer in an information processing device such as a computer. The output device 300 may include such a display system.
  [動作]
 次に、図1及び図3を参照して、本実施形態における画像処理装置100の動作について詳細に説明する。図3は、本実施形態における画像処理装置100の動作を例示するフローチャートである。図3に例示するフローチャートは、一つの具体例であり、各処理の順序は、処理結果に影響を与えない範囲で適宜入れ替え可能である。また、図3に例示するフローチャートの各処理は、処理結果に影響を与えない範囲で、並行して実行されてもよい。
[Operation]
Next, the operation of the image processing apparatus 100 according to the present embodiment will be described in detail with reference to FIGS. 1 and 3. FIG. 3 is a flowchart illustrating the operation of the image processing apparatus 100 according to this embodiment. The flowchart illustrated in FIG. 3 is one specific example, and the order of the processes can be changed as appropriate within a range that does not affect the processing results. In addition, the processes in the flowchart illustrated in FIG. 3 may be executed in parallel as long as the processing results are not affected.
 まず、入力装置200が入力をして受け付けた低解像度マルチスペクトル画像と高解像度パンクロマチック画像とが、画像処理装置100に提供される(ステップS3001)。以下、係るマルチスペクトル画像を”M”、パンクロマチック画像を”P”と表す場合がある。 First, the low-resolution multispectral image and the high-resolution panchromatic image received by the input device 200 are provided to the image processing device 100 (step S3001). Hereinafter, such a multispectral image may be represented as “M” and a panchromatic image may be represented as “P”.
 次に、画像処理装置100(特には拡大処理部101)が、提供されたマルチスペクトル画像の解像度を変換する(ステップS3002)。拡大処理部101は、例えば、バイリニア法等を用いることにより、マルチスペクトル画像を、パンクロマチック画像と同じ解像度までアップサンプリングしてもよい。なお、拡大処理部101はバイリニア法以外の方法(たとえば、バイキュービック補完法等)を適宜用いてよく、アップサンプリングの方法は特定の手法には限定されない。以下、アップサンプリングされたマルチスペクトル画像を”U(M)”と表す場合がある。なお、ステップS3002の処理と、以下において説明するステップS3003乃至ステップS3007の処理とは、並行して実行されてもよい。 Next, the image processing apparatus 100 (particularly the enlargement processing unit 101) converts the resolution of the provided multispectral image (step S3002). The enlargement processing unit 101 may upsample the multispectral image to the same resolution as the panchromatic image by using, for example, a bilinear method. The enlargement processing unit 101 may appropriately use a method other than the bilinear method (for example, a bicubic interpolation method), and the upsampling method is not limited to a specific method. Hereinafter, the up-sampled multispectral image may be represented as “U (M)”. Note that the processing in step S3002 and the processing in steps S3003 to S3007 described below may be executed in parallel.
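As a hedged illustration of step S3002, a self-contained bilinear upsampler for a single band might look as follows (real implementations would typically use an optimized library routine; the integer-factor, corner-aligned formulation is an assumption of this sketch):

```python
import numpy as np

def bilinear_upsample(img, factor):
    """Upsample a 2-D array by an integer factor with bilinear interpolation."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * factor)        # sample rows in source coords
    xs = np.linspace(0, w - 1, w * factor)        # sample cols in source coords
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    wy = (ys - y0)[:, None]                       # vertical interpolation weight
    wx = (xs - x0)[None, :]                       # horizontal interpolation weight
    a = img[np.ix_(y0, x0)]
    b = img[np.ix_(y0, x0 + 1)]
    c = img[np.ix_(y0 + 1, x0)]
    d = img[np.ix_(y0 + 1, x0 + 1)]
    return (a * (1 - wy) * (1 - wx) + b * (1 - wy) * wx
            + c * wy * (1 - wx) + d * wy * wx)

m = np.array([[0.0, 1.0], [2.0, 3.0]])   # one band of M
u_m = bilinear_upsample(m, 2)            # U(M): at panchromatic resolution
```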
 Next, the image processing apparatus 100 (specifically, the luminance extraction unit 102) extracts luminance information from the multispectral image M (step S3003). The luminance extraction unit 102 may extract the luminance information "I_M" by, for example, applying a color space conversion to the multispectral image M. Such a conversion may be, for example, a process of converting the color space of the multispectral image "M" into a color space such as HLS (Hue, Lightness, Saturation), YCbCr, or HCS (Hyperspherical Color Space). The method of converting the color space is not limited to a specific technique, and the luminance extraction unit 102 may also extract the luminance information by a known method other than the color space conversion described above. The luminance information "I_M" contains the luminance value of each pixel of the multispectral image M and has the same resolution as the multispectral image "M".
 Next, the image processing apparatus 100 (specifically, the tone mapping processing unit 103) executes tone mapping processing (step S3004). This tone mapping processing is described concretely below. The tone mapping processing unit 103 separates (decomposes) the luminance information "I_M" of the multispectral image "M", extracted by the luminance extraction unit 102, into a global feature component and a local feature component. The global feature component represents, for example, a spatially smooth luminance change in the image, and the local feature component represents, for example, a local luminance change around a certain pixel of interest. As one concrete example of the global feature component, information representing the spatial illumination light contained in the image may be used; as one concrete example of the local feature component, information representing the reflectance of objects contained in the image may be used. In general, the light incident on the human visual system is represented by the product of the illumination light and the reflectance of the object, and human perception (vision) is known to be sensitive to the reflectance of the object. Accordingly, by compressing the illumination light component (range compression), for example, the luminance range of the entire image can be compressed effectively while maintaining the reflectance component that affects visual perception.
 Specifically, the tone mapping processing unit 103 inputs the luminance information "I_M" of the multispectral image as "I" in the following equation (1), and decomposes "I_M" into a global feature component "B(I)" and a local feature component "T(I)".
  x_ij(I) = x_ij(B(I)) × x_ij(T(I))    ... (1)
 Here, in equation (1), "x_ij(I)" is the pixel value (a real number) of the pixel in row "i", column "j" of the image (or luminance information) "I". This pixel value may be the luminance value of that pixel. In equation (1), "I" may be an arbitrary image (or luminance information), "B(I)" represents the global feature component of the image "I", and "T(I)" represents the local feature component of the image "I".
 Specifically, the tone mapping processing unit 103 may obtain the global feature component "B(I_M)" of "I_M" by, for example, applying an edge-preserving smoothing filter to the luminance information "I_M". Assuming that the illumination light component contained in the image varies more gradually than the reflectance component of the objects, the illumination light component can be extracted with such an edge-preserving smoothing filter. The edge-preserving smoothing filter may be any filter capable of averaging (smoothing) the image while preserving the edge information it contains; for example, a k-nearest-neighbor averaging filter, a bilateral filter, a median filter, or a WLS filter (Weighted Least Squares filter) may be employed. Because an edge-preserving smoothing filter preserves edges (does not blunt them) when the global feature component is extracted, the occurrence of halos in the tone mapping processing can be suppressed.
 The tone mapping processing unit 103 may obtain the local feature component "T(I_M)" using, for example, the following equation (2). In equation (2), "I" represents an arbitrary image (or luminance information); that is, replacing "I" in equation (2) with "I_M" yields the local feature component "T(I_M)". According to equation (2), the local feature component "T(I_M)" is obtained from the ratio (luminance ratio) between the luminance of a given pixel in the luminance information "I_M" and the global feature. The local feature component "T(I_M)" can thus be regarded as representing, for each pixel, the degree of change of the luminance value relative to the global feature component "B(I_M)". For example, when the global feature component "B(I_M)" represents the illumination light component of the multispectral image, the local feature component "T(I_M)" can be regarded as representing the reflectance of the objects.
  x_ij(T(I)) = x_ij(I) / x_ij(B(I))    ... (2)
 Next, as shown in the following equation (3), the tone mapping processing unit 103 compresses the range (the range of pixel values) of the global feature component "B(I_M)" by applying a smooth range compression function "f" to each of its elements (pixels) "x_ij(B(I_M))". The tone mapping processing unit 103 then multiplies the range-compressed pixel value of each element by the pixel value of the corresponding element "x_ij(T(I_M))" of the local feature component separated beforehand by equation (1).
  x_ij(R(I)) = f(x_ij(B(I))) × x_ij(T(I))    ... (3)
 In this way, the tone mapping processing unit 103 obtains the tone-mapped luminance information "R(I_M)" of the multispectral image (the first gradation-converted image). In the present embodiment, a function such as the following equation (4) is used as the compression function "f". The compression function illustrated in equation (4) takes human visual characteristics into account and can perform visually equivalent compression regardless of the magnitude of the luminance value of each pixel. The tone mapping processing unit 103 calculates the compressed luminance value by substituting the luminance value of each pixel of the multispectral image into equation (4).
  (Equation (4): the compression function "f" reflecting human visual characteristics; the formula image is not reproduced here.)
 Note that the tone mapping processing unit 103 may adopt a function other than equation (4) as the compression function "f". That is, an appropriate compression function "f" may be selected according to the purpose of the tone mapping and other factors.
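The flow of equations (1) to (3) can be sketched as follows. Since the embodiment leaves the concrete filter and compression function open, this example substitutes a simple mean filter for the edge-preserving smoothing filter and a gamma-style power curve for "f"; both choices are assumptions made for illustration only:

```python
import numpy as np

def tone_map(I, k=3, alpha=0.5):
    """Decompose luminance I into a global component B (smoothing) and a
    local component T = I / B, range-compress B with f, return f(B) * T."""
    pad = k // 2
    padded = np.pad(I, pad, mode='edge')
    h, w = I.shape
    B = np.empty((h, w))
    for i in range(h):            # mean filter as a stand-in for an
        for j in range(w):        # edge-preserving smoothing filter
            B[i, j] = padded[i:i + k, j:j + k].mean()
    T = I / B                     # equation (2): per-pixel luminance ratio
    f = lambda x: x ** alpha      # stand-in compression curve for f
    return f(B) * T               # equation (3)

# On a uniform patch, B equals the patch value and T is 1 everywhere,
# so the output is simply the compressed value f(B) = 4**0.5 = 2.
R = tone_map(np.full((4, 4), 4.0))
```

With an edge-preserving filter in place of the mean filter, edges survive in "B" and the halo suppression described above is obtained; the structure of the computation is unchanged.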
 The image processing apparatus 100 (specifically, the enlargement processing unit 104) converts the resolution of the luminance information "R(I_M)" of the multispectral image that was range-compressed by the tone mapping processing unit 103 (step S3005). Specifically, the enlargement processing unit 104 upsamples the luminance information "R(I_M)" to the same resolution as the panchromatic image "P". The concrete upsampling method may be the same as that of the enlargement processing unit 101. Hereinafter, the upsampled luminance information is denoted "U∘R(I_M)".
 The image processing apparatus 100 (specifically, the local feature extraction unit 105) also generates, from the luminance information "I_M" extracted by the luminance extraction unit 102 and the panchromatic image "P", local feature information (a feature amount) contained in the high-resolution image (step S3006). The processing of step S3006 described below may be executed in parallel with the processing of steps S3004 and S3005 described above.
 The local feature extraction unit 105 generates the local feature information by, for example, executing the following processing. First, the local feature extraction unit 105 performs luminance matching (histogram matching) using the luminance histograms of the panchromatic image "P" and the luminance information "I_M" of the multispectral image. The local feature extraction unit 105 thereby obtains a panchromatic image "P^" (the second gradation-converted image) in which the pixel value (luminance value) of each pixel has been converted to match the luminance range of "I_M". At this point, because the multispectral image "M" input to the local feature extraction unit 105 has not been tone-mapped, the luminance distribution of the luminance information "I_M" of the multispectral image does not deviate greatly from that of the panchromatic image "P". That is, the correlation between the luminance distribution of the luminance information "I_M" of the multispectral image and the luminance distribution of the panchromatic image "P" is maintained, so the local feature extraction unit 105 can execute the luminance matching accurately.
 By contrast, it is difficult to accurately luminance-match the luminance information of a tone-mapped multispectral image against the panchromatic image "P". This is because the luminance distribution of a tone-mapped multispectral image differs in shape from that of the original multispectral image, so the correlation between the luminance information of the panchromatic image "P" and the luminance information of the multispectral image "M" is lost. As the concrete processing method for the luminance matching, a known method (for example, the method disclosed in Patent Document 6 or a method using a cumulative distribution function) can be employed.
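The cumulative-distribution-function approach to luminance matching mentioned above can be sketched as follows (a minimal NumPy sketch; the function and variable names are assumptions for this example):

```python
import numpy as np

def histogram_match(source, reference):
    """Remap source pixel values so that their cumulative distribution
    matches that of reference (CDF-based histogram matching)."""
    s_vals, s_idx, s_cnt = np.unique(source.ravel(),
                                     return_inverse=True, return_counts=True)
    r_vals, r_cnt = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_cnt) / source.size      # quantile of each source value
    r_cdf = np.cumsum(r_cnt) / reference.size
    # For each source quantile, take the reference value at the same quantile
    matched = np.interp(s_cdf, r_cdf, r_vals)
    return matched[s_idx].reshape(source.shape)

# A uniform 0..3 source matched against a uniform 10..13 reference
# is shifted onto the reference's value range.
P = np.array([[0.0, 1.0], [2.0, 3.0]])
I_M = np.array([[10.0, 11.0], [12.0, 13.0]])
P_hat = histogram_match(P, I_M)
```

Here "P" would play the role of the panchromatic image and "I_M" the multispectral luminance, so "P_hat" corresponds to the second gradation-converted image "P^".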
 Next, the local feature extraction unit 105 calculates the local feature information (local feature amount) "μ" contained in this panchromatic image "P^". The local feature information "μ" may represent, for example, the degree of local luminance change in the panchromatic image "P^" (for example, the degree of luminance change at a pixel of interest). Specifically, when the result of smoothing the panchromatic image "P^" is denoted "S(P^)", the local feature information "μ" may be, for example, the luminance ratio between the luminance of a given pixel in "P^" and the luminance of the corresponding pixel in "S(P^)".
 This will be described concretely with reference to FIG. 2 and the following equation (5). FIG. 2 schematically illustrates an image "I" having the same resolution as the panchromatic image "P". The image "I" may be, for example, an image obtained by upsampling the luminance information "I_M" of the multispectral image to the resolution of the panchromatic image. In this case, the image "I" having the same resolution as the panchromatic image "P" can be regarded as an image in which each pixel of the luminance information "I_M" of the low-resolution multispectral image is subdivided into "C_L(I) × C_P(I)" pixels.
 "C_P(I)" is the pixel-spacing ratio between the image "I" and the panchromatic image "P" in the pixel direction (horizontal direction). "C_P(I)" is calculated, for example, as (total number of pixels of the panchromatic image "P" in the pixel direction) / (total number of pixels of the image "I" in the pixel direction before subdivision).
 "C_L(I)" is the pixel-spacing ratio between the image "I" and the panchromatic image "P" in the line direction (vertical direction). "C_L(I)" is calculated, for example, as (total number of pixels of the panchromatic image "P" in the line direction) / (total number of pixels of the image "I" in the line direction before subdivision).
 Hereinafter, in FIG. 2, for the image "I", the pixel value of the pixel in row "m", column "n" within the region corresponding to the pixel in row "i", column "j" of the luminance information "I_M" of the multispectral image is denoted "y^mn_ij(I)".
  (Equation (5): the subdivision notation "y^mn_ij(I)" illustrated in FIG. 2; the formula image is not reproduced here.)
 Using the luminance "y^mn_ij(P^)" of a given pixel of the panchromatic image "P^" and the luminance "y^mn_ij(S(P^))" of the corresponding pixel of the smoothed panchromatic image "S(P^)", the local feature amount "μ^mn_ij" is expressed by the following equation (6). That is, the local feature extraction unit 105 may calculate the local feature amount "μ^mn_ij" (a luminance ratio) using equation (6).
  μ^mn_ij = y^mn_ij(P^) / y^mn_ij(S(P^))    ... (6)
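This computation of "μ" can be sketched as the per-pixel ratio between the matched panchromatic image and a smoothed copy of it (a mean filter again stands in for the unspecified smoothing "S"; an assumption for illustration):

```python
import numpy as np

def local_feature(P_hat, k=3):
    """mu = P_hat / S(P_hat): per-pixel luminance ratio of the matched
    panchromatic image to a smoothed copy of itself."""
    pad = k // 2
    padded = np.pad(P_hat, pad, mode='edge')
    h, w = P_hat.shape
    S = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            S[i, j] = padded[i:i + k, j:j + k].mean()
    return P_hat / S

# On a flat image there is no local luminance change, so mu is 1 everywhere.
mu = local_feature(np.full((4, 4), 5.0))
```

Pixels brighter than their neighborhood yield "μ" above 1 and darker pixels yield "μ" below 1, which is what carries the panchromatic detail into the final image.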
 Next, based on the result of step S3005 and the result of step S3006, the image processing apparatus 100 (specifically, the output luminance calculation unit 106) generates high-resolution luminance information (output luminance adjustment information) in which the local feature information is reflected (step S3007). Specifically, the output luminance calculation unit 106 may take, as the output luminance adjustment information, the result of multiplying the high-resolution luminance information "y^mn_ij(U∘R(I_M))" obtained by the enlargement processing unit 104 by the local feature amount "μ^mn_ij" obtained by the local feature extraction unit 105. In this case, the output luminance adjustment information is expressed in the form "μ^mn_ij × y^mn_ij(U∘R(I_M))". As described above, "U∘R(I_M)" is the luminance information obtained by range-compressing the global feature of the luminance "I_M" of the multispectral image "M" in the tone mapping processing unit 103 and then upsampling the result in the enlargement processing unit 104. Luminance information (output luminance adjustment information) is thereby obtained that contains both the local features derived from the panchromatic image "P" and the range-compressed global features derived from the multispectral image "M".
 The image processing apparatus 100 (specifically, the image composition unit 107) generates a pan-sharpened image based on the output luminance adjustment information and the upsampled multispectral image "U(M)" generated by the enlargement processing unit 101 (step S3008).
 Specifically, the image composition unit 107 applies a color space conversion to the multispectral image "U(M)" upsampled by the enlargement processing unit 101, and separates it into luminance information and color information. This color space conversion may be the same as the processing in step S3003, for example. The image composition unit 107 then replaces the luminance information separated from the image "U(M)" with the output luminance adjustment information "μ^mn_ij × y^mn_ij(U∘R(I_M))" generated by the output luminance calculation unit 106. The image composition unit 107 combines the substituted luminance information with the color information separated from "U(M)", and converts the combined image back into the original color space. The image composition unit 107 thereby obtains a pan-sharpened image to which tone mapping processing has been applied, and provides the generated pan-sharpened image to the output device 300.
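As a highly simplified sketch of steps S3007 and S3008, the output luminance "μ × U∘R(I_M)" can be injected into the upsampled multispectral image by per-pixel band scaling. The band mean stands in for a true color-space luminance, and all names are assumptions for this example; the embodiment itself uses a forward and inverse color space conversion:

```python
import numpy as np

def synthesize(UM, mu, UR_IM):
    """Scale each band of the upsampled multispectral image UM (bands last)
    so that its per-pixel luminance becomes mu * UR_IM."""
    new_luma = mu * UR_IM                  # output luminance adjustment info
    old_luma = UM.mean(axis=-1)            # stand-in for extracted luminance
    return UM * (new_luma / old_luma)[..., None]

# Doubling the luminance doubles every band of this flat test image.
UM = np.full((2, 2, 3), 2.0)
out = synthesize(UM, np.full((2, 2), 2.0), np.full((2, 2), 2.0))
```

Scaling all bands by the same per-pixel factor preserves the band ratios, i.e. the color information, which is the same goal the color-space round trip in the text achieves.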
 The output device 300 outputs the pan-sharpened image supplied from the image processing apparatus 100 (step S3009).
 It will now be shown that the luminance information (output luminance adjustment information) "μ^mn_ij × y^mn_ij(U∘R(I_M))" generated by the output luminance calculation unit 106 accurately approximates the result of tone-mapping a pan-sharpened image generated by a known method. In the following, the luminance information of a pan-sharpened image generated by a known method is denoted "I_Q", and the pixel value (luminance value) of each pixel obtained by tone-mapping "I_Q" is denoted "y^mn_ij(R(I_Q))".
 Equation (7) expresses the relationship between the luminance information "I_Q" of a pan-sharpened image generated by a known method and the luminance information "I_M" of the multispectral image.
  y^mn_ij(I_Q) = μ^mn_ij × y^mn_ij(U(I_M))    ... (7)
 In equation (7), "U(I_M)" represents the result of upsampling the luminance information "I_M" of the multispectral image, and "y^mn_ij(U(I_M))" represents the pixel value (luminance value) of a particular pixel of that upsampled result. "μ^mn_ij" represents the local feature amount.
 From equation (7) and equations (1) to (3), the luminance value of the image "R(I_Q)", obtained when tone mapping processing is executed on a pan-sharpened image after that image has been created, is expressed by the following equation (8).
  y^mn_ij(R(I_Q)) = f(y^mn_ij(B(I_Q))) × y^mn_ij(T(I_Q))    ... (8)
 In the image processing apparatus 100 according to the present embodiment, the global feature component "B(I_Q)" of the luminance information "I_Q" of the pan-sharpened image can be approximated by the global feature component "B∘U(I_M)" of "U(I_M)", the result of upsampling the luminance information "I_M" of the multispectral image (equation (9)). The reason is explained below.
  B(I_Q) ≈ B∘U(I_M)    ... (9)
 As shown in equation (7), the luminance information "I_Q" is calculated by multiplying the luminance values of "U(I_M)" by the local luminance change information "μ" ("μ^mn_ij"), which operates at a scale (range) smaller than one pixel of the original multispectral image M. The local luminance change information "μ" is the local feature amount obtained from the high-resolution panchromatic image P; it represents, for example, changes in luminance value within a range smaller than the range represented by one pixel of the (original) multispectral image M before upsampling. The effect (influence) that "μ", which represents luminance changes over such a narrow range, exerts on the global features of the pan-sharpened image is considered to be sufficiently small (for example, small enough to be ignored). As a concrete example, suppose that a luminance component obtained with a smoothing filter is used as the global feature. When the global feature of the luminance information "I_Q" of the pan-sharpened image is calculated, the effect of the local luminance change information "μ" is evened out by the smoothing, so its effect on the global feature becomes small. Consequently, the global feature component "B(I_Q)" obtained from the luminance information of the pan-sharpened image can be approximated by the global feature component "B∘U(I_M)" of the image "U(I_M)" obtained by upsampling the multispectral image.
 Hereinafter, the result of tone-mapping the luminance information "I_M" of the multispectral image "M" after upsampling it is denoted "R∘U(I_M)". First, when the approximation of equation (9) is applied to equation (8), the following equation (10) is obtained in view of equations (1) to (3).
  y^mn_ij(R(I_Q)) ≈ μ^mn_ij × y^mn_ij(R∘U(I_M))    ... (10)
 In many cases, the luminance range of an image does not change significantly before and after upsampling. That is, whether range compression by tone mapping is applied before or after the multispectral image "M" is upsampled, the results often do not differ greatly. Accordingly, denoting the result of upsampling the luminance information "I_M" of the multispectral image "M" after tone-mapping it as "U∘R(I_M)", the approximation shown in equation (11) is considered to hold in the present embodiment.
  R∘U(I_M) ≈ U∘R(I_M)    ... (11)
 Reflecting equation (11) in equation (10) yields the following equation (12).
  y^mn_ij(R(I_Q)) ≈ μ^mn_ij × y^mn_ij(U∘R(I_M))    ... (12)
 As shown in equation (12), the image processing apparatus 100 according to the present embodiment uses the result "R(I_M)" of executing tone mapping on the luminance information "I_M" of the low-resolution multispectral image "M" to approximate the luminance information "y^mn_ij(R(I_Q))" that would be obtained by executing tone mapping on a high-resolution pan-sharpened image. Because the image processing apparatus 100 can thus avoid executing tone mapping processing on a high-resolution image (or its luminance information), the computational cost is reduced.
 As described above, the image processing apparatus 100 according to the present embodiment also extracts the local feature information (local feature amount) using the multispectral image "M", which has not been tone-mapped, and the panchromatic image "P" (step S3006 above). Because the multispectral image "M" has not been tone-mapped, its luminance distribution does not deviate greatly from that of the panchromatic image "P". That is, the image processing apparatus 100 can execute appropriate luminance matching using the luminance information of the panchromatic image "P" and the luminance information "I_M" of the multispectral image "M". As a result, the image processing apparatus 100 generates a pan-sharpened image having an appropriate gradation in which, for example, the per-pixel gradation change (contrast) derived from the high-gradation panchromatic image P is appropriately reflected.
 This will be explained using the concrete example shown in FIG. 7. FIG. 7 schematically illustrates an example of cumulative histograms obtained when the luminances of a multispectral image and a panchromatic image are normalized. The solid line A in FIG. 7 represents the cumulative frequency of the normalized luminance of the multispectral image; the dotted line B represents that of the tone-mapped multispectral image; and the dash-dot line C represents that of the panchromatic image. G1 in FIG. 7 schematically represents the luminance ratio between certain specific luminances when the panchromatic image is luminance-mapped against the multispectral image, and G2 schematically represents the corresponding luminance ratio when the panchromatic image is luminance-mapped against the tone-mapped multispectral image.
 In general, tone mapping compresses the luminance of a multispectral image into a predetermined luminance range centered on a particular average luminance. Consequently, the cumulative frequency of the (normalized) luminance of the tone-mapped multispectral image changes more steeply than that of the original multispectral image. Comparing G1 and G2, the luminance ratio represented by G2 is therefore smaller than the luminance ratio represented by G1, which is considered to indicate smaller local contrast in the image (for example, a smaller degree of luminance change at a given pixel). That is, if the luminance of the panchromatic image is matched against the luminance of a tone-mapped multispectral image, the local luminance contrast derived from the panchromatic image may not be appropriately reflected; for example, the local luminance contrast perceived by a person viewing the image may be reduced. By contrast, the image processing apparatus 100 according to the present embodiment performs luminance matching using the multispectral image that has not been tone-mapped and the panchromatic image, and can therefore generate a pan-sharpened image in which the local luminance contrast derived from the panchromatic image is appropriately reflected.
 As described above, the image processing apparatus 100 of this embodiment reduces the amount of data handled in the tone mapping process by executing the tone mapping process on the low-resolution multispectral image "M". The image processing apparatus 100 can thereby quickly create a tone-mapped pan-sharpened image having appropriate gradation.
 In summary, the image processing apparatus 100 of this embodiment performs pan-sharpening using local features derived from the panchromatic image, extracted based on the result of luminance matching between the panchromatic image and the multispectral image, together with global features extracted from the multispectral image. When performing the luminance matching that yields the local features, the image processing apparatus 100 uses the original (non-tone-mapped) multispectral image. When compressing the luminance range of the global features (the global luminance variation) by tone mapping, it uses the luminance information of the low-resolution multispectral image. The image processing apparatus 100 can thereby reduce the computational cost (amount of computation) compared with executing tone mapping after creating the high-resolution pan-sharpened image. Accordingly, the image processing apparatus 100 of this embodiment can efficiently generate a high-resolution image having appropriate gradation (for example, a pan-sharpened image) from a plurality of images having different resolutions (for example, a panchromatic image and a multispectral image).
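The overall flow summarized here can be sketched as below. This is an illustrative reconstruction, not the claimed algorithm itself: the luminance definition (band mean), the crude mean-based matching, the box smoothing, the `x/(1+x)` tone curve, and nearest-neighbour upsampling are all stand-ins chosen for brevity.

```python
import numpy as np

def upsample(a, s):
    # Nearest-neighbour upsampling by integer factor s (stand-in resampler).
    return np.repeat(np.repeat(a, s, axis=0), s, axis=1)

def box_smooth(a, k=3):
    # Naive box filter used as the smoothing step for the local feature.
    pad = k // 2
    p = np.pad(a, pad, mode='edge')
    out = np.zeros_like(a, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + a.shape[0], dx:dx + a.shape[1]]
    return out / (k * k)

def pansharpen(ms, pan, scale):
    """Local features from the ORIGINAL MS luminance; tone mapping applied
    only to the low-resolution luminance; output is their product."""
    ms_lum = ms.mean(axis=2)                         # low-res MS luminance
    # 1) Match pan against the non-tone-mapped MS luminance (crude stand-in),
    #    then take the ratio to a smoothed version as the local feature.
    pan_matched = pan * (ms_lum.mean() / pan.mean())
    local = pan_matched / (box_smooth(pan_matched) + 1e-6)
    # 2) Tone-map the LOW-RES luminance (cheap), then upsample the result.
    tm_up = upsample(ms_lum / (1.0 + ms_lum), scale)
    # 3) Output luminance = local feature x tone-mapped global luminance.
    out_lum = local * tm_up
    # 4) Recolor: scale each upsampled MS band by the luminance gain.
    ms_up = upsample(ms, scale)
    gain = out_lum / (ms_up.mean(axis=2) + 1e-6)
    return ms_up * gain[..., None]
```

The cost-saving point is visible in step 2: the tone curve touches only the small `ms_lum` array, never the full-resolution output.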
 <Modification of the First Embodiment>
 The following modifications of the image processing apparatus 100 according to the first embodiment of the present invention described above can be realized.
 The image processing apparatus 100 does not necessarily need to use, as the local feature, the luminance ratio between the pixel value of a particular pixel and the pixel value of the smoothed image, as exemplified in equation (6) above. The image processing apparatus 100 may adopt any processing capable of expressing the luminance value of the pan-sharpened image as the product of a local feature derived from the high-resolution panchromatic image and a global feature derived from the low-resolution multispectral image.
 For example, one known pan-sharpening technique decomposes the multispectral image into color components (a hue component and a luminance component), replaces the luminance component with that of the panchromatic image, and then recombines the components (synthesizes the hue component and the luminance component). The image processing apparatus 100 may employ such pan-sharpening. In this case, the image processing apparatus 100 may use, as the denominator of equation (6), the luminance of the upsampled multispectral image instead of the luminance of the smoothed image. Since the luminance of the upsampled multispectral image is multiplied by the local feature "μ" in equation (7), the luminance information of the panchromatic image is used as-is as the luminance information of the pan-sharpened image.
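This component-substitution variant can be sketched as follows. Equations (6) and (7) are not reproduced in this excerpt, so the ratio below merely plays the role the text assigns to "μ" (with the upsampled MS luminance as denominator); the band-mean luminance and the function name are assumptions for illustration.

```python
import numpy as np

def component_substitution(ms_up, pan_matched):
    """Scale every MS band by pan / MS-luminance, so the pan luminance is
    carried over unchanged into the result."""
    ms_lum = ms_up.mean(axis=2)            # hypothetical luminance definition
    ratio = pan_matched / (ms_lum + 1e-6)  # role of the local feature "mu"
    return ms_up * ratio[..., None]        # eq. (7)-style product per band
```

Because every band is multiplied by the same ratio, the luminance of the output equals `pan_matched` (up to the small regularizer), which is exactly the "replace the luminance component" behaviour described.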
 The image processing apparatus 100 may also use a color space in which the value of each band (spectral band) of the multispectral image is proportional to luminance (for example, the HCS method). Specifically, the image processing apparatus 100 may convert the color space of the multispectral image provided from the input apparatus 200 into such a color space. In this case, the color decomposition and synthesis processing in the image synthesis unit 107 can be simplified. A pan-sharpened image can be generated at high speed by multiplying the luminance of the multispectral image upsampled by the enlargement processing unit 101 by the output luminance adjustment information obtained by the output luminance calculation unit 106.
 <Second Embodiment>
 Next, a second embodiment of the present invention will be described with reference to Fig. 4. Fig. 4 is a block diagram illustrating a functional configuration of an image processing apparatus 400 according to the second embodiment of the present invention.
 As illustrated in Fig. 4, the image processing apparatus 400 of this embodiment includes a gradation-adjusted image generation unit (gradation-adjusted image generation means) 401 and a composite image generation unit (composite image generation means) 402. The gradation-adjusted image generation unit 401 and the composite image generation unit 402 may be communicably connected using any communication method. The components of the image processing apparatus 400 are described below.
 The gradation-adjusted image generation unit 401 generates a first gradation-converted image by converting the gradation of a first image having one or more pieces of color information. The first image may be, for example, a multispectral image. The gradation-adjusted image generation unit 401 may, for example, convert the luminance gradation of each pixel included in the first image (for example, a multispectral image). In this case, the gradation-adjusted image generation unit 401 may range-compress the gradation of the first image, for example, by executing tone mapping on the first image. The gradation-adjusted image generation unit 401 may also convert the resolution of the first image whose gradation has been converted, in addition to converting the gradation. The gradation-adjusted image generation unit 401 described above may correspond to, for example, the luminance extraction unit 102, the tone mapping processing unit 103, and the enlargement processing unit 104 in the first embodiment, but is not limited thereto.
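The range compression performed by the gradation-adjusted image generation unit can be illustrated with a simple global tone curve. The patent does not specify the curve, so a Reinhard-style operator is used here purely as a stand-in; the `key` parameter and the log-average exposure step are assumptions.

```python
import numpy as np

def tone_map_lowres(lum, key=0.18):
    """Compress the luminance range around the (log-)average luminance.
    Applied to the LOW-RES first-image luminance, so cost scales with the
    small input rather than the high-resolution output."""
    log_avg = np.exp(np.mean(np.log(lum + 1e-6)))  # scene average luminance
    scaled = key * lum / log_avg                   # re-expose around the average
    return scaled / (1.0 + scaled)                 # monotone compression into [0, 1)
```

Any monotone compressive curve would serve the role described in the text; the essential point is that it is applied before, not after, the resolution increase.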
 The composite image generation unit 402 adjusts the first image using feature information relating to the gradation of a second image, which has a higher resolution than the first image, and the gradation of the first gradation-converted image. Based on the result, the composite image generation unit 402 generates a composite image that has one or more pieces of color information and a resolution higher than at least that of the first image. The second image may be, for example, a panchromatic image having a higher resolution than the multispectral image. The feature information may be, for example, information generated based on the first image and the second image that represents local features of the second image. The feature information may correspond, for example, to the local feature information (local features) in the first embodiment, and the composite image may be the pan-sharpened image in the first embodiment.
 Specifically, the composite image generation unit 402 may, for example, extract, as the feature information, information representing local features of a second gradation-converted image obtained by converting the gradation (for example, the luminance gradation) of the second image to match the gradation of the first image. The feature information may be, for example, information having the same resolution as the second image. The composite image generation unit 402 may also, for example, convert the resolution of the first image. The composite image generation unit 402 may then generate the composite image by adjusting the gradation of the resolution-converted first image using, for example, the feature information and the first gradation-converted image. The composite image generation unit 402 may correspond to, for example, the local feature extraction unit 105, the output luminance calculation unit 106, and the image synthesis unit 107 in the first embodiment, but is not limited thereto.
 The image processing apparatus 400 configured as described above generates the first gradation-converted image by, for example, executing gradation conversion processing such as tone mapping on the first image. In this case, since the first image may be a low-resolution multispectral image, the computational cost required for the gradation conversion processing is relatively low.
 The image processing apparatus 400 also generates the composite image by adjusting the first image using the feature information relating to the gradation of the second image and the gradation of the first gradation-converted image. The second image may be, for example, a panchromatic image.
 The image processing apparatus 400 can generate a composite image having an appropriate gradation based on the gradation (for example, the luminance gradation) of the first image (for example, a multispectral image) or the second image (for example, a panchromatic image). This is because the image processing apparatus 400 generates the feature information based on, for example, the features contained in the high-resolution second image and the original first image before gradation conversion such as tone mapping, and then adjusts the first image based on that feature information and the gradation of the first gradation-converted image. Furthermore, since the image processing apparatus 400 does not need to execute gradation conversion processing directly on a high-resolution image, the computational cost required for generating the composite image can be reduced. That is, the image processing apparatus 400 can generate the composite image efficiently.
 As described above, the image processing apparatus 400 of this embodiment can efficiently generate a high-resolution image having appropriate gradation (for example, the composite image) from a plurality of images having different resolutions (for example, the first image and the second image).
 <Modification of the Second Embodiment>
 A modification of the second embodiment will now be described. In the following description, configurations similar to those of the second embodiment are given the same reference numerals, and detailed descriptions thereof are omitted.
 The image processing apparatus 400 of this modification may further include a feature information generation unit (feature information generation means) 403, as illustrated in Fig. 5. The gradation-adjusted image generation unit 401 and the composite image generation unit 402 may be the same as in the second embodiment.
 The feature information generation unit 403 extracts the gradations of the first and second images and generates the second gradation-converted image by converting the gradation of the second image to match the gradation range of the first image. The feature information generation unit 403 extracts, as the feature information, the degree of local gradation change in the generated second gradation-converted image. The feature information generation unit 403 may correspond to, for example, the local feature extraction unit 105 in the first embodiment. In this modification, the composite image generation unit 402 may generate the composite image using the feature information generated by the feature information generation unit 403.
 The feature information generation unit 403 of this modification can generate the second gradation-converted image by appropriately converting the gradation of the second image to match the gradation range of the first image, for example, by executing luminance matching. This is possible because the first image has not undergone gradation conversion processing such as tone mapping, so that, for example, the correlation between the gradation of the first image and the gradation of the second image is preserved. The feature information generation unit 403 of this modification then extracts the degree of local gradation change in the second gradation-converted image as the feature information.
 The composite image generation unit 402 can generate high-resolution luminance information (for example, the output luminance adjustment information described above) using the feature information, which represents local features of the high-resolution second gradation-converted image, and the first gradation-converted image, whose gradation has been converted (compressed) by tone mapping or the like. The composite image generation unit 402 then generates the composite image based on this high-resolution luminance information and the first image. Thus, the image processing apparatus 400 of this modification can generate a composite image having an appropriate gradation based on the gradations (for example, luminance gradations) of the multispectral image and the panchromatic image. Moreover, since the image processing apparatus 400 of this modification has the same configuration as that of the second embodiment, it provides the same effects as the image processing apparatus 400 of the second embodiment.
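The "degree of local gradation change" extracted by the feature information generation unit can be illustrated as a per-pixel ratio between a pixel's gradation and a windowed (global) mean. The window size and the edge padding are choices made here for illustration; the patent leaves the concrete smoothing operator open.

```python
import numpy as np

def local_feature(matched, window=5):
    """Per-pixel local feature of the second gradation-converted image:
    pixel gradation divided by a windowed mean (the global component)."""
    pad = window // 2
    p = np.pad(matched, pad, mode='edge')
    h, w = matched.shape
    smoothed = np.zeros((h, w))
    for dy in range(window):          # naive box filter over the window
        for dx in range(window):
            smoothed += p[dy:dy + h, dx:dx + w]
    smoothed /= window * window
    return matched / (smoothed + 1e-6)
```

On a region of uniform gradation the feature is approximately 1 (no local change); edges and textures yield values above or below 1, which is what multiplying this feature into the tone-mapped first-image luminance reintroduces at high resolution.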
 <Configuration of Hardware and Software Programs (Computer Programs)>
 A hardware configuration capable of realizing each of the embodiments described above is described below.
 In the following description, the image processing apparatuses (100, 400) described in the above embodiments are collectively referred to simply as the "image processing apparatus", and their components simply as the "components of the image processing apparatus".
 The image processing apparatus described in each of the above embodiments may be configured from one or more dedicated hardware devices. In that case, the components shown in the figures above (Figs. 1, 4, and 5) may be realized using hardware (such as an integrated circuit implementing the processing logic, or a storage device) that integrates some or all of them.
 When the image processing apparatus is realized by dedicated hardware, its components may be implemented using, for example, circuitry capable of providing the respective functions (for example, an integrated circuit such as an SoC (System on a Chip)). In this case, the data held by the components of the image processing apparatus may be stored in, for example, a RAM (Random Access Memory) area or flash memory area integrated in the SoC, or in a storage device (such as a semiconductor storage device) connected to the SoC. A well-known communication bus may be adopted as the communication line connecting the components of the image processing apparatus. The communication lines connecting the components are not limited to bus connections; the components may also be connected to one another peer-to-peer. When the image processing apparatus is configured from a plurality of hardware devices, the hardware devices may be communicably connected by any communication means (wired, wireless, or a combination thereof).
 The image processing apparatus or its components described above may also be configured from general-purpose hardware as illustrated in Fig. 6 and various software programs (computer programs) executed by that hardware. In this case, the image processing apparatus may be configured from any number of general-purpose hardware devices and software programs. That is, an individual hardware device may be assigned to each component of the image processing apparatus, or a plurality of components may be realized using a single hardware device.
 The arithmetic device 601 in Fig. 6 is an arithmetic processing device such as a general-purpose CPU (Central Processing Unit) or microprocessor. The arithmetic device 601 may, for example, read various software programs stored in the nonvolatile storage device 603 described later into the storage device 602 and execute processing according to those programs. For example, the components of the image processing apparatus in each of the above embodiments may be realized as software programs executed by the arithmetic device 601.
 The storage device 602 is a memory device, such as a RAM, that can be referenced by the arithmetic device 601, and stores software programs, various data, and the like. The storage device 602 may be a volatile memory device.
 The nonvolatile storage device 603 is a nonvolatile storage device such as a magnetic disk drive or a flash-memory-based semiconductor storage device. The nonvolatile storage device 603 can store various software programs, data, and the like.
 The network interface 606 is an interface device for connecting to a communication network; for example, a wired or wireless LAN (Local Area Network) connection interface device or a SAN (Storage Area Network) connection interface device may be adopted.
 The drive device 604 is, for example, a device that handles reading and writing of data to and from a recording medium 605 described later.
 The recording medium 605 is any recording medium capable of recording data, such as an optical disc, a magneto-optical disc, or a semiconductor flash memory.
 The input/output interface 607 is a device that controls input and output to and from external devices.
 The image processing apparatus (or its components) of the present invention, described by way of the above exemplary embodiments, may be realized by, for example, supplying the hardware device illustrated in Fig. 6 with software programs capable of realizing the functions described in the above embodiments. More specifically, the present invention may be realized, for example, by having the arithmetic device 601 execute the software programs supplied to the device. In this case, an operating system running on the hardware device, or middleware such as database management software, network software, or a virtual environment platform, may execute part of each process.
 Furthermore, the software programs may be recorded on the recording medium 605. In this case, the software programs may be configured to be stored in the nonvolatile storage device 603 via the drive device 604 as appropriate, at the shipping stage, operational stage, or the like of the image processing apparatus.
 In the above case, the various software programs may be supplied to the hardware by installing them in the device using an appropriate jig at the manufacturing stage before shipment or the maintenance stage after shipment. Alternatively, a procedure that is now common, such as downloading the programs from outside via a communication line such as the Internet, may be adopted.
 In such a case, the present invention can be regarded as being constituted by the code constituting the software program, or by a computer-readable recording medium on which that code is recorded. The recording medium is not limited to a medium independent of the hardware device; it includes a recording medium on which a software program transmitted over a LAN, the Internet, or the like has been downloaded and stored, or temporarily stored.
 The image processing apparatus described above may also be configured from a virtualized environment in which the hardware device illustrated in Fig. 6 is virtualized, together with various software programs (computer programs) executed in that virtualized environment. In this case, the components of the hardware device illustrated in Fig. 6 are provided as virtual devices in the virtualized environment. In this case as well, the present invention can be realized with the same configuration as when the hardware device illustrated in Fig. 6 is configured as a physical device.
 The present invention has been described above as applied to the exemplary embodiments described above. However, the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes or improvements can be made to these embodiments. In such cases, new embodiments incorporating such changes or improvements can also fall within the technical scope of the present invention. Furthermore, embodiments combining the above embodiments, or new embodiments incorporating such changes or improvements, can also fall within the technical scope of the present invention. This is clear from the matters described in the claims.
 This application claims priority based on Japanese Patent Application No. 2015-171875, filed on September 1, 2015, the entire disclosure of which is incorporated herein.
 DESCRIPTION OF SYMBOLS
 100 Image processing apparatus
 101 Enlargement processing unit
 102 Luminance extraction unit
 103 Tone mapping processing unit
 104 Enlargement processing unit
 105 Local feature extraction unit
 106 Output luminance calculation unit
 107 Image synthesis unit
 200 Input device
 201 Multispectral image input unit
 202 Panchromatic image input unit
 400 Image processing apparatus
 401 Gradation-adjusted image generation unit
 402 Composite image generation unit
 403 Feature information generation unit

Claims (13)

  1.  An image processing apparatus comprising:
     gradation-adjusted image generation means for generating a first gradation-converted image by converting the gradation of a first image having one or more pieces of color information; and
     composite image generation means for generating a composite image, which has one or more pieces of color information and a resolution higher than at least that of the first image, by adjusting the first image using feature information relating to the gradation of a second image having a higher resolution than the first image, the feature information being generated based on the first image and the second image, and the gradation of the first gradation-converted image.
  2.  The image processing apparatus according to claim 1, wherein the composite image generation means generates the composite image by adjusting the first image using the feature information, generated based on a second gradation-converted image obtained by converting the gradation of the second image to match the gradation range of the first image, and the gradation of the first gradation-converted image.
  3.  The image processing apparatus according to claim 2, wherein the gradation-adjusted image generation means generates the first gradation-converted image by compressing the gradation of the first image into a specific gradation range by applying tone mapping to the first image, and converting the gradation-compressed first image to the resolution of the second image.
  4.  The image processing apparatus according to claim 3, wherein the composite image generation means converts the resolution of the first image, to which the tone mapping has not been applied, to the resolution of the second image, and generates the composite image by adjusting the gradation of the resolution-converted first image using the feature information relating to the gradation of the second image and the gradation of the first gradation-converted image.
  5.  The image processing apparatus according to claim 4, further comprising feature information generation means for extracting the gradations of the first and second images, generating the second gradation-converted image by converting the gradation of the second image to match the gradation range of the first image, and extracting, as the feature information, the degree of local gradation change in the generated second gradation-converted image, wherein the composite image generation means generates the composite image using the feature information generated by the feature information generation means.
  6.  The image processing apparatus according to claim 5, wherein, for each pixel included in the second gradation-converted image, the feature information generation means generates the feature information, which is a local feature amount for that pixel, based on the gradation of the pixel and a global feature amount calculated from the gradations of a plurality of pixels including that pixel.
  7.  The image processing apparatus according to claim 5, wherein, for each pixel included in the second gradation-converted image, the feature information generation means generates the feature information, which is a local feature amount for that pixel, based on the gradation of the pixel and a global feature amount calculated by smoothing the second gradation-converted image.
  8.  The image processing apparatus according to claim 5, wherein the feature information generation means decomposes the gradation of each pixel included in the second gradation-converted image into a global feature amount representing a global characteristic of the luminance of the second gradation-converted image and a local feature amount representing a local characteristic of that luminance, and calculates the local feature amount as the feature information based on the luminance of the pixel and the global feature amount.
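Claims 7 and 8 can be read together as decomposing each pixel's luminance into a global component (a smoothed image) and a local component (the ratio of the pixel's luminance to its smoothed value), so that luminance = global × local per pixel. The box filter below is an illustrative stand-in for whatever smoothing the actual implementation uses, and the function names are invented:

```python
import numpy as np

def box_smooth(img, k=3):
    """Global feature amount: mean over a k x k neighbourhood (edge-padded)."""
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img, dtype=np.float64)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def local_feature(img, eps=1e-6):
    """Local feature amount: luminance divided by its global (smoothed)
    component, so that luminance == global * local for every pixel."""
    g = box_smooth(img)
    return img / (g + eps)

pan_matched = np.array([[10.0, 10.0, 10.0],
                        [10.0, 40.0, 10.0],
                        [10.0, 10.0, 10.0]])
feat = local_feature(pan_matched)
print(feat[1, 1] > 1.0)  # the bright centre pixel stands out locally
```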
  9.  The image processing apparatus according to any one of claims 5 to 8, wherein the composite image generation means adjusts the gradation of each pixel included in an image obtained by converting the first image to the resolution of the second image, using a value calculated from the feature information calculated for each pixel included in the second gradation-converted image and the gradation of each pixel included in the first gradation-converted image.
  10.  The image processing apparatus according to any one of claims 5 to 8, wherein the composite image generation means generates the composite image by replacing the gradation of each pixel included in an image obtained by converting the first image to the resolution of the second image with a value obtained by multiplying the feature information calculated for each pixel included in the second gradation-converted image by the gradation of the corresponding pixel included in the first gradation-converted image.
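The multiplication in claim 10 then amounts to injecting the local detail derived from the second image into the upsampled first image. A minimal per-band sketch under the same assumptions as above (nearest-neighbour upsampling, names invented for illustration):

```python
import numpy as np

def pansharpen_band(ms_band, local_feat, factor):
    """Upsample one band of the first image to the second image's
    resolution, then replace each pixel's gradation with its value
    multiplied by the local feature for that pixel."""
    up = np.repeat(np.repeat(ms_band, factor, axis=0), factor, axis=1)
    return up * local_feat

ms_band = np.array([[100.0, 200.0]])            # 1 x 2 low-resolution band
local_feat = np.array([[1.0, 1.2, 0.8, 1.0],
                       [1.0, 0.9, 1.1, 1.0]])   # 2 x 4 detail ratios
print(pansharpen_band(ms_band, local_feat, 2))
```

Because the local feature is a ratio close to 1, the multiplied result keeps each band's overall gradation while adding high-resolution variation.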
  11.  An image processing method comprising:
     generating a first gradation-converted image by converting the gradation of pixels included in a first image having one or more pieces of color information; and
     generating a composite image that has one or more pieces of color information and a resolution at least higher than that of the first image, using the first gradation-converted image, the first image, and feature information relating to the gradation of a second image, the feature information being generated based on the first image and the second image, which has a higher resolution than the first image.
  12.  A recording medium storing an image processing program that causes a computer to execute:
     a process of generating a first gradation-converted image by converting the gradation of pixels included in a first image having one or more pieces of color information; and
     a process of generating a composite image that has one or more pieces of color information and a resolution at least higher than that of the first image, using the first gradation-converted image, the first image, and feature information relating to the gradation of a second image, the feature information being generated based on the first image and the second image, which has a higher resolution than the first image.
  13.  An image processing system comprising:
     the image processing apparatus according to any one of claims 1 to 12;
     an input device that accepts as inputs a multispectral image and a panchromatic image having a higher resolution than the multispectral image, provides the multispectral image to the image processing apparatus as the first image, and provides the panchromatic image to the image processing apparatus as the second image; and
     an output device that outputs the composite image generated by the image processing apparatus.
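Putting the claimed system together, the flow from input device through image processing apparatus to output device might look like the following toy pipeline. Every function name, the 2× upsampling factor, and the crude global-mean decomposition are illustrative assumptions, not the patented implementation:

```python
import numpy as np

def process(ms, pan, factor=2, eps=1e-6):
    """Toy pansharpening pipeline: match the pan image to the MS gradation
    range, derive a local feature by dividing by a global component, and
    multiply it into the upsampled MS band."""
    # Input stage: accept MS (low-res) and pan (high-res) images.
    pan = (pan - pan.min()) / (pan.max() - pan.min() + eps) \
        * (ms.max() - ms.min()) + ms.min()
    global_feat = np.full_like(pan, pan.mean())   # crude global component
    local_feat = pan / (global_feat + eps)        # local component (ratio)
    up = np.repeat(np.repeat(ms, factor, axis=0), factor, axis=1)
    return up * local_feat                        # composite image

ms = np.array([[50.0, 100.0]])
pan = np.array([[10.0, 20.0, 30.0, 40.0],
                [40.0, 30.0, 20.0, 10.0]])
out = process(ms, pan)
print(out.shape)  # (2, 4): pan resolution, MS colour scale
```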
PCT/JP2016/003969 2015-09-01 2016-08-31 Image processing device, image processing method, and recording medium whereon image processing program is stored WO2017038090A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017537550A JP6744317B2 (en) 2015-09-01 2016-08-31 Image processing apparatus, image processing method, and image processing program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-171875 2015-09-01
JP2015171875 2015-09-01

Publications (1)

Publication Number Publication Date
WO2017038090A1 true WO2017038090A1 (en) 2017-03-09

Family

ID=58187154

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/003969 WO2017038090A1 (en) 2015-09-01 2016-08-31 Image processing device, image processing method, and recording medium whereon image processing program is stored

Country Status (2)

Country Link
JP (1) JP6744317B2 (en)
WO (1) WO2017038090A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010511258A (en) * 2006-12-01 2010-04-08 ハリス コーポレイション Structured smoothing for super-resolution of multispectral images based on aligned panchromatic images
JP2009047516A (en) * 2007-08-17 2009-03-05 Pasuko:Kk Image formation method and program for reading feature information
JP2011100426A (en) * 2009-11-09 2011-05-19 Iwate Univ Image processing device and method
JP2013109759A (en) * 2011-11-18 2013-06-06 Mitsubishi Electric Corp Method for pan-sharpening panchromatic image and multispectral image using dictionary

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NAOKO TSUKAMOTO ET AL.: "Fusion of Satellite Imagery Using IHS Transform", IEICE TECHNICAL REPORT, vol. 111, no. 431, 2 February 2012 (2012-02-02), pages 123 - 124, XP011332359 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019026408A1 (en) * 2017-08-02 2019-02-07 日本電気航空宇宙システム株式会社 Image conversion device, image processing device, image processing method, and non-transient computer-readable medium storing image processing program
JPWO2019026408A1 (en) * 2017-08-02 2020-07-02 日本電気航空宇宙システム株式会社 Image conversion device, image processing device, image processing method, and image processing program
JP7264483B2 (en) 2017-08-02 2023-04-25 日本電気航空宇宙システム株式会社 Image conversion device, image processing device, image processing method, and image processing program

Also Published As

Publication number Publication date
JP6744317B2 (en) 2020-08-19
JPWO2017038090A1 (en) 2018-06-21

Similar Documents

Publication Publication Date Title
Kang et al. Pansharpening with matting model
US8059911B2 (en) Depth-based image enhancement
González-Audícana et al. A low computational-cost method to fuse IKONOS images using the spectral response function of its sensors
EP2102815B1 (en) Method of sharpening using panchromatic pixels
US9002127B2 (en) Image dynamic range compression system, method and program
EP2833317B1 (en) Image display device and/or method therefor
US9870600B2 (en) Raw sensor image and video de-hazing and atmospheric light analysis methods and systems
JP6012408B2 (en) Pan-sharpening panchromatic and multispectral images using dictionaries
US9129403B2 (en) Method and system for generating enhanced images
KR101821285B1 (en) Apparatus and method for thermal image enhancement
US20100008574A1 (en) Image processing method
JP4498361B2 (en) How to speed up Retinex-type algorithms
CN109074637B (en) Method and system for generating an output image from a plurality of respective input image channels
JP2004127064A (en) Image processing method, image processor, image processing program and image recording device
JP5235759B2 (en) Image processing apparatus, image processing method, and program
Hnatushenko et al. PANSHARPENING TECHNOLOGY OF HIGH RESOLUTION MULTISPECTRAL AND PANCHROMATIC SATELLITE IMAGES.
JP2018036960A (en) Image similarity level calculation device, image processing device, image processing method and recording medium
Ghimire et al. Color image enhancement in HSV space using nonlinear transfer function and neighborhood dependent approach with preserving details
JP2005196270A (en) Image processing method, image processing equipment, and image processing program
JP6744317B2 (en) Image processing apparatus, image processing method, and image processing program
JP2015125498A (en) Pseudo colorization image processing system
JP4728411B2 (en) Method for reducing color bleed artifacts in digital images
JP7437921B2 (en) Image processing device, image processing method, and program
JP2005182232A (en) Luminance correcting device and method
Eerola et al. Full reference printed image quality: Measurement framework and statistical evaluation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16841124

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017537550

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16841124

Country of ref document: EP

Kind code of ref document: A1