EP2672691B1 - Image processing apparatus and image processing method
- Publication number
- EP2672691B1 (application EP13170923.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- luminance
- gradation
- data
- processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/40—Picture signal circuits
- H04N1/40093—Modification of content of picture, e.g. retouching
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/6027—Correction or control of colour gradation or colour contrast
Definitions
- the present invention relates to an image processing apparatus and an image processing method that add an effect such as a watercolor painting to digital image data.
- Image processing can be performed to add a taste of a watercolor painting to digital image data.
- Features of a watercolor painting may include the light colors typical of watercolors, a feeling of color bleeding, and, further, a rough line drawing that remains visible.
- “Photoshop Watercolor Effect tutorial: From a Photo to a Masterpiece in 5 Steps” http://www.photoble.com/photoshoptutorials/photoshop-watercolor-effect-tutorial, provides details on how to achieve such an effect manually.
- Japanese Patent Application Laid-Open No. 2008-242533 proposes a method that can express the color-bleeding feeling of paints. Further, in Japanese Patent Application Laid-Open No. 11-232441 , pixels are spatially grouped by similar color and color subtraction is performed within each group so that each group is represented by one same color. Thus, Laid-Open No. 11-232441 proposes a method that expresses the taste of a watercolor painting realized with a limited set of paints.
- the present invention is directed to an image processing apparatus and an image processing method that authentically depict a rough line drawing and implement image processing for generating a color of watercolor paintings having a color of an input image as a basic color.
- the present invention in its first aspect provides an image processing apparatus as specified in claims 1 to 11.
- the present invention in its second aspect provides an image processing method as specified in claim 12.
- FIGS. 1A and 1B are block diagrams illustrating an image processing apparatus according to a first exemplary embodiment.
- Figs. 2A to 2G are image graphics of an image after processing in each step of watercolor-like processing according to the first exemplary embodiment.
- Figs. 3A , 3B , and 3C are flowcharts illustrating an operation of the watercolor-like processing according to the first exemplary embodiment.
- an example image processing apparatus to which the present invention may be applied includes an image processing apparatus having an image pickup system, such as a digital camera, a scanner, or the like.
- the exemplary embodiment of the present invention is not limited thereto and may be applied to image data including color signals of a plurality of colors.
- the exemplary embodiment is not limited to any specific apparatus as long as the image processing apparatus is capable of processing the image data. That is, the image processing apparatus may be an information processing apparatus such as a PC or a portable information terminal, or an image forming apparatus such as a printer. The same applies to each exemplary embodiment described below.
- Fig. 1A illustrates a block diagram of a digital camera which is one example of an image processing apparatus 1 according to a first exemplary embodiment.
- Luminous flux from an object passes through an image forming optical system 101 including a lens, a diaphragm, and the like, is incident on an image pickup device 102, and photoelectrically converted to become an electrical signal, and thus, is output from the image pickup device 102.
- the image pickup device 102 for example, a single plate color image pickup device having a general primary-color color filter is used.
- the primary-color color filter is constituted by three types of color filters having transmission dominant wavelength ranges around 650 nm, 550 nm, and 450 nm, and captures color planes corresponding to the respective bands of red (R), green (G), and blue (B).
- the color filters are spatially arranged in a mosaic shape for each pixel. Since each pixel acquires intensity in a single color plane, a color mosaic image is output from the image pickup device 102.
- An A/D conversion unit 103 converts an analog electrical signal output from the image pickup device into digital image data. In the exemplary embodiment, image data of 12 bits is generated for each pixel at this time.
- a white balance processing unit 104 performs processing for setting a color of an area which is originally regarded as white in an image, to a white color. More specifically, gains which cause pixel values of colors R, G, and B of the area which needs to be white, to be the same value, are applied to the R, G, and B, respectively.
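The white balance step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the gains are normalized to the green channel, and the white-patch measurement (200, 180, 160) is a hypothetical value chosen for the example.

```python
def white_balance_gains(r_white, g_white, b_white):
    # Gains that make a patch known to be white come out neutral,
    # normalized so the green channel is left unchanged.
    return (g_white / r_white, 1.0, g_white / b_white)

def apply_gains(pixel, gains):
    return tuple(v * g for v, g in zip(pixel, gains))

# Hypothetical white-patch measurement (R, G, B) = (200, 180, 160)
gains = white_balance_gains(200, 180, 160)
balanced = apply_gains((200, 180, 160), gains)  # all channels become 180
```

Applying the same gains to every pixel of the image then shifts the overall color cast so that the reference area renders as gray.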
- a color interpolation unit 105 interpolates the image data output from the white balance processing unit 104 with respect to the color mosaic image of the R, G, and B to generate color image data having color information of the R, G, and B in all pixels.
- a matrix conversion unit 106 performs color conversion processing including color space conversion by a matrix operation.
- a gamma conversion unit 107 performs color conversion processing including gradation conversion by a gamma curve.
- color spaces of the R, G, and B are converted into color spaces of 8-bit luminance (Y) data and color difference (U, V) data, and output from the matrix conversion unit 106 as YUV data.
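The RGB-to-YUV conversion can be sketched with the BT.601 coefficients; note this is one common choice of matrix, as the patent does not specify the exact coefficients used by the matrix conversion unit 106.

```python
def rgb_to_yuv(r, g, b):
    # BT.601 full-range coefficients -- one common choice; the patent
    # does not specify the exact conversion matrix.
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

# Pure white carries no colour difference: U and V come out as zero.
y, u, v = rgb_to_yuv(255, 255, 255)
```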
- an image adjusting unit 108 performs processing for enhancing an appearance of an image.
- the above processing primarily includes image correction processing such as noise reduction by a low-pass filter, chroma enhancement and color correction by applying gains, and edge enhancement that extracts a high-frequency component by a high-pass filter and emphasizes the extracted high-frequency component.
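The two filtering operations named above can be sketched in one dimension. The 3-tap [1, 2, 1]/4 kernel and the unit gain are illustrative assumptions; edge enhancement is shown as unsharp masking (original plus scaled high-frequency residual).

```python
def low_pass(signal):
    # 3-tap [1, 2, 1]/4 smoothing with edge replication (noise reduction).
    n = len(signal)
    return [(signal[max(i - 1, 0)] + 2 * signal[i] + signal[min(i + 1, n - 1)]) / 4
            for i in range(n)]

def edge_stress(signal, gain=1.0):
    # Unsharp masking: the high-frequency part (signal - low_pass)
    # is scaled and added back to emphasize edges.
    low = low_pass(signal)
    return [s + gain * (s - l) for s, l in zip(signal, low)]

row = [10, 10, 10, 50, 50, 50]
sharpened = edge_stress(row)   # overshoot appears around the step edge
```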
- the image adjustment method is not limited to the exemplified method.
- color adjustment depending on an image shooting condition 1099 or on the object is performed by balancing the matrix conversion unit 106, the gamma conversion unit 107, and the image adjusting unit 108 as a whole.
- a watercolor-like processing unit 109 performs watercolor-like image processing to be described below on the image data output from the image adjusting unit 108.
- a compression unit 110 compresses the image data output from the watercolor-like processing unit 109 by a predetermined method such as JPEG and a recording unit 111 records the compressed image data in a recording medium such as a flash memory.
- the image data output from the watercolor-like processing unit 109 is output also to a display unit 112 and displayed on a display medium such as liquid crystal.
- gamma processing or the like suitable for the display medium may be performed.
- the image data output from the watercolor-like processing unit 109 is output also to an external output unit 113 and output to an external apparatus connected to the image processing apparatus 1 by a wired or wireless method.
- the image data output from the image adjusting unit 108 is directly input into the compression unit 110, the display unit 112, or the external output unit 113 as indicated by a dashed line.
- a memory 121 stores the image data used in each processing unit, or image shooting information such as an F value of a diaphragm, a shutter speed, ISO sensitivity, a white balance gain value, and a set-up of a color gamut such as s-RGB.
- the stored data is appropriately read out and used by an instruction from a control unit 120.
- the control unit 120 controls each unit via a BUS line and appropriately performs required operation processing.
- An external operation such as an instruction by a user is input into the image processing apparatus via an interface I/F 122 and the control unit 120 receives the input operation to perform the operation or control each unit.
- a basic watercolor-like effect is implemented by mixing an edge component extracted from an input image as a luminance component and the input image with a gradated blurring component as a color component.
- the luminance component is not comprised solely of the extracted edge component.
- a low-frequency component is retained with respect to low-luminance data in some degree to achieve the watercolor-like effect in which natural gradation is reproduced from a bright part up to a dark part.
- the luminance (Y) data among the image data input from the image adjusting unit 108 is input into a small blurring image generating unit (luminance blurring unit) 1091 and an edge extracting unit 1094, and the color difference (UV) data is input into a large blurring image generating unit (color blurring unit) 1092.
- the small blurring image generating unit 1091 and the large blurring image generating unit 1092 generate a small blurring image and a large blurring image by gradation processing (smoothing processing).
- the blurring image is an image which is blurred with respect to an input image, that is, an image in which a higher-frequency component than a predetermined frequency is removed.
- the small blurring image generated by the small blurring image generating unit 1091 is blurred to a lesser degree than the large blurring image generated by the large blurring image generating unit 1092, and its high-frequency component remains.
- a gradation correcting unit 1093 performs gradation correction processing on the image data of the small blurring image generated by the small blurring image generating unit 1091.
- Several methods of the gradation correction processing are possible, but in the exemplary embodiment, correction by a one-dimensional look-up table (LUT) 1090 is performed in order to correct target luminance range gradation.
- An LUT selection unit 1097 selects the LUT 1090 based on an image shooting condition 1099 from LUTs 1098a to 1098c for each scene which has any one of features illustrated in Figs. 4A to 4C .
- the edge extracting unit 1094 generates an edge image from the small blurring image in which gradation is corrected, by using the input image.
- the input image may be just subtracted from the small blurring image. That is, the edge extracting unit 1094 is a subtraction circuit in the exemplary embodiment.
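The subtraction described above can be written directly; the sample values for the blurred row are hypothetical, chosen only to show that flat regions give zero and the neighborhood of a step edge gives non-zero values.

```python
def extract_edge(small_blur, original):
    # Edge image = (small blurring image) - (input image), per pixel.
    return [b - o for b, o in zip(small_blur, original)]

# Around a step edge the blurred value deviates from the original,
# so the difference is non-zero; flat regions give 0.
original = [0, 0, 100, 100]
blurred  = [0, 25, 75, 100]   # hypothetical small-blur result
edge = extract_edge(blurred, original)
```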
- a gradation inversion unit 1095 inverts gradation of the image.
- the gradation inversion unit 1095 is configured as a circuit that performs gradation processing with, for example, a general tone curve, and may perform gradation processing according to the tone curve, such as so-called negative-positive inversion.
- the edge component may be darkened and a non-edge component may be tinged with white.
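The simplest case of such inversion, a straight negative-positive flip of 8-bit luminance, can be sketched as below; the text allows a more general tone curve in place of the straight line.

```python
def invert_gradation(y, max_val=255):
    # Straight negative-positive inversion of 8-bit luminance;
    # a more general tone curve could be used instead.
    return max_val - y

# Bright edge values become dark strokes; the non-edge background
# (near zero) is tinged with white.
edge_values = [0, 30, 200]
line_drawing = [invert_gradation(v) for v in edge_values]
```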
- the watercolor-like processing unit 109 uses the edge image in which gradation is inverted by the gradation inversion unit 1095 as the luminance (Y) data, and the large blurring image generated from the input image by the large blurring image generating unit 1092 as the color (UV) data, to output one image data.
- the large blurring image output from the large blurring image generating unit 1092 is used for the color component in order to express watercolors that bleed and stray beyond the lines of the rough line drawing.
- the gradation correction processing performed by the gradation correcting unit 1093 in the exemplary embodiment will be described in more detail.
- an output image resulting when the gradation correcting unit 1093 does not perform the gradation correction processing is considered.
- the luminance data corresponding to the edge image acquired from (small blurring image) - (input image) is gradation-inverted to become luminance data of the final image. Therefore, the edge area is allocated to a low-luminance range and the non-edge area is allocated to a high-luminance range.
- Since a colored area is expressed at comparatively high luminance when the color data (color difference data) is added to this luminance data, it is possible to express the light color typical of a watercolor-like image.
- However, since the originally dark (low-luminance) area is also inverted and expressed at high luminance, a strongly uncomfortable impression may be evoked; for example, the pupil of a person's eye is inverted to look like the white of the eye.
- the gradation correction processing by the gradation correcting unit 1093 is performed on the luminance data of the small blurring image output from the small blurring image generating unit 1091.
- gradation correction is performed by increasing the output of the dark (low-luminance) part so that the component of the dark part remains in the edge image.
- the component of the area of a dark part having luminance lower than a predetermined value remains in the edge image output from the edge extracting unit 1094. That is, since an edge image (luminance data) including the high-frequency component and the low-luminance low-frequency component can be extracted from the luminance data of the input image, a watercolor-like image having a less uncomfortable feeling in the final image may be generated.
- the gradation correcting unit 1093 performs gradation correction by using an LUT having a characteristic 401 such as slightly increasing the output of the dark part.
- a horizontal axis represents an input signal and a vertical axis represents an output signal.
- the edge image output from the edge extracting unit 1094 at a subsequent stage can be made an edge image in which a certain amount of low-frequency component remains with respect to a dark image portion.
- As for the characteristic of the LUT, dark regions are preferably retained (at increased luminance) in the edge image output from the edge extracting unit 1094, as illustrated in any one of Figs. 4A to 4C , including a characteristic to be described below.
- As for the characteristic of the LUT, as long as the pixel value of the output signal is equal to or more than the pixel value of the input signal, the output of the dark part may be increased into a medium or even high luminance range. In that case, however, the luminance data output from the edge extracting unit 1094 includes more low-frequency components regardless of the luminance range, and the effect of the rough line drawing may be diminished.
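A dark-lifting LUT of this kind can be sketched as below. The linear fall-off and the knee point at input 128 are assumptions; only the lift fraction (~23% at the darkest input, per the standard characteristic 401) and the constraint that output is never less than input come from the text.

```python
def build_dark_lift_lut(lift=0.23, knee=128):
    # 1-D LUT that raises the output of the dark part.  `lift` is the
    # fractional boost at input 0 (~23% for the average characteristic);
    # the boost fades linearly to zero at `knee` (assumed shape) and the
    # curve is the identity above it.  Output is always >= input.
    lut = []
    for x in range(256):
        boost = lift * 255 * max(0.0, 1 - x / knee)
        lut.append(min(255, round(x + boost)))
    return lut

lut = build_dark_lift_lut()
```

Applying the table is then a per-pixel lookup on the luminance plane of the small blurring image.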
- Fig. 2 illustrates an image view of an image (data) after processing in each step of the watercolor-like processing in the watercolor-like processing unit 109.
- Fig. 2A illustrates a sample of an image constituted by YUV data which is output from the image adjusting unit 108 and input into the watercolor-like processing unit 109.
- Fig. 2B illustrates an image after gradation processing by the small blurring image generating unit 1091
- Fig. 2C illustrates an image after gradation processing by the large blurring image generating unit 1092.
- gradation processing is performed only on the color difference (UV) signal among the image signals; however, an image that also includes the luminance signal is illustrated so as to show the difference in the degree of gradation processing more clearly.
- the degree of the gradation processing for the color difference signal is set to be larger than the degree of the gradation processing for the luminance signal.
- Fig. 2D is an image graphic of the luminance data output from the gradation correcting unit 1093. In Fig. 2D , the dark part shown in the background is brighter than in Fig. 2A owing to the gradation correction processing that increases the output of the dark part.
- Fig. 2E is an image graphic of the luminance data output from the edge extracting unit 1094.
- a (non-zero luminance) value is given so that an edge of a flower petal is white, and a value is also given in the originally dark part of the image.
- Fig. 2F is an image graphic of the luminance data output from the gradation inversion unit 1095. Referring to Fig. 2F , the dark part, together with the edge part, also remains as a dark part in the output image from the edge extracting unit 1094.
- Fig. 2G is an image graphic of a final output image of the watercolor-like processing unit 109. As can be seen from Fig. 2G , the area of the dark part in the original image is expressed as the dark part together with the edge.
- luminance gradation remains in the dark part having a low-frequency component, so the final image does not become uniformly bright and dark parts are reproduced more naturally. Meanwhile, because the final image includes the low-frequency component, a photograph-like element may remain; the final image then retains realistic depiction, and its artistic effect may be lessened. Ideally, therefore, the luminance range in which the low-frequency component remains is controlled appropriately for each scene. Accordingly, in the exemplary embodiment, an LUT suitable for the shot scene is selected and applied as described below.
- Fig. 3 is a flowchart illustrating an overall operation of the watercolor-like processing performed by the watercolor-like processing unit 109 illustrated in Fig. 1B .
- Each operation of the flowchart is performed by the control unit 120 or by each unit under an instruction from the control unit 120.
- In step S301, the small blurring image generating unit 1091 and the large blurring image generating unit 1092 perform small blurring image generation processing on luminance data of an input image and large blurring image generation processing on color data of the input image, respectively.
- In step S302, the LUT 1090 used in the gradation correcting unit 1093 is selected and set by the LUT selection unit 1097.
- In step S303, the gradation correcting unit 1093 performs gradation correction processing according to the selected LUT.
- In step S304, the edge extracting unit 1094 performs the aforementioned edge extraction processing on the gradation-corrected luminance data.
- In step S305, the gradation inversion unit 1095 performs the aforementioned gradation inversion processing on the edge image output from the edge extracting unit 1094.
- In step S306, the final image data is generated, in which the luminance (Y) data output from the gradation inversion unit 1095 is set as the luminance data and the color (UV) data output from the large blurring image generating unit 1092 is set as the color data; the final image data is output as the output of the watercolor-like processing unit 109, and the process ends.
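The steps S301 to S306 can be sketched end to end as a small pipeline. The blur functions and the LUT are supplied by the caller here; the clamp to the 8-bit range and the use of identity functions in the demo are simplifications for illustration, not the patent's circuitry.

```python
def watercolor_like(y_data, uv_data, small_blur, large_blur, lut):
    # S301: small blur on luminance, large blur on colour difference
    y_small = small_blur(y_data)
    uv_large = large_blur(uv_data)
    # S302/S303: gradation correction with the selected 256-entry LUT
    y_corr = [lut[v] for v in y_small]
    # S304: edge extraction = (small blurring image) - (input image)
    edge = [c - o for c, o in zip(y_corr, y_data)]
    # S305: gradation inversion, clamped to the 8-bit range
    y_out = [max(0, min(255, 255 - e)) for e in edge]
    # S306: recombine the inverted edge luminance with blurred colour
    return y_out, uv_large

identity = lambda data: data          # degenerate "blur" for the demo
y_final, uv_final = watercolor_like([0, 128, 255], [5, 5, 5],
                                    identity, identity, list(range(256)))
```

With identity blurs and an identity LUT the edge image is zero everywhere, so inversion yields a uniformly white line-drawing plane, which matches the "non-edge component tinged with white" behavior described above.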
- step S301 of Fig. 3 will be described in detail with reference to the flowchart of Fig. 3B .
- shrinkage processing and enlargement processing are combined to generate a blurring image. More specifically, an image in which an information amount is reduced by performing the shrinkage processing is subsequently enlarged through interpolation, and as a result, the image becomes blurred.
- a shrinkage size of a smallest shrinkage image is set according to a target size of blurring.
- the large blurring image has a size corresponding to 1/16 of each side of the input image (the number of pixels is decreased to 1/16 on each of the length and the width).
- the input image is smoothed by applying a low-pass filter (LPF) with a filter coefficient [1, 2, 1] in the vertical direction and the horizontal direction before the shrinkage (step S3012).
- LPF low-pass filter
- the enlargement processing is repeated until the image returns to its original size.
- the twofold enlargement processing is also repeated in the vertical direction and the horizontal direction N times in steps S3015 to S3017 similarly to the shrinkage processing.
- side magnification in one shrinkage step is a factor of 1/2 here, but it may instead be a factor of 1/4 and is not limited thereto.
- the filter coefficient of the low-pass filter used at the same time needs to be changed appropriately in order to prevent moire from occurring. For example, when the side magnification is decreased to 1/4, the filter coefficient is set to [1, 4, 6, 4, 1].
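The shrink-then-enlarge blur of steps S3012 to S3017 can be sketched in one dimension: low-pass with [1, 2, 1], halve, repeat N times, then double back up by interpolation. The linear interpolation used for enlargement is an assumption; the patent only states that the shrunken image is enlarged through interpolation.

```python
def lpf_121(row):
    # [1, 2, 1]/4 low-pass with edge replication (anti-moire filter).
    n = len(row)
    return [(row[max(i - 1, 0)] + 2 * row[i] + row[min(i + 1, n - 1)]) / 4
            for i in range(n)]

def shrink_half(row):
    # One shrinkage step: low-pass, then drop every other sample.
    return lpf_121(row)[::2]

def enlarge_double(row):
    # One enlargement step: double the size by linear interpolation.
    out = []
    for i, v in enumerate(row):
        out.append(v)
        nxt = row[min(i + 1, len(row) - 1)]
        out.append((v + nxt) / 2)
    return out

def blur_by_resample(row, n_steps):
    # Blur via N shrink steps followed by N enlarge steps (1-D sketch).
    for _ in range(n_steps):
        row = shrink_half(row)
    for _ in range(n_steps):
        row = enlarge_double(row)
    return row
```

A larger `n_steps` corresponds to a smaller smallest shrinkage image and hence a larger degree of blurring, which is how the small and large blurring images differ.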
- the LUT suitable for a shot scene is selected and applied to the processing.
- gradation correction is performed by an LUT 3 having a characteristic 403 (for example, the output is increased at the darkest input portion by approximately 35%) which increases the output of pixel values below a predetermined luminance (the dark part) to a greater extent than for other scenes, as illustrated in Fig. 4C .
- gradation correction is performed by an LUT 2 having a characteristic 402 (for example, the output is increased at the darkest input portion by approximately 13%) which increases the output of pixel values below a predetermined luminance (the dark part) to a lesser extent than for other scenes, as illustrated in Fig. 4B .
- gradation correction is performed by an LUT 1 having a characteristic 401 (for example, the output is increased at the darkest input portion by approximately 23%) which increases the output of the dark part to an average extent, as illustrated in Fig. 4A .
- any LUT having a characteristic that performs gradation correction in which the output pixel value is equal to or more than the input pixel value may be appropriate.
- program priority (P), diaphragm priority (Av), time priority (Tv), manual (M) or the like is selectable by a mode dial or the like through an I/F 122 as an image shooting mode.
- a categorized image shooting mode that considers the kind of scene, such as a full-automatic mode, a portrait mode, a landscape mode, a sport mode, a macro mode, and a nightscape mode, is installed.
- In step S3021, the control unit 120, for example, reads out an image shooting mode M from the memory 121 as an image shooting condition 1099.
- an image shooting condition 1099 set at the time of shooting the input image is stored in the memory 121 and read out.
- the control unit 120 determines the read-out image shooting mode M and selects the corresponding LUT 1090 from the LUT 1098 for each scene stored in the memory 121.
- the LUT 1, the LUT 2, and the standard LUT 3 are selected in the case of the portrait mode, the landscape mode, and other modes, respectively.
- the LUT 1098 for each scene is stored in advance; as a result, calculation processing during image shooting is reduced, and high-speed continuous shooting may be performed without decreasing the image shooting frame rate.
- the present invention is not limited to the particular method described above. For example, image data generated by an apparatus other than the image processing apparatus and stored in the memory 121 may be processed. In this case, an image shooting condition 1099 stored in association with the image data, such as in a header, is read out. When no determinable image shooting condition 1099 is stored, the LUT 3 for the standard case is selected.
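Step S3022's mode-based selection can be sketched as a simple dictionary lookup with a standard fallback. The mapping follows the text of this embodiment (portrait → LUT 1, landscape → LUT 2, other or unknown → standard LUT 3); the flat dummy tables stand in for the stored LUT 1098 and are not real curves.

```python
def select_lut(shooting_mode, luts):
    # Step S3022: choose the preset LUT from the shooting mode; an
    # unknown or missing mode falls back to the standard LUT 3.
    table = {"portrait": "lut1", "landscape": "lut2"}
    return luts[table.get(shooting_mode, "lut3")]

# Dummy 256-entry tables standing in for the stored per-scene LUTs
luts = {"lut1": [1] * 256, "lut2": [2] * 256, "lut3": [3] * 256}
chosen = select_lut(None, luts)   # no shooting condition -> standard LUT 3
```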
- In step S3023, the control unit 120 sets the selected LUT 1090 in the gradation correcting unit 1093 and returns to the main processing.
- the luminance data including the high-frequency component and the low-luminance low-frequency component is extracted from the luminance data of the input image and the final image data is obtained by adding gradated color data to generate a watercolor-like image.
- the rough line drawing is authentically depicted and image processing for generating the watercolor painting with the color of the input image as a basic color may be implemented.
- the watercolor-like image may be generated in a way appropriate for each scene.
- In the first exemplary embodiment, when the gradation correcting unit 1093 performs gradation correction, an LUT for each scene is held as a preset and the LUT is selected according to the image shooting condition of the scene to correct gradation. In the second exemplary embodiment, by contrast, the scene is determined by analyzing the input image and the gradation-correction LUT is calculated.
- Fig. 5 is a block diagram illustrating the watercolor-like processing unit 109 in detail according to a second exemplary embodiment. Since processing details of a block having the same reference numeral as Fig. 1B are the same as Fig. 1B , a description of the block will be omitted.
- the second exemplary embodiment is different from the first exemplary embodiment in that an LUT calculation unit 501 is provided, an input image is analyzed by the LUT calculation unit 501 to determine a scene and an appropriate LUT 502 is calculated which corresponds to the determined scene.
- Fig. 6 is a flowchart illustrating an operating example of the LUT selection processing in step S302 of Fig. 3A according to the exemplary embodiment.
- Other overall watercolor-like processing is similar to the operation illustrated in Fig. 3 .
- In step S601, the control unit 120 and the LUT calculation unit 501 first perform portrait determination processing in order to calculate a face reliability which indicates the probability that the scene is a portrait scene.
- In step S602, the control unit 120 and the LUT calculation unit 501 perform landscape determination processing in order to calculate a landscape degree which indicates the probability that the scene is a landscape scene.
- In step S603, an LUT suitable for the input image is calculated by using the determination results of steps S601 and S602. More specifically, when the face reliability is 100%, the LUT illustrated in Fig. 4C is set. Meanwhile, when the face reliability is 0%, the LUT illustrated in Fig. 4B is set. When the face reliability is between them, the LUT is interpolated from the LUTs illustrated in Figs. 4B and 4C according to the face reliability.
- Similarly, when the landscape degree is 100%, the LUT illustrated in Fig. 4A is set. Meanwhile, when the landscape degree is 0%, the LUT illustrated in Fig. 4B is set. When the landscape degree is between them, the LUT is interpolated from the LUTs illustrated in Figs. 4A and 4B according to the landscape degree.
- the LUT is set by prioritizing the face reliability.
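The interpolation of step S603 can be sketched as a per-entry linear blend of two LUTs, with face reliability given priority as stated above. The flat dummy tables stand in for the curves of Figs. 4A to 4C, and treating both probabilities as fractions in [0, 1] is an assumption of this sketch.

```python
def interpolate_lut(lut_lo, lut_hi, w):
    # Linear blend of two 256-entry LUTs; w = 0 gives lut_lo, 1 gives lut_hi.
    return [round((1 - w) * a + w * b) for a, b in zip(lut_lo, lut_hi)]

def choose_lut(face_reliability, landscape_degree, lut_4a, lut_4b, lut_4c):
    # Face reliability has priority; otherwise interpolate on the
    # landscape degree.  Both inputs are fractions in [0, 1].
    if face_reliability > 0:
        return interpolate_lut(lut_4b, lut_4c, face_reliability)
    return interpolate_lut(lut_4b, lut_4a, landscape_degree)

# Dummy flat tables standing in for the curves of Figs. 4A-4C
lut_4a, lut_4b, lut_4c = [60] * 256, [0] * 256, [100] * 256
mixed = choose_lut(0.5, 0.0, lut_4a, lut_4b, lut_4c)
```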
- the calculated LUT is set in the gradation correcting unit 1093 in step S3023 similarly as in the first exemplary embodiment.
- Fig. 6B is a flowchart illustrating an operation of the portrait determination processing in step S601.
- face detection is performed by a means such as face identification using a Haar-like feature amount (step S6011). Thereafter, the face reliability is calculated from the feature amount and the processing returns to the main processing (step S6012).
- a predetermined feature amount regarded as reliable is set in advance to represent a face reliability of 100%, and in actual processing, the face reliability normalized by this feature amount is calculated.
- Fig. 6C is a flowchart illustrating an operating example of the landscape determination processing in step S602.
- a method of identifying a face by analyzing a histogram of a color signal of an input image, or the like may also be used ( Fig. 8C ).
- RGB of an input signal is converted into signals in HSB color space (step S6021).
- a hue histogram is calculated for the H component of the input image (step S6022).
- An example of the histogram calculated as described above is illustrated in Fig. 7 . The histogram is analyzed to determine whether one or more peaks at or above a threshold (th) exist (step S6023). When no peak is detected ("NO" in S6023), the processing ends with a landscape degree of 0% (step S6028).
- a variable i is set presuming that the number of all detected peaks is N, in order to perform scene determination with respect to all of the detected peaks (step S6024).
- An initial value of the variable i is set to "1", and the variable i is incremented by "1" until it reaches "N".
- a green level G for the hue of the first peak is calculated (step S6025).
- When the condition that the hue H is within the range GR to GL is met, the green level G is 100%; as H moves away from this range, the green level G is decreased.
- When green level Gi > green level Gi-1, the green level G is updated to Gi.
- a blue level B regarding whether the hue of the peak is that of a blue sky is calculated (step S6026).
- When the corresponding hue condition is met, the blue level B is 100%; as H moves away from this condition, the blue level B is decreased.
- When blue level Bi > blue level Bi-1, the blue level B is updated to Bi.
- it is determined whether hues of all of the peaks have been examined (step S6027).
- the processing returns to step S6025 in order to examine the hue of the subsequent peak.
- the landscape degree L is calculated from the green level G and the blue level B by the equation below, and the processing returns to the main processing (S6028).
- L = (G% + B%) / 200%
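The landscape determination of steps S6022 to S6028 can be sketched as follows. The hue ranges for green and blue sky and the linear fall-off of the levels are assumptions (the patent leaves GR, GL, and the fall-off unspecified); the final value takes L as (G% + B%) / 200%, i.e. a fraction in [0, 1].

```python
def peak_hues(hist, bin_width, threshold):
    # Hues (bin centres) of histogram peaks at or above the threshold (th).
    return [i * bin_width + bin_width / 2
            for i, count in enumerate(hist) if count >= threshold]

def level_for(hue, lo, hi):
    # 100% inside [lo, hi]; falls off by 1% per degree outside (assumed).
    if lo <= hue <= hi:
        return 100.0
    dist = min(abs(hue - lo), abs(hue - hi))
    return max(0.0, 100.0 - dist)

def landscape_degree(hues, green_range=(90, 150), blue_range=(200, 250)):
    # Keep the best green level and best blue level over all peaks,
    # then L = (G% + B%) / 200% as reconstructed from the text.
    g = max((level_for(h, *green_range) for h in hues), default=0.0)
    b = max((level_for(h, *blue_range) for h in hues), default=0.0)
    return (g + b) / 200.0

# A green peak and a blue-sky peak -> landscape degree 1.0 (i.e. 100%)
degree = landscape_degree([120, 220])
```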
- the luminance data including the high-frequency component and the low-luminance low-frequency component is extracted from the luminance data of the input image and the final image data is obtained by adding gradated color data to generate a watercolor-like image.
- the rough line drawing is authentically depicted and image processing of generating the watercolor painting with the color of the input image as a basic color may be implemented.
- the input image is analyzed to determine the shot scene and the gradation correction is performed with an LUT which is optimized for each scene to retain an optimal dark part of the scene, so that the watercolor-like effect may be accurately produced.
- in the first and second exemplary embodiments, the watercolor-like image is generated by using the extracted edge component as the luminance component.
- in the third exemplary embodiment, by contrast, the luminance component is generated separately from the edge component.
- Fig. 8 is a block diagram of the watercolor-like processing unit 109 implementing watercolor-like processing according to a third exemplary embodiment. Since a block having the same reference numeral as in Fig. 1B performs the same processing as in Fig. 1B , a description of the block will be omitted.
- the exemplary embodiment is different from the first and second exemplary embodiments in that luminance data input into an edge extracting unit 803 is not gradation-corrected.
- the edge component output from the edge extracting unit is an almost pure high-frequency component which includes almost no low-frequency component.
- a gradation correcting unit 801 performs gradation correction so that, out of the luminance data of an input image, the low-frequency component below a predetermined luminance remains.
- a method of the gradation correction is performed by conversion using a one-dimensional LUT 802 similarly to the first and second exemplary embodiments.
- an LUT which follows a characteristic of 902 of Fig. 9 is referenced as the LUT 1
- an LUT which follows a characteristic of 903 is referenced as the LUT 2
- an LUT which follows a characteristic of 901 is referenced as an LUT 3.
- An addition unit 804 adds a luminance component (luminance data) and an edge component (edge data) after gradation correction is performed.
- the addition unit 804 will be described in detail with reference to Fig. 10 .
- An offset noise value and a gain value are read out from the memory 121 in advance.
- a subtracter 8041 subtracts a small offset noise value 2056 from the edge data received from the edge extracting unit 803.
- a small offset noise component can be removed. Since such small noise appears in the bright part after the gradation inversion processing is performed and is not appropriate for expressing the light transparence that is a feature of the watercolor-like image, this minute noise is desirably removed in this step.
- a multiplier 8042 multiplies a gain value 2057 by edge data 8045 in which the offset noise value is reduced.
- the intensity of the rough line drawing may be adjusted by the gain value.
- the edge data 8045 multiplied by the gain value and luminance data 8044 after gradation correction are added to acquire a final luminance component (luminance data).
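The subtract, multiply, and add path of the addition unit 804 can be sketched as follows. Note that 2056 and 2057 in the text are reference numerals in Fig. 10, not values, so the offset and gain used here are purely illustrative:

```python
import numpy as np

def combine_luminance(edge, y_corrected, offset_noise=2, gain=1.5):
    """Remove a small offset noise from the edge data (subtracter 8041),
    scale the result by a gain (multiplier 8042), and add it to the
    gradation-corrected luminance to obtain the final luminance component.
    offset_noise and gain are illustrative stand-ins for the stored values."""
    e = np.clip(edge.astype(np.int32) - offset_noise, 0, None)  # noise floor removed
    e = e * gain                                                # line-drawing intensity
    return np.clip(y_corrected.astype(np.int32) + e, 0, 255).astype(np.uint8)
```

Raising the gain strengthens the rough line drawing; raising the offset suppresses more of the minute noise.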
- the luminance data including the high-frequency component and the low-luminance low-frequency component is extracted from the luminance data of the input image and the final image data is obtained by adding gradated color data to generate a watercolor-like image.
- the rough line drawing is authentically depicted and image processing of generating the watercolor effect with the color of the input image as a basic color may be implemented.
- the edge component and the luminance component are separated from each other, so that an edge amount and a remaining amount of the low-frequency component may be independently adjusted. Meanwhile, a processing overhead increases as compared with the first and second exemplary embodiments. Therefore, it is effective that in a rear mounted liquid crystal monitor, an edge component having a small calculation amount is commonly used together with a luminance component to perform display in a simplified manner. On the other hand, at the time of recording, the edge component and the luminance component may be separated and used separately.
- a fourth exemplary embodiment is a method for acquiring an equivalent appearance (effect) when viewing images of different sizes, even when a plurality of image sizes can be set for a target image subjected to the watercolor-like processing.
- in the gradation (smoothing) processing of the image, when the image sizes (pixel counts) of input images are different from each other, the degree of blurring appears different even at the same shrinkage size.
- shrinkage sizes required to generate the small blurring image and the large blurring image are set appropriately according to the image size of the input image.
- Fig. 11 is a block diagram of an image processing apparatus according to a fourth exemplary embodiment and Fig. 12 is a flowchart of blurring image generation processing according to the fourth exemplary embodiment.
- a resize unit 112 is provided at a front side stage of the watercolor-like processing unit 109.
- Fig. 12 is different from the flowchart of the blurring image generation processing of Fig. 3B according to the first exemplary embodiment in that steps S1201 and S1202 are provided.
- step S1201 an image size of an input image is read out as an image shooting condition 1099 and in step S1202, the number of shrinkage times N1 in the small blurring image generating unit and the number of shrinkage times N2 in the large blurring image generating unit are set depending on the image size illustrated in Fig. 13 . Subsequent processing is performed similarly as in the first exemplary embodiment according to the set N1 and N2.
- one time shrinkage represents 1/2 shrinkage. As the image size is decreased, the number of shrinkage times is reduced.
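Fig. 13's correspondence table is not reproduced here, so the thresholds and counts below are hypothetical; what matters is that fewer 1/2-shrinkage steps are chosen for smaller inputs, with N1 < N2:

```python
def shrinkage_times(width, height):
    """Return (N1, N2): the number of 1/2-shrinkage steps for the small
    and large blurring image generating units, chosen from the input
    image size. Threshold values are illustrative stand-ins for Fig. 13."""
    long_side = max(width, height)
    if long_side >= 4000:
        return 4, 6
    if long_side >= 2000:
        return 3, 5
    return 2, 4
```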
- the luminance data including the high-frequency component and the low-luminance low-frequency component is extracted from the luminance data of the input image and the final image data is obtained by adding gradated color data to generate a watercolor-like image.
- the rough line drawing is authentically depicted and image processing of generating the watercolor image with the color of the input image as a basic color may be implemented.
- the number of shrinkage times when generating the blurring image is set according to the image size of the input image, so that the similar watercolor-like effect can be achieved even in viewing with the same image size a plurality of images having different image sizes.
- each block of the watercolor-like processing unit 109 has been described. However, since any operation of each block can be implemented by software, some or all of the operations of the watercolor-like processing unit 109 may be implemented by software processing. Further, similarly, some or all of operations of other blocks in the image processing apparatus of Fig. 1 may be implemented by the software processing.
- gradation correction is performed by the one-dimensional LUT in the gradation correcting units 1093 and 801.
- the method of the gradation correction is not limited thereto. So long as the gradation correction having the equivalent effect of the features illustrated in Figs. 4 and 9 is performed, the present invention is applicable.
- an output pixel value may be obtained by a calculation operation.
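For instance, the dark-part lift could be computed per pixel rather than looked up in a table. The curve below is an illustrative one satisfying the stated constraint (output pixel value equal to or more than the input pixel value), not the patent's exact characteristic:

```python
def dark_boost(y, boost=0.23):
    """out = y + boost * (255 - y): lifts dark pixels the most, leaves
    full luminance unchanged, and never outputs less than the input.
    The boost fraction is an illustrative assumption."""
    return min(255, round(y + boost * (255 - y)))
```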
- the YUV data divided into the luminance data and the color data, in other words, the data after development processing, is used as an example of the input image for the processing.
- the present invention is not limited thereto and may be applied also to image data including data of each color of RGB.
- the processing described in each exemplary embodiment may be performed.
- Embodiments of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions recorded on a storage medium (e.g., non-transitory computer-readable storage medium) to perform the functions of one or more of the above-described embodiment(s) of the present invention, and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment (s).
- the computer may comprise one or more of a central processing unit (CPU), micro processing unit (MPU), or other circuitry, and may include a network of separate computers or separate computer processors.
- the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
- the storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)TM), a flash memory device, a memory card, and the like.
Description
- The present invention relates to an image processing apparatus and an image processing method that add an effect such as a watercolor painting to digital image data.
- Image processing can be performed to add a taste of a watercolor painting to digital image data. Features of the watercolor paintings may include a light color typical of the watercolor painting, a feeling of color bleeding, and further, a remaining rough line drawing. "Photoshop Watercolor Effect Tutorial: From a Photo to a Masterpiece in 5 Steps", http://www.photoble.com/photoshoptutorials/photoshop-watercolor-effect-tutorial, provides details on how to achieve such an effect manually.
- In Japanese Patent Application Laid-Open No. 2008-242533
- Thus, the Laid-Open No. 2008-242533
- Japanese Patent Application Laid-Open No. 11-232441
- In the method of Japanese Patent Application Laid-Open No. 2008-242533 and No. 11-232441
- The present invention is directed to an image processing apparatus and an image processing method that authentically depict a rough line drawing and implement image processing for generating a color of watercolor paintings having a color of an input image as a basic color.
- The present invention in its first aspect provides an image processing apparatus as specified in claims 1 to 11.
- The present invention in its second aspect provides an image processing method as specified in claim 12.
- Further features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the attached drawings.
- The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
Figs. 1A and 1B are block diagrams illustrating an image processing apparatus according to a first exemplary embodiment. Figs. 2A to 2G are image graphics of an image after processing in each step of watercolor-like processing according to the first exemplary embodiment. Figs. 3A, 3B, and 3C are flowcharts illustrating an operation of the watercolor-like processing according to the first exemplary embodiment. -
Figs. 4A, 4B, and 4C are diagrams illustrating an input/output feature of an LUT used in gradation correction. -
Fig. 5 is a block diagram illustrating a watercolor-like processing unit according to a second exemplary embodiment. -
Figs. 6A, 6B, and 6C are flowcharts illustrating an operation of LUT selection processing according to the second exemplary embodiment. -
Fig. 7 is a diagram illustrating a histogram used in landscape determination according to the second exemplary embodiment. -
Fig. 8 is a block diagram illustrating a watercolor-like processing unit according to a third exemplary embodiment. -
Fig. 9 is a diagram illustrating an input/output feature of an LUT used in gradation correction according to the third exemplary embodiment. -
Fig. 10 is a block diagram illustrating an addition unit according to the third exemplary embodiment. -
Fig. 11 is a block diagram illustrating an image processing apparatus according to a fourth exemplary embodiment. -
Fig. 12 is a flowchart illustrating an operation of blurring image generation processing according to the fourth exemplary embodiment. -
Fig. 13 is a table illustrating an example in which the number of shrinkage times in generating a blurring image corresponds to the image size. - Various exemplary embodiments, features, and aspects of the invention will be described in detail below with reference to the drawings.
- Hereinafter, a first exemplary embodiment of the present invention will be described.
- In the exemplary embodiment, an example image processing apparatus to which the present invention may be applied includes an image processing apparatus having an image pickup system, such as a digital camera, a scanner, or the like. However, the exemplary embodiment of the present invention is not limited thereto and may be applied to any image data including color signals of a plurality of colors. The exemplary embodiment is not particularly limited to any specific apparatus as long as the image processing apparatus is capable of processing the image data. That is, the image processing apparatus may be an information processing apparatus such as a PC or a portable information terminal, or an image forming apparatus such as a printer. This is similarly applied to each exemplary embodiment described below.
-
Fig. 1A illustrates a block diagram of a digital camera which is one example of an image processing apparatus 1 according to a first exemplary embodiment. Luminous flux from an object passes through an image forming optical system 101 including a lens, a diaphragm, and the like, is incident on an image pickup device 102, and photoelectrically converted to become an electrical signal, and thus, is output from the image pickup device 102. As the image pickup device 102, for example, a single plate color image pickup device having a general primary-color color filter is used. - The primary-color color filter is constituted by three types of color filters having transmission dominant wavelength ranges around 650 nm, 550 nm, and 450 nm, and captures color planes corresponding to respective bands of red (R), green (G), and blue (B). In the single plate color image pickup device, the color filters are spatially arranged in a mosaic shape for each pixel. Since each pixel acquires intensity in a single color plane, a color mosaic image is output from the
image pickup device 102. An A/D conversion unit 103 converts an analog electrical signal output from the image pickup device into digital image data. In the exemplary embodiment, image data of 12 bits is generated for each pixel at this time. - A white
balance processing unit 104 performs processing for setting a color of an area which is originally regarded as white in an image, to a white color. More specifically, gains which cause pixel values of colors R, G, and B of the area which needs to be white, to be the same value, are applied to the R, G, and B, respectively. - A
color interpolation unit 105 interpolates the image data output from the whitebalance processing unit 104 with respect to the color mosaic image of the R, G, and B to generate color image data having color information of the R, G, and B in all pixels. - A
matrix conversion unit 106 performs color conversion processing including color space conversion by a matrix operation. A gamma conversion unit 107 performs color conversion processing including gradation conversion by a gamma curve. In the exemplary embodiment, through the processing carried out by the matrix conversion unit 106, color spaces of the R, G, and B are converted into color spaces of 8-bit luminance (Y) data and color difference (U, V) data, and output from the matrix conversion unit 106 as YUV data. - Further, an
image adjusting unit 108 performs processing for enhancing an appearance of an image. For example, the above processing primarily performs image correction processing such as noise reduction processing by a low-pass filter, chroma stressing and color correction processing by adding a gain, and edge stressing processing of extracting a high-frequency component by a high-pass filter, or the like and stressing the extracted high-frequency component. The image adjustment method is not limited to the exemplified method. - In addition, color adjustment depending on an
image shooting condition 1099 or the object is performed by acquiring a total balance using the matrix conversion unit 106, the gamma conversion unit 107, and the image adjusting unit 108. In the exemplary embodiment, when the shot image is set to an image shooting mode to perform watercolor-like processing, a watercolor-like processing unit 109 performs watercolor-like image processing to be described below on the image data output from the image adjusting unit 108. - A
compression unit 110 compresses the image data output from the watercolor-like processing unit 109 by a predetermined method such as JPEG and arecording unit 111 records the compressed image data in a recording medium such as a flash memory. - The image data output from the watercolor-
like processing unit 109 is output also to adisplay unit 112 and displayed on a display medium such as liquid crystal. In this case, gamma processing or the like suitable for the display medium may be performed. - The image data output from the watercolor-
like processing unit 109 is output also to anexternal output unit 113 and output to an external apparatus connected to theimage processing apparatus 1 by a wired or wireless method. - When the image data is set in a general image shooting mode in which the watercolor-like processing is not performed, the image data output from the
image adjusting unit 108 is directly input into thecompression unit 110, thedisplay unit 112, or theexternal output unit 113 as indicated by a dashed line. - A
memory 121 stores the image data used in each processing unit, or image shooting information such as an F value of a diaphragm, a shutter speed, ISO sensitivity, a white balance gain value, and a set-up of a color gamut such as s-RGB. The stored data is appropriately read out and used by an instruction from acontrol unit 120. - The
control unit 120 controls each unit via a BUS line and appropriately performs required operation processing. - An external operation, such as an instruction by a user is input into the image processing apparatus via an interface I/
F 122 and thecontrol unit 120 receives the input operation to perform the operation or control each unit. - Hereinafter, an image processing method of watercolor-like processing of the exemplary embodiment of the present invention, and a configuration of an image processing circuit implementing the image processing method will be described with reference to the block diagram illustrating details of the watercolor-
like processing unit 109 ofFig. 1B . In the exemplary embodiment, a basic watercolor-like effect is implemented by mixing an edge component extracted from an input image as a luminance component and the input image with a gradated blurring component as a color component. However, the luminance component is not comprised solely of the extracted edge component. A low-frequency component is retained with respect to low-luminance data in some degree to achieve the watercolor-like effect in which natural gradation is reproduced from a bright part up to a dark part. - First, the luminance (Y) data among the image data input from the
image adjusting unit 108 is input into a small blurring image generating unit (luminance blurring unit) 1091 and an edge extracting unit 1094, and the color difference (UV) data is input into a large blurring image generating unit (color blurring unit) 1092. The small blurring image generating unit 1091 and the large blurring image generating unit 1092 generate a small blurring image and a large blurring image by gradation processing (smoothing processing). Herein, the blurring image is an image which is blurred with respect to an input image, that is, an image in which a higher-frequency component than a predetermined frequency is removed. The small blurring image generated by the small blurring image generating unit 1091 is smaller in a degree of gradation than the large blurring image generated by the large blurring image generating unit 1092, and the high-frequency component remains. - Several methods of generating the blurring images are possible. For example, smoothing is vertically and laterally performed with a low-pass filter using a Gaussian filter coefficient. However, in order to implement a large blurring state anticipated in the watercolor-like processing in a single smoothing processing, a kernel size of the low-pass filter is increased, and as a result, a processing time becomes great. That is, it is not realistic to perform the processing on hardware of a camera. Therefore, in the exemplary embodiment, in order to shorten the processing time and acquire desired blurring, a shrinkage processing circuit and an enlargement processing circuit are combined with each other to generate the small blurring image and the large blurring image. A detailed operation associated with the blurring image generation processing will be described below by using the flowchart of
Fig. 3B . - A
gradation correcting unit 1093 performs gradation correction processing on the image data of the small blurring image generated by the small blurring image generating unit 1091. Several methods of the gradation correction processing are possible, but in the exemplary embodiment, correction by a one-dimensional look-up table (LUT) 1090 is performed in order to correct target luminance range gradation. An LUT selection unit 1097 selects the LUT 1090 based on an image shooting condition 1099 from LUTs 1098a to 1098c for each scene, which have any one of the characteristics illustrated in Figs. 4A to 4C. - The
edge extracting unit 1094 generates an edge image from the small blurring image in which gradation is corrected, by using the input image. Herein, in order to extract the edge component from the input image, the input image may be just subtracted from the small blurring image. That is, the edge extracting unit 1094 is a subtraction circuit in the exemplary embodiment. - An image corresponding to the difference between two images having different frequency bands may thus be generated as an edge image or component. In the edge image thus generated, the edge component has a value according to the intensity of the edge component and a low-frequency component which is not an edge has a signal close to 0 (8-bit range). In order to make the image have an appearance of a rough line drawing, a
gradation inversion unit 1095 inverts gradation of the image. The gradation inversion unit 1095 is configured of a circuit that performs gradation processing with, for example, a general tone curve and may perform the gradation processing according to the tone curve, such as performing so-called negative-positive inversion. By gradation inversion, the edge component may be darkened and a non-edge component may be tinged with white. - Thus, the watercolor-
like processing unit 109 uses the edge image in which gradation is inverted by thegradation inversion unit 1095 as the luminance (Y) data, and the large blurring image generated from the input image by the large blurringimage generating unit 1092 as the color (UV) data, to output one image data. - The large blurring image output from the large blurring
image generating unit 1092 is used for the color component to express watercolors straying from a line of the rough line drawing and bleeding. - Herein, the gradation correction processing performed by the
gradation correcting unit 1093 in the exemplary embodiment will be described in more detail. First, an output image resulting when thegradation correcting unit 1093 does not perform the gradation correction processing is considered. The luminance data corresponding to the edge image acquired from (small blurring image) - (input image) is gradation-inverted to become luminance data of the final image. Therefore, the edge area is allocated to a low-luminance range and the non-edge area is allocated to a high-luminance range. - Since a colored area is expressed at comparatively high luminance by adding color data (color difference data) to the luminance data, it is possible to express a light color typical of a watercolor-like image. However, since the area of the original dark (low-luminance) part is also inverted to be expressed at high luminance, a strong uncomfortable feeling may be evoked, for example, the pupil of the eye of a person is inverted to become a white eye.
- Accordingly, in the exemplary embodiment, the gradation correction processing by the
gradation correcting unit 1093 is performed on the luminance data of the small blurring image output from the small blurringimage generating unit 1091. At that time, gradation correction is performed by increasing the output of the dark (low luminance)part so that the component of the dark part remains in the edge image. As a result, the component of the area of a dark part having luminance lower than a predetermined value remains in the edge image output from theedge extracting unit 1094. That is, since an edge image (luminance data) including the high-frequency component and the low-luminance low-frequency component can be extracted from the luminance data of the input image, a watercolor-like image having a less uncomfortable feeling in the final image may be generated. - More specifically, as illustrated in
Fig. 4A , thegradation correcting unit 1093 performs gradation correction by using an LUT having a characteristic 401 such as slightly increasing the output of the dark part. Herein, a horizontal axis represents an input signal and a vertical axis represents an output signal. - By performing the gradation correction having the characteristic 401 of
Fig. 4A , the edge image output from theedge extracting unit 1094 at a subsequent stage can be made an edge image in which a certain amount of low-frequency component remains with respect to a dark image portion. - As a result of the characteristic of the LUT, dark regions are preferably retained (at increased luminance)in the edge image output from the
edge extracting unit 1094 as illustrated in any one ofFigs. 4A to 4C including a characteristic to be described below. However, as the characteristic of the LUT, as long as a pixel values of the output signal is equal to or more than the pixel value of the input signal, the output of the dark part may be increased to a medium or even high luminance range. Also in this case though, the luminance data output from theedge extracting unit 1094 includes more low-frequency components regardless of the luminance range and the effect of the rough line drawing may be diminished. -
Figs. 2A to 2G illustrate an image view of an image (data) after processing in each step of the watercolor-like processing in the watercolor-like processing unit 109. Fig. 2A illustrates a sample of an image constituted by YUV data which is output from the image adjusting unit 108 and input into the watercolor-like processing unit 109. Fig. 2B illustrates an image after gradation processing by the small blurring image generating unit 1091 and Fig. 2C illustrates an image after gradation processing by the large blurring image generating unit 1092. In Fig. 2C, gradation processing is performed on the color difference (UV) signal from among image signals; however, an image in which a luminance signal is also included is illustrated so as to show more clearly a difference in degree of the gradation processing. - As illustrated in
Figs. 2B and 2C , in the exemplary embodiment, the degree of the gradation processing for the color difference signal is set to be larger than the degree of the gradation processing for the luminance signal.Fig. 2D is an image graphic of the luminance data output from thegradation correcting unit 1093. InFig. 2D , a dark part shown in a background is brighter than inFig. 2A by the gradation correction processing of increasing the output of the dark part. -
Fig. 2E is an image graphic of the luminance data output from theedge extracting unit 1094. InFig. 2E , for example, a (non-zero luminance) value is given so that an edge of a flower petal is white and a value is given also in the original dark part of the image.Fig. 2F is an image graphic of the luminance data output from thegradation inversion unit 1095. Referring toFig. 2F , the dark part, together with the edge part, also remains as a dark part in the output image from theedge extracting unit 1094.Fig. 2G is an image graphic of a final output image of the watercolor-like processing unit 109. As can be seen fromFig. 2G , the area of the dark part in the original image is expressed as the dark part together with the edge. - As described above, in the final image acquired by performing the gradation correction processing by the characteristics illustrated in
Figs. 4A to 4C , luminance gradation remains in the dark part having a low-frequency component. Therefore, the final image does not become bright and dark parts are more naturally reproduced. Meanwhile, since the final image includes the low-frequency component, a photograph-like element may remain, so that the final image retains realistic depiction and the artistic effect of the final image may be lessened. Therefore, ideally, a luminance range in which the low-frequency component remains is appropriately controlled for each scene. Therefore, in the exemplary embodiment, an LUT suitable for a shot scene is selected and applied as described below. -
Fig. 3 is a flowchart illustrating an overall operation of the watercolor-like processing performed by the watercolor-like processing unit 109 illustrated inFig. 1B . Each operation of the flowchart is performed by thecontrol unit 120 or by each unit under an instruction from thecontrol unit 120. - In step S301, the small blurring
image generating unit 1091 and the large blurringimage generating unit 1092 perform small blurring image generation processing on luminance data of an input image and large blurring image generation processing on color data of the input image, respectively. - In step S302, the
LUT 1090 used in thegradation correcting unit 1093 is selected and set by theLUT selection unit 1097. In step S303, thegradation correcting unit 1093 performs gradation correction processing according to the selected LUT. - In step S304, the
edge extracting unit 1094 performs the aforementioned edge extraction processing from luminance data of which gradation is corrected. - In step S305, the
gradation inversion unit 1095 performs the aforementioned gradation inversion processing on the edge image output from theedge extracting unit 1094. - In step S306, the final image data is generated, in which the luminance (Y) data output from the
gradation inversion unit 1095 is set as the luminance data and the color (UV) data output from the large blurringimage generating unit 1092 is set as the color data, and the final image data is output as an output of the watercolor-like processing unit 109, and the process ends. - The blurring image generation processing of step S301 of
Fig. 3 will be described in detail with reference to the flowchart ofFig. 3B . As described above, in the blurring image generation processing, shrinkage processing and enlargement processing are combined to generate a blurring image. More specifically, an image in which an information amount is reduced by performing the shrinkage processing is subsequently enlarged through interpolation, and as a result, the image becomes blurred. - First, a shrinkage size of a smallest shrinkage image is set according to a target size of blurring. For example, the large blurring image has a size corresponding to 1/16 of each side of the input image (the number of pixels is decreased to 1/16 on each of the length and the width). When the input image is shrunk to the size of 1/16 of each side, 1/2 shrinkage is repeated in a vertical direction and a horizontal direction N times (N = 4) in steps S3011 to S3014. Herein, in order to prevent aliasing of a high-frequency component, so-called moire from occurring as a result of the shrinkage, the input image is smoothed by applying a low-pass filter (LPF) with a filter coefficient [1, 2, 1] in the vertical direction and the horizontal direction before the shrinkage (step S3012).
- When the shrinkage processing is repeated up to N times, the enlargement processing is performed until the input image has an original size. The twofold enlargement processing is also repeated in the vertical direction and the horizontal direction N times in steps S3015 to S3017 similarly to the shrinkage processing. In the exemplary embodiment, side magnification in one shrinkage is carried out by a factor of 1/2, but may be carried out by a factor of 1/4 and is not limited thereto. However, the filter coefficient of the low-pass filter used at the same time needs to be appropriately changed in order to prevent the moire from occurring. For example, when the side magnification is decreased to 1/4 the filter coefficient is set to [1, 4, 6, 4, 1].
- Further, the shrinkage size of the smallest shrinkage image for generating the large blurring image needs to be smaller than the shrinkage size of the smallest shrinkage image for generating the small blurring image. In step S301, the blurring image generation processing is performed with N = N1 in the small blurring
image generating unit 1091 and N = N2 (N1 < N2) in the large blurring image generating unit 1092 in parallel. - The LUT selection processing of step S302 of
Fig. 3A will be described in detail with reference to the flowchart of Fig. 3C. As described above, in the exemplary embodiment, an LUT suitable for the shot scene is selected and applied to the processing. For example, in a portrait scene such as an image of a person, gradation correction is performed by an LUT 3 having a characteristic 403 (for example, the output is increased at the darkest input portion by approximately 35%), which increases the output of pixel values (dark part) of less than a predetermined luminance to a greater extent than in other scenes, as illustrated in Fig. 4C. - In a landscape shooting scene, gradation correction is performed by an
LUT 2 having a characteristic 402 (for example, the output is increased at the darkest input portion by approximately 13%), which increases the output of pixel values (dark part) of less than a predetermined luminance to a lesser extent than in other scenes, as illustrated in Fig. 4B. - In other general scenes, gradation correction is performed by an
LUT 1 having a characteristic 401 (for example, the output is increased at the darkest input portion by approximately 23%), which increases the output of the dark part to an intermediate extent, as illustrated in Fig. 4A. However, broadly speaking, any LUT having a characteristic that performs gradation correction in which the output pixel value is equal to or greater than the input pixel value may be appropriate. - In the exemplary embodiment, program priority (P), aperture priority (Av), shutter speed priority (Tv), manual (M), or the like is selectable as an image shooting mode by a mode dial or the like through an I/F 122. Further, as image shooting modes more suitable for a beginner, categorized image shooting modes corresponding to kinds of scenes, such as a full automatic mode, a portrait mode, a landscape mode, a sport mode, a macro mode, and a nightscape mode, are installed. - In step S3021, the
control unit 120, for example, reads out an image shooting mode M from the memory 121 as an image shooting condition 1099. In the exemplary embodiment, since an image processing apparatus including an image shooting system is assumed, an image shooting condition 1099 set at the time of shooting the input image is stored in the memory 121 and read out. - Subsequently, in step S3022, the
control unit 120 determines the read-out image shooting mode M and selects the corresponding LUT 1090 from the LUTs 1098 for each scene stored in the memory 121. Among the image shooting modes, the LUT 3, the LUT 2, and the standard LUT 1 are selected in the case of the portrait mode, the landscape mode, and the other modes, respectively. The LUTs 1098 for each scene are stored in advance, and as a result, calculation processing during image shooting is reduced, and high-speed continuous shooting may be performed without decreasing the image shooting frame rate. The method of selecting the LUT is not limited to the particular method described above. For example, image data generated by an apparatus other than the image processing apparatus, which is stored in the memory 121, may be processed. In this case, a header or an image shooting condition 1099 which is stored in association with the image data is read out. In particular, when no determinable image shooting condition 1099 is stored, the standard LUT 1 is selected. - In step S3023, the
control unit 120 sets the selected LUT 1090 to the gradation correcting unit 1093, and the processing returns to the main processing. - As described above, in the exemplary embodiment, in the watercolor-like processing for adding a watercolor-like effect, the luminance data including the high-frequency component and the low-luminance low-frequency component is extracted from the luminance data of the input image and the final image data is obtained by adding gradated color data to generate a watercolor-like image. As a result, the rough line drawing is authentically depicted and image processing for generating the watercolor painting with the color of the input image as a basic color may be implemented.
- Further, in the exemplary embodiment, since gradation correction is performed in a manner suited to the relevant image shooting condition, a watercolor-like image appropriate for each scene may be generated.
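The dark-part-lifting characteristics 401 to 403 can be sketched as one-dimensional LUTs. The linear decay of the lift toward bright inputs is our own illustrative assumption: the text only specifies the approximate boost at the darkest input (read here as a fraction of the 8-bit output range) and that the output never falls below the input.

```python
import numpy as np

def make_dark_lift_lut(dark_boost):
    """256-entry gradation-correction LUT: the output at the darkest input
    is lifted by dark_boost * 255, and the lift decays linearly to zero at
    the brightest input, so the output is >= the input everywhere."""
    x = np.arange(256)
    lift = dark_boost * 255.0 * (1.0 - x / 255.0)
    return np.clip(x + lift, 0, 255).astype(np.uint8)

# Illustrative stand-ins for characteristics 401 (general, ~23%),
# 402 (landscape, ~13%) and 403 (portrait, ~35%).
LUT1 = make_dark_lift_lut(0.23)
LUT2 = make_dark_lift_lut(0.13)
LUT3 = make_dark_lift_lut(0.35)

def correct_gradation(lum, lut):
    """Per-pixel table lookup on 8-bit luminance data."""
    return lut[lum]
```

Applying `correct_gradation` to the blurred luminance plane with the scene-selected table corresponds to the operation of the gradation correcting unit 1093.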
- In the first exemplary embodiment, when the
gradation correcting unit 1093 performs gradation correction, the LUT for each scene is held as a preset, and the LUT is selected and the gradation is corrected according to the image shooting condition of the scene. In the second exemplary embodiment, by contrast, the scene is determined by analyzing the input image and the gradation correction LUT is calculated. -
Fig. 5 is a block diagram illustrating the watercolor-like processing unit 109 in detail according to the second exemplary embodiment. Since the processing details of a block having the same reference numeral as in Fig. 1B are the same as in Fig. 1B, a description of the block will be omitted. The second exemplary embodiment differs from the first exemplary embodiment in that an LUT calculation unit 501 is provided; the input image is analyzed by the LUT calculation unit 501 to determine the scene, and an appropriate LUT 502 corresponding to the determined scene is calculated. -
Fig. 6 is a flowchart illustrating an operating example of the LUT selection processing in step S302 of Fig. 3A according to the exemplary embodiment. The rest of the overall watercolor-like processing is similar to the operation illustrated in Fig. 3. - In step S601, the
control unit 120 and the LUT calculation unit 501 first perform portrait determination processing in order to calculate a face reliability, which indicates the probability that the scene is a portrait scene. In step S602, the control unit 120 and the LUT calculation unit 501 perform landscape determination processing in order to calculate a landscape degree, which indicates the probability that the scene is a landscape scene. - In step S603, an LUT suitable for the input image is calculated by using the determination results of steps S601 and S602. More specifically, when the face reliability is 100%, the LUT illustrated in
Fig. 4C is set. Meanwhile, when the face reliability is 0%, the LUT illustrated in Fig. 4B is set. When the face reliability is between them, the LUT is interpolated from the LUTs illustrated in Figs. 4B and 4C according to the face reliability. - When the landscape degree is 100%, the LUT illustrated in
Fig. 4B is set. Meanwhile, when the landscape degree is 0%, the LUT illustrated in Fig. 4A is set. When the landscape degree is between them, the LUT is interpolated from the LUTs illustrated in Figs. 4A and 4B according to the landscape degree. - Further, it is possible for both the face reliability and the landscape degree to be high to some extent; in this case, the LUT is set by prioritizing the face reliability. The calculated LUT is set in the
gradation correcting unit 1093 in step S3023, as in the first exemplary embodiment. - Next,
Fig. 6B is a flowchart illustrating an operation of the portrait determination processing in step S601. First, face detection is performed by a means such as face identification based on Haar-like feature amounts (step S6011). Thereafter, the face reliability is calculated from the feature amount, and the processing returns to the main processing (step S6012). A predetermined feature amount representing a face reliability of 100% is set in advance, and in the actual processing, the face reliability normalized by that feature amount is calculated. -
Fig. 6C is a flowchart illustrating an operating example of the landscape determination processing in step S602. A method of determining the scene by analyzing a histogram of a color signal of the input image, or the like, may also be used (Fig. 8C). - In
Fig. 6C, first, the RGB input signal is converted into signals in the HSB color space (step S6021). Subsequently, a hue histogram is calculated from the H component of the input image (step S6022). An example of the histogram calculated as described above is illustrated in Fig. 7; the histogram is analyzed to determine whether one or more peaks reach a threshold (th) (step S6023). When no peak is detected ("NO" in step S6023), the processing ends with a landscape degree of 0% (step S6028). - Meanwhile, when a peak is detected ("YES" in step S6023), a variable i is set, with N denoting the number of all detected peaks, in order to perform scene determination on every detected peak (step S6024). The initial value of the variable i is "1", and i is incremented by "1" until it reaches "N".
- Just after setting the variable i, a green level G indicating whether the hue of the first peak is green is calculated (step S6025). When the hue H is within the range GR to GL, the green level G is 100%, and the green level G decreases as the hue H moves away from this range. Here, when green level Gi > green level Gi-1, the green level G is updated to Gi.
- Subsequently, a blue level B indicating whether the hue of the peak corresponds to a blue sky is calculated (step S6026). When the hue H is within the range SL to SR, the blue level B is 100%, and the blue level B decreases as the hue H moves away from this range. Here, when blue level Bi > blue level Bi-1, the blue level B is updated to Bi.
Thereafter, it is determined whether the hues of all of the peaks have been examined (step S6027). When the hues of all of the peaks have not been examined ("NO" in step S6027), the processing returns to step S6025 in order to examine the hue of the next peak. When the hues of all of the peaks have been examined, the landscape degree L is calculated from the green level G and the blue level B by an equation below, and the processing returns to the main processing (step S6028). - Further, in addition to the above method, many alternative methods of analyzing and determining the scene of an image have been proposed. While these methods may also be used, a more detailed description is omitted here.
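The peak-scanning loop of steps S6025 to S6028 can be sketched as follows. The hue ranges (stand-ins for GR..GL and SL..SR, in degrees) and the final combining formula (here simply the maximum of the green and blue levels) are our own assumptions, since the patent's constants and its equation for L are not reproduced in the text above.

```python
GREEN_LO, GREEN_HI = 90.0, 150.0   # assumed green-foliage hue range
BLUE_LO, BLUE_HI = 200.0, 260.0    # assumed blue-sky hue range

def level_for_range(h, lo, hi, falloff=30.0):
    """100% when the hue h lies inside [lo, hi], decreasing linearly to 0%
    as h moves away from the range, as the text describes for G and B."""
    if lo <= h <= hi:
        return 100.0
    dist = (lo - h) if h < lo else (h - hi)
    return max(0.0, 100.0 * (1.0 - dist / falloff))

def landscape_degree(peak_hues):
    """Scan every detected histogram peak, keeping the best green level G
    and blue level B seen so far (the Gi/Bi update rule), then combine
    them (an assumed combination; the patent's equation is not shown)."""
    g = b = 0.0
    for h in peak_hues:
        g = max(g, level_for_range(h, GREEN_LO, GREEN_HI))
        b = max(b, level_for_range(h, BLUE_LO, BLUE_HI))
    return max(g, b)
```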
- As described above, according to the exemplary embodiment, in the watercolor-like processing for adding the watercolor-like effect, the luminance data including the high-frequency component and the low-luminance low-frequency component is extracted from the luminance data of the input image and the final image data is obtained by adding gradated color data to generate a watercolor-like image. As a result, the rough line drawing is authentically depicted and image processing of generating the watercolor painting with the color of the input image as a basic color may be implemented.
- Further, in the exemplary embodiment, the input image is analyzed to determine the shot scene and the gradation correction is performed with an LUT which is optimized for each scene to retain an optimal dark part of the scene, so that the watercolor-like effect may be accurately produced.
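The LUT interpolation of step S603 can be sketched as a per-entry linear blend between two preset tables. The linear formulation is an assumption: the text only states that an intermediate LUT is interpolated from the two endpoint LUTs according to the face reliability or landscape degree.

```python
import numpy as np

def blend_luts(lut_at_100, lut_at_0, degree):
    """Per-entry linear interpolation between two 256-entry LUTs.
    degree = 1.0 returns lut_at_100 (e.g. the Fig. 4C LUT for a face
    reliability of 100%), 0.0 returns lut_at_0, and intermediate
    degrees yield an intermediate gradation-correction table."""
    a = np.asarray(lut_at_100, dtype=np.float64)
    b = np.asarray(lut_at_0, dtype=np.float64)
    return np.rint(degree * a + (1.0 - degree) * b).astype(np.uint8)
```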
- In the first and second exemplary embodiments, the watercolor-like image is generated by using the extracted edge component as the luminance component. In a third exemplary embodiment, the luminance component is generated apart from the edge component.
-
Fig. 8 is a block diagram of the watercolor-like processing unit 109 implementing watercolor-like processing according to a third exemplary embodiment. Since a block having the same reference numeral as in Fig. 1B performs the same processing as in Fig. 1B, a description of the block will be omitted. - The exemplary embodiment differs from the first and second exemplary embodiments in that the luminance data input into an
edge extracting unit 803 is not gradation-corrected. For this reason, the edge component output from the edge extracting unit is an almost pure high-frequency component including almost no low-frequency component. - Meanwhile, a
gradation correcting unit 801 performs gradation correction so that, out of the luminance data of the input image, a low-frequency component lower than a predetermined luminance remains. The gradation correction is performed by conversion using a one-dimensional LUT 802, similarly to the first and second exemplary embodiments. In the exemplary embodiment, in order to obtain effects corresponding to the LUTs referenced in the first and second exemplary embodiments, an LUT following the characteristic 902 of Fig. 9 is referenced as the LUT 1, an LUT following the characteristic 903 as the LUT 2, and an LUT following the characteristic 901 as the LUT 3. - An
addition unit 804 adds a luminance component (luminance data) and an edge component (edge data) after gradation correction is performed. - The
addition unit 804 will be described in detail with reference to Fig. 10. An offset noise value and a gain value are read out from the memory 121 in advance. First, a subtracter 8041 subtracts a small offset noise value 2056 from the edge data received from the edge extracting unit 803. As a result, a small offset noise component can be removed. Since this small noise appears in the bright part after the gradation inversion processing is performed, and is not appropriate for expressing the light transparence that is a feature of the watercolor style, the minute noise is desirably removed in this step. - Subsequently, a multiplier 8042 multiplies a
gain value 2057 by the edge data 8045 from which the offset noise value has been subtracted. The intensity of the rough line drawing may be adjusted by the gain value. Thereafter, the edge data 8045 multiplied by the gain value and the gradation-corrected luminance data 8044 are added to acquire the final luminance component (luminance data).
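The offset-subtract, gain, and add chain of Fig. 10 can be sketched as below. The clamp to zero after the offset subtraction and the 0-255 output range are our own assumptions about the arithmetic.

```python
import numpy as np

def add_edge_to_luminance(edge, corrected_lum, offset_noise, gain):
    """Fig. 10 addition unit as described above: subtract the small offset
    noise value from the edge data (subtracter 8041), multiply by the gain
    value (multiplier 8042), then add the gradation-corrected luminance
    data (8044) to obtain the final luminance component."""
    e = np.maximum(edge.astype(np.float64) - offset_noise, 0.0)  # 8041
    e *= gain                                                    # 8042
    return np.clip(e + corrected_lum, 0.0, 255.0).astype(np.uint8)
```

Raising `gain` strengthens the rough line drawing; raising `offset_noise` removes more of the faint noise that would otherwise brighten after gradation inversion.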
- Further, in the exemplary embodiment, in generating the luminance component, the edge component and the luminance component are separated from each other, so that the edge amount and the remaining amount of the low-frequency component may be adjusted independently. On the other hand, the processing overhead increases as compared with the first and second exemplary embodiments. It is therefore effective, for display on a rear-mounted liquid crystal monitor, to use the edge component, which requires a small calculation amount, also as the luminance component and perform display in a simplified manner, while at the time of recording, the edge component and the luminance component may be separated and used individually.
- A fourth exemplary embodiment is a method for obtaining an equivalent appearance (effect) when viewing images of different sizes, even when a plurality of image sizes are set for the target image subjected to the watercolor-like processing. In the gradation processing of the image, when the image sizes (numbers of pixels) of input images differ from each other, the degree of blurring appears different even with the same shrinkage size. For example, when a recording image recorded on a recording medium and a display image displayed through the
display unit 112 on the back liquid crystal panel, which is a display medium, have different image sizes, the same effect is not obtained at the same viewing image size. Therefore, in the exemplary embodiment, the shrinkage sizes required to generate the small blurring image and the large blurring image are set appropriately according to the image size of the input image. -
Fig. 11 is a block diagram of an image processing apparatus according to the fourth exemplary embodiment, and Fig. 12 is a flowchart of the blurring image generation processing according to the fourth exemplary embodiment. As a feature of Fig. 11, a resize unit 112 is provided at a stage preceding the watercolor-like processing unit 109. Further, Fig. 12 differs from the flowchart of the blurring image generation processing of Fig. 3B according to the first exemplary embodiment in that steps S1201 and S1202 are provided. - In step S1201, an image size of the input image is read out as an
image shooting condition 1099, and in step S1202, the number of shrinkage times N1 in the small blurring image generating unit and the number of shrinkage times N2 in the large blurring image generating unit are set depending on the image size, as illustrated in Fig. 13. Subsequent processing is performed according to the set N1 and N2, similarly to the first exemplary embodiment. - In the correspondence table of the number of shrinkage times illustrated in
Fig. 13, one shrinkage represents a 1/2 shrinkage. As the image size decreases, the number of shrinkage times is reduced. - As described above, in the exemplary embodiment, in the watercolor-like processing of adding the watercolor-like effect, the luminance data including the high-frequency component and the low-luminance low-frequency component is extracted from the luminance data of the input image and the final image data is obtained by adding gradated color data to generate a watercolor-like image. As a result, the rough line drawing is authentically depicted and image processing of generating the watercolor image with the color of the input image as a basic color may be implemented.
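The size-dependent selection of N1 and N2 might look like the following. The thresholds and counts are illustrative stand-ins, since the actual values of the Fig. 13 table are not reproduced in the text; only the rules survive that one shrink halves each side, that smaller inputs use fewer shrinks, and that N1 < N2.

```python
def shrink_counts(longer_side_px):
    """Return (N1, N2): the number of 1/2-per-side shrinks for the small
    and large blurring images. Values are illustrative, not Fig. 13's."""
    if longer_side_px >= 4000:       # e.g. full-size recording image
        return 4, 6
    if longer_side_px >= 1600:       # mid-size image
        return 3, 5
    return 2, 4                      # e.g. back-panel display image
```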
- Further, in the exemplary embodiment, the number of shrinkage times used when generating the blurring images is set according to the image size of the input image, so that a similar watercolor-like effect can be achieved even when a plurality of images having different image sizes are viewed at the same viewing size.
- In each exemplary embodiment, a hardware configuration of each block of the watercolor-like processing unit 109 has been described. However, since the operation of each block can also be implemented by software, some or all of the operations of the watercolor-like processing unit 109 may be implemented by software processing. Similarly, some or all of the operations of the other blocks in the image processing apparatus of Fig. 1 may be implemented by software processing. - Further, in each exemplary embodiment, the example has been described in which gradation correction is performed by the one-dimensional LUT in the
gradation correcting units. However, as long as gradation correction having the characteristics illustrated in Figs. 4 and 9 is performed, the present invention is applicable. For example, an output pixel value may be obtained by a calculation operation. - Further, in each exemplary embodiment described above, the YUV data divided into the luminance data and the color data, in other words, the data after development processing, is used as an example of the input image for the processing. However, the present invention is not limited thereto and may also be applied to image data including data of each color of RGB. In this case, for example, after conversion into image data including the luminance data and the color data by a matrix operation, which is performed by the
matrix conversion unit 106, the processing described in each exemplary embodiment may be performed. - Embodiments of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions recorded on a storage medium (e.g., non-transitory computer-readable storage medium) to perform the functions of one or more of the above-described embodiment(s) of the present invention, and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more of a central processing unit (CPU), micro processing unit (MPU), or other circuitry, and may include a network of separate computers or separate computer processors. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
- While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures, and functions.
Claims (14)
- An image processing apparatus (1), comprising: an extraction means for extracting a high-frequency component and a low-luminance low-frequency component from luminance data of an input image, the extraction means including: a luminance blurring means (1091) for performing gradation processing to smooth data on the luminance data of the input image; a gradation correcting means (1093) for performing gradation correction processing on the luminance data output from the luminance blurring means; a subtraction means (1094) for subtracting the luminance data of the input image from the luminance data output from the gradation correcting means; and an inversion means (1095) for inverting a pixel value of the luminance data output from the subtraction means; a color blurring means (1092) for performing gradation processing to smooth data on color data of the input image; and an output means (113) for outputting image data in which the output from the extraction means is used as luminance data and the output from the color blurring means is used as color data, wherein
the gradation degree of the gradation processing performed on the color data of the input image by the color blurring means is higher than the gradation degree of the gradation processing performed on the luminance data of the input image by the luminance blurring means, thus extracting a high-frequency component from the luminance data. - The image processing apparatus (1) according to claim 1, wherein the gradation correcting means performs gradation correction such that an output pixel value is equal to or larger than a corresponding input pixel value.
- The image processing apparatus (1) according to claim 2, wherein the gradation correcting means performs gradation correction such that a gain added to a pixel value of luminance lower than predetermined luminance is higher than a gain added to a pixel value of luminance higher than predetermined luminance.
- The image processing apparatus (1) according to claim 1, wherein the luminance blurring means and the color blurring means change their degree of gradation according to an image size of the input image and make an output image from the output means viewable with an equivalent gradation degree when the output image is viewed with the same image size.
- The image processing apparatus (1) according to claim 1, wherein the luminance blurring means and the color blurring means increase the degree of gradation as the image size of the input image increases.
- The image processing apparatus (1) according to claim 1, wherein the gradation correcting means performs gradation correction based on an image capturing condition in which the input image is captured.
- The image processing apparatus (1) according to claim 1, further comprising: a determination means (1097) for determining a scene type in which the input image is captured,
wherein the gradation correcting means performs gradation correction based on a determination result by the determination means. - The image processing apparatus (1) according to claim 1, wherein the gradation correcting means performs gradation correction such that a gain applied to a pixel value of luminance lower than predetermined luminance is larger when the scene where the input image is captured is determined to be a portrait, than gains in other cases.
- The image processing apparatus (1) according to claim 1, wherein the gradation correcting means performs gradation correction such that a gain applied to a pixel value of luminance lower than predetermined luminance is smaller when the scene where the input image is captured is determined to be a landscape, than gains in other cases.
- The image processing apparatus (1) according to claim 1, wherein the gradation correcting means performs gradation correction on the luminance data output from the luminance blurring means by using a look-up table.
- The image processing apparatus (1) according to claim 1, wherein
the extraction means includes,
a luminance blurring means (1091) for performing gradation processing to smooth data on the luminance data of the input image,
a subtraction means (803) for subtracting the luminance data of the input image from the luminance data output from the luminance blurring means,
a gradation correcting means (801) for performing gradation correction processing such that luminance data of luminance lower than predetermined luminance is extracted from the luminance data of the input image, and
an addition means (804) for adding an output from the subtraction means to an output from the gradation correcting means. - An image processing method, comprising:extracting luminance data including a high-frequency component and a low-luminance low-frequency component, from luminance data of an input image;performing gradation processing (S301) to smooth data on color data of the input image;outputting image data in which luminance data extracted in the extraction step is used as luminance data and color data output in the gradation processing step is used as color data;performing gradation processing (S301) to smooth data on the luminance data of the input image, wherein the gradation degree of the gradation processing performed on the color data of the input image is higher than the gradation degree of the gradation processing performed on the luminance data;performing gradation correction processing (S303) on the luminance data output after gradation processing;subtracting (S304) the luminance data of the input image from the luminance data output after the gradation correction processing;inverting (S305) a pixel value of the luminance data output after the subtraction of luminance data.
- A computer program that when run on a computer causes the computer to carry out the method of claim 12.
- A storage medium storing a program executable by an information processing apparatus, wherein the information processing apparatus executing the program functions as the image processing apparatus (1) according to claim 1.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012130098A JP6004757B2 (en) | 2012-06-07 | 2012-06-07 | Image processing apparatus and image processing method |
Publications (3)
Publication Number | Publication Date |
---|---|
EP2672691A2 EP2672691A2 (en) | 2013-12-11 |
EP2672691A3 EP2672691A3 (en) | 2015-11-18 |
EP2672691B1 true EP2672691B1 (en) | 2018-11-07 |
Family
ID=48700274
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13170923.0A Active EP2672691B1 (en) | 2012-06-07 | 2013-06-06 | Image processing apparatus and image processing method |
Country Status (3)
Country | Link |
---|---|
US (1) | US9177396B2 (en) |
EP (1) | EP2672691B1 (en) |
JP (1) | JP6004757B2 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014155005A (en) * | 2013-02-07 | 2014-08-25 | Canon Inc | Display apparatus and control method of the same |
US10008011B1 (en) | 2014-11-26 | 2018-06-26 | John Balestrieri | Methods for creating a simulated watercolor-painted image from a source image |
JP6516510B2 (en) * | 2015-03-02 | 2019-05-22 | キヤノン株式会社 | Image processing apparatus, imaging apparatus, image processing method, image processing program, and storage medium |
CN108460825A (en) * | 2018-03-15 | 2018-08-28 | 中辰远瞻(北京)照明设计有限公司 | A kind of nightscape lighting scheme works figure production method |
US10853921B2 (en) | 2019-02-01 | 2020-12-01 | Samsung Electronics Co., Ltd | Method and apparatus for image sharpening using edge-preserving filters |
JP7469738B2 (en) | 2020-03-30 | 2024-04-17 | ブラザー工業株式会社 | Trained machine learning model, image generation device, and method for training machine learning model |
US20230336879A1 (en) * | 2020-08-24 | 2023-10-19 | Google Llc | Lookup table processing and programming for camera image signal processing |
JP2022150650A (en) * | 2021-03-26 | 2022-10-07 | キヤノン株式会社 | Printing control device, printing control method and program |
CN115423716B (en) * | 2022-09-05 | 2024-04-26 | 深圳市新弘途科技有限公司 | Image enhancement method, device, equipment and storage medium based on multidimensional filtering |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5063448A (en) * | 1989-07-31 | 1991-11-05 | Imageware Research And Development Inc. | Apparatus and method for transforming a digitized signal of an image |
JPH05336373A (en) * | 1992-06-04 | 1993-12-17 | Toshiba Corp | Image recorder |
US6456655B1 (en) * | 1994-09-30 | 2002-09-24 | Canon Kabushiki Kaisha | Image encoding using activity discrimination and color detection to control quantizing characteristics |
JP3700871B2 (en) * | 1995-11-11 | 2005-09-28 | ソニー株式会社 | Image converter |
US6727906B2 (en) * | 1997-08-29 | 2004-04-27 | Canon Kabushiki Kaisha | Methods and apparatus for generating images |
US6804392B1 (en) * | 2000-10-16 | 2004-10-12 | Eastman Kodak Company | Removing color aliasing artifacts from color digital images |
JP2004046583A (en) * | 2002-07-12 | 2004-02-12 | Ricoh Co Ltd | Image processing apparatus, method, program, and recording medium |
JP2008242533A (en) | 2007-03-24 | 2008-10-09 | Univ Of Fukui | Coloring drawing preparation device and method and program therefor |
JP5326943B2 (en) * | 2009-08-31 | 2013-10-30 | ソニー株式会社 | Image processing apparatus, image processing method, and program |
JP5589660B2 (en) * | 2010-08-12 | 2014-09-17 | 株式会社ニコン | Image processing apparatus, imaging apparatus, and image processing program |
JP5229360B2 (en) * | 2010-09-30 | 2013-07-03 | カシオ計算機株式会社 | Image processing apparatus, image data conversion method, print order receiving apparatus, program |
-
2012
- 2012-06-07 JP JP2012130098A patent/JP6004757B2/en not_active Expired - Fee Related
-
2013
- 2013-06-04 US US13/909,434 patent/US9177396B2/en not_active Expired - Fee Related
- 2013-06-06 EP EP13170923.0A patent/EP2672691B1/en active Active
Non-Patent Citations (1)
Title |
---|
None * |
Also Published As
Publication number | Publication date |
---|---|
US20130329991A1 (en) | 2013-12-12 |
US9177396B2 (en) | 2015-11-03 |
JP2013254390A (en) | 2013-12-19 |
EP2672691A2 (en) | 2013-12-11 |
EP2672691A3 (en) | 2015-11-18 |
JP6004757B2 (en) | 2016-10-12 |
Similar Documents
Publication | Title |
---|---|
EP2672691B1 (en) | Image processing apparatus and image processing method |
US7023580B2 (en) | System and method for digital image tone mapping using an adaptive sigmoidal function based on perceptual preference guidelines |
JP6415062B2 (en) | Image processing apparatus, image processing method, control program, and recording medium |
EP2624204B1 (en) | Image processing apparatus and method of controlling the same |
EP1930853A1 (en) | Image signal processing apparatus and image signal processing |
JP4666179B2 (en) | Image processing method and image processing apparatus |
JP5223742B2 (en) | Edge-enhanced image processing apparatus |
WO2015012040A1 (en) | Image processing device, image capture device, image processing method, and program |
JP7117915B2 (en) | Image processing device, control method, and program |
GB2549696A (en) | Image processing method and apparatus, integrated circuitry and recording medium |
KR20120016475A (en) | Image processing method and image processing apparatus |
JP2012165204A (en) | Signal processing apparatus, signal processing method, imaging apparatus, and imaging processing method |
US20160249029A1 (en) | Image processing apparatus, image pickup apparatus, image processing method, and non-transitory computer-readable medium |
US10051252B1 (en) | Method of decaying chrominance in images |
JP2018014646A (en) | Image processing apparatus and image processing method |
US9635331B2 (en) | Image processing apparatus that performs tone correction and edge enhancement, control method therefor, and storage medium |
CN106575434A (en) | Image processing device, image capturing device, image processing method, and program |
JP5202190B2 (en) | Image processing method and image processing apparatus |
JP2008305122A (en) | Image-processing apparatus, image processing method and program |
Brown | Color processing for digital cameras |
Adams et al. | Perceptually based image processing algorithm design |
GB2581434A (en) | Image processing apparatus, image processing method, program, and storage medium |
JP2020136928A (en) | Image processing apparatus, image processing method, and program |
US9055232B2 (en) | Image processing apparatus capable of adding soft focus effects, image processing method, and storage medium |
JP2012178191A (en) | Image processing apparatus and image processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A2 |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
PUAL | Search report despatched |
Free format text: ORIGINAL CODE: 0009013 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04N 1/60 20060101ALI20151001BHEP |
Ipc: H04N 1/40 20060101AFI20151001BHEP |
|
AK | Designated contracting states |
Kind code of ref document: A3 |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04N 1/60 20060101ALI20151009BHEP |
Ipc: H04N 1/40 20060101AFI20151009BHEP |
|
17P | Request for examination filed |
Effective date: 20160518 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20170523 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20180620 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
Ref country code: AT Ref legal event code: REF Ref document number: 1063491 Country of ref document: AT Kind code of ref document: T Effective date: 20181115 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602013046213 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20181107 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1063491 Country of ref document: AT Kind code of ref document: T Effective date: 20181107 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190207 |
Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190207 |
Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190307 |
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190307 |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190208 |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602013046213 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20190808 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 602013046213 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20190630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200101 |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190606 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190630 |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190630 |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190606 |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20130606 |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181107 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20230523 Year of fee payment: 11 |