WO2012099013A1 - Image correction device, image correction display device, image correction method, program, and recording medium - Google Patents


Info

Publication number
WO2012099013A1
Authority
WO
WIPO (PCT)
Prior art keywords
correction
target area
data
signal
color
Prior art date
Application number
PCT/JP2012/050582
Other languages
English (en)
Japanese (ja)
Inventor
張 小▲忙▼
上野 雅史
宮田 英利
Original Assignee
Sharp Corporation (シャープ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corporation (シャープ株式会社)
Publication of WO2012099013A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00: Details of colour television systems
    • H04N9/64: Circuits for processing colour signals
    • H04N9/643: Hue control means, e.g. flesh tone control
    • H04N1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46: Colour picture communication systems
    • H04N1/56: Processing of colour picture signals
    • H04N1/60: Colour correction or control
    • H04N1/62: Retouching, i.e. modification of isolated colours only or in isolated picture areas only
    • H04N1/628: Memory colours, e.g. skin or sky

Definitions

  • the present invention relates to an image correction apparatus that corrects image data, an image correction display apparatus, and an image correction method.
  • Japanese Patent Application Laid-Open No. 2004-228561 discloses a technique relating to an imaging device that corrects the hue of the color information signal corresponding to a human face area, out of the color information signals of an image, based on a calculated hue correction value.
  • FIG. 13 is a block diagram showing the main components of the imaging device 90 disclosed in Patent Document 1.
  • The imaging device 90 includes a face area extraction unit 91 that extracts a color information signal corresponding to a person's face area, a hue correction value calculation unit 92 that calculates a hue correction value for correcting the hue of the color information signal corresponding to the face area, and a hue correction unit 93 that corrects the hue of the color information signal corresponding to the face area based on the hue correction value.
  • Patent Document 1: Japanese Patent Laid-Open No. 2004-180114 (published June 24, 2004)
  • In the technique of Patent Document 1, an area whose color information signal corresponds to skin color, that occupies a predetermined range of the entire image, and whose aspect ratio falls within a predetermined ratio (for example, length : width ≈ 1 : 1) is determined to be a face area. For this reason, it is difficult to extract only the face area by this method of discriminating areas that satisfy the above conditions: an area that satisfies the conditions but does not actually include a face may still be determined to be a face area.
  • The present invention has been made to solve the above-described problems, and its main object is to provide an image correction apparatus that extracts data indicating a target area to be corrected from image data and performs correction suited to the extracted target area data.
  • In order to solve the above problems, an image correction apparatus according to the present invention includes: target area extraction means for extracting, from input image data, target area data including features specific to a correction target area, by referring to a target feature database that is stored in a storage unit and includes feature data indicating the features specific to the correction target area; and correction processing means for performing image correction on the extracted target area data by referring to a correction content database, stored in the storage unit, in which correction contents for the target area data are defined.
  • With this configuration, when target area data is extracted from the input image data, data corresponding to the feature data indicating the features specific to the correction target area included in the target feature database can be extracted as the target area data.
  • Because only input image data matching the feature data is extracted, when a face area is extracted as the target area data, for example, erroneous detection of a partial area of the body or a wall area having the same color as the face can be prevented.
  • An image correction method according to the present invention is an image correction method of an image correction apparatus that corrects an input image, and includes: a target area extraction step of extracting, from the input image data, target area data including features specific to a correction target area, by referring to a target feature database that is stored in a storage unit and includes feature data indicating the features specific to the correction target area; and a correction processing step of performing image correction on the target area data extracted in the target area extraction step by referring to a correction content database, stored in the storage unit, in which correction contents for the target area data are determined.
  • According to this method, as in the apparatus described above, input image data corresponding to the feature data indicating the features specific to the correction target area included in the target feature database can be extracted as the target area data, so the same effects as those of the image correction apparatus described above can be obtained.
  • As described above, the image correction apparatus according to the present invention extracts target area data to be corrected from the input image data by referring to the target feature database, and can perform correction suited to the extracted target area data by referring to the correction content database.
  • FIG. 4 is a flowchart illustrating an example of a flow of correction target area extraction processing in a target area extraction unit of the image correction apparatus illustrated in FIG. 2.
  • 3 is a flowchart illustrating an example of a flow of image correction processing in a correction processing unit of the image correction apparatus illustrated in FIG. 2.
  • FIG. 4 is a diagram showing color difference correction of the hue information of face area data in the CbCr coordinate system in the color difference correction processing unit of the image correction apparatus shown in FIG. 2; (a) is a diagram showing the range of the skin color hue information of the face area data in the CbCr coordinate system, and (b) is a diagram showing the range of the hue information before and after correcting the skin color of the face area data in the CbCr coordinate system.
  • The remaining figures are block diagrams showing details of the configuration of an image correction apparatus according to a modification of one embodiment of the present invention, and of an image correction apparatus according to another modification of the embodiment.
  • FIG. 10 is a block diagram illustrating main components of a face color correction device disclosed in Patent Document 1.
  • FIG. 1 is a block diagram illustrating a configuration of an image correction apparatus 1 according to the present embodiment.
  • FIG. 2 is a block diagram showing details of the configuration of the image correction apparatus 1 according to the present embodiment.
  • the image correction device 1 is mounted on, for example, a television receiver or an information processing device, and corrects the image quality of input image data included in a broadcast signal or an output signal of the image output device.
  • the image correction apparatus 1 includes a target region extraction unit 10 (target region extraction unit), a correction processing unit 20 (correction processing unit), and a storage unit 30.
  • the storage unit 30 stores a target feature database 31 and a correction content database 32.
  • the image correction apparatus 1 further includes an RGB conversion unit 40 (first color space signal conversion means).
  • The correction processing unit 20 of the image correction apparatus 1 includes a luminance correction processing unit 21 (luminance correction processing means), a color difference correction processing unit 22 (color difference correction processing means), and a noise reduction processing unit 23 (noise reduction processing means).
  • the target area extraction unit 10 is means for extracting correction target area data (target area data) indicating a target to be corrected from input image data.
  • Specifically, the target area extraction unit 10 refers to the target feature database 31 stored in the storage unit 30 and extracts, as correction target area data, data having values within the range values predetermined in the target feature database 31 (feature data indicating features specific to the correction target area).
  • the target feature database 31 will be described later.
  • The correction processing unit 20 is means for performing image quality correction processing (hereinafter also referred to as target area correction processing) on the correction target area data. Specifically, for the correction target area data extracted by the target area extraction unit 10, the correction processing unit 20 refers to the correction content database 32 stored in the storage unit 30, in which the correction content most suitable for the target indicated by the correction target area data is determined, and executes the target area correction processing based on the determined content.
  • Here, the target area correction processing means adjusting at least one of the luminance signal, which includes luminance information (brightness values) indicating the degree of brightness of the correction target area data, and the color difference signals, which quantitatively indicate perceptual color differences in the correction target area data. The color difference signals include hue information (hue values) indicating the hue, that is, the attribute that characterizes the color of the correction target area data, and saturation information (saturation values) indicating the degree of vividness of the correction target area data.
  • the correction content database 32 will be described later.
  • The luminance correction processing unit 21 is provided in the correction processing unit 20 and executes correction processing on the luminance information included in the luminance signal of the correction target area data extracted by the target area extraction unit 10. Specifically, by correcting the luminance information in the correction target area data, the luminance correction processing unit 21 performs thick outline emphasis correction for emphasizing thick outlines (for example, thick edges such as a face outline), fine outline emphasis correction for emphasizing thin and sharp outlines (for example, thin and sharp edges such as eyelashes), and texture emphasis correction for emphasizing texture (for example, fine edges such as lawn and brick).
  • the luminance correction processing unit 21 preferably includes a band pass filter and a high pass filter.
  • For example, the luminance correction processing unit 21 may perform thick outline emphasis correction using a band pass filter, fine outline emphasis correction using a band pass filter whose pass frequency band is higher than that of the band pass filter used for thick outline emphasis correction, and texture emphasis correction using a high pass filter.
  • the configuration of the luminance correction processing unit 21 is not limited to this.
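As a concrete illustration of the filtering just described, the sketch below applies a simple 3-tap high pass kernel to a 1-D luminance signal and adds the extracted detail back with a gain, which is the basic mechanism behind the outline and texture emphasis corrections. The kernel values, gain, and function names are illustrative assumptions; the publication does not specify them.

```python
# Illustrative sketch (not from the publication): emphasis correction adds a
# filtered "detail" component back onto the luminance signal. A band pass or
# high pass kernel selects which edge widths get emphasized.

def convolve1d(signal, kernel):
    """Same-length 1-D convolution with zero padding at the borders."""
    k = len(kernel) // 2
    padded = [0.0] * k + list(signal) + [0.0] * k
    return [sum(padded[i + j] * kernel[j] for j in range(len(kernel)))
            for i in range(len(signal))]

def enhance_luminance(y, gain=0.5):
    """Emphasize edges by adding a high pass component to the luminance."""
    highpass = [-0.25, 0.5, -0.25]  # simple 3-tap high pass kernel (assumed)
    detail = convolve1d(y, highpass)
    return [v + gain * d for v, d in zip(y, detail)]
```

A flat signal is left unchanged away from the borders, while a step edge gains local contrast; swapping in a narrower or wider band pass kernel shifts the emphasis between thick outlines, fine outlines, and texture.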
  • The color difference correction processing unit 22 is provided in the correction processing unit 20 and executes correction processing on the hue information and the saturation information included in the color difference signal of the correction target area data extracted by the target area extraction unit 10. Specifically, the color difference correction processing unit 22 performs hue correction processing on the hue information of the correction target area data and saturation correction processing on the saturation information of the correction target area data.
  • In the hue correction processing, the value of the hue information included in the correction target area data is corrected to a value within a predetermined appropriate range.
  • Examples of the hue correction processing include, but are not limited to, skin color hue correction processing for correcting the hue of a face to within an appropriate range, blue sky hue correction processing, and lawn hue correction processing.
  • The saturation correction processing is performed by multiplying the color difference signal by a positive coefficient. When the coefficient is larger than 1, the color is corrected to become more vivid; when the coefficient is smaller than 1, the color is corrected to become paler; and when the coefficient is 1, saturation correction is not performed.
  • Representing a color difference signal in polar form in the CbCr plane as (Cb, Cr) = (r cos θ, r sin θ), r is the saturation and θ is the hue, so multiplying by a positive coefficient scales r without changing θ.
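Since multiplying both components of (Cb, Cr) by one positive coefficient scales the saturation r while leaving the hue θ fixed, the property can be checked in a few lines; the function names below are ours, not the publication's.

```python
import math

# Saturation correction as described above: multiply the color difference
# signal (Cb, Cr) by a positive coefficient. In polar form r is saturation
# and theta is hue, so the hue angle is preserved by the scaling.

def to_polar(cb, cr):
    """Return (saturation r, hue theta) of a color difference pair."""
    return math.hypot(cb, cr), math.atan2(cr, cb)

def scale_saturation(cb, cr, coeff):
    """coeff > 1: more vivid; coeff < 1: paler; coeff == 1: unchanged."""
    return cb * coeff, cr * coeff
```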
  • the noise reduction processing unit 23 removes noise in the luminance signal and the color difference signal included in the correction target area data. Specifically, the noise reduction processing unit 23 removes noise (flickering, roughness, etc.) in the luminance signal and removes noise (excess color information, etc.) in the color difference signal, thereby removing the noise in the correction target area data. Remove.
  • the noise reduction processing unit 23 preferably includes a low-pass filter or a median filter.
  • For example, the median filter arranges the density values of the pixels within a mask of size n × n (n is a natural number, for example 3 × 3 or 5 × 5) in ascending order and sets the median value as the output density of the target pixel, thereby removing noise. The larger the mask size, the larger the noise removal effect of the median filter.
  • The low pass filter sets a coefficient for each pixel of an n × n mask so that the sum of the coefficients is 1 and the coefficient of the target pixel is the maximum, and removes noise by weighted averaging processing. When the coefficients are uniform, the noise removal effect is maximized; when the coefficient of the target pixel is 1 and the other coefficients are 0, the noise removal effect is lost.
  • As the filter coefficients, it is sufficient to use coefficients with a large noise removal effect for a target area that is flat and has few edges, such as a face or a blue sky, and coefficients with a small noise removal effect for a target area that contains many edges (high frequency components), such as lawn.
  • Thus, when the noise reduction processing unit 23 uses a low pass filter, correction by weighted average processing can be performed on the correction target area data, and when it uses a median filter, correction that removes fine noise can be performed.
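A minimal version of the n × n median filter described above might look as follows; the handling of borders (left untouched here) and the function name are our own sketch.

```python
# Minimal sketch of an n x n median filter (here defaulting to n = 3): the
# density values under the mask are sorted and the median becomes the output
# density of the target pixel, suppressing isolated impulse noise.

def median_filter(img, n=3):
    """Apply an n x n median filter; border pixels are left unchanged."""
    h, w = len(img), len(img[0])
    k = n // 2
    out = [row[:] for row in img]
    for y in range(k, h - k):
        for x in range(k, w - k):
            window = sorted(img[y + dy][x + dx]
                            for dy in range(-k, k + 1)
                            for dx in range(-k, k + 1))
            out[y][x] = window[len(window) // 2]
    return out
```

An isolated outlier under the mask is replaced by the median of its neighborhood, which is why the median filter removes fine noise that a weighted average would only smear.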
  • The storage unit 30 stores the target feature database 31, which is referred to when the target area extraction unit 10 extracts correction target area data, and the correction content database 32, which is referred to when the correction processing unit 20 executes the target area correction processing.
  • The storage unit 30 also serves as a computer-readable recording medium that records a program for operating a computer as the image correction apparatus 1, that is, a program that causes the computer to function as each unit of the image correction apparatus 1.
  • the RGB conversion unit 40 converts the color space signal of the color system indicating the input image data corrected by the correction processing unit 20 into a color space signal of another color system and outputs it as output image data.
  • Specifically, the RGB conversion unit 40 converts the color space signal (Y, Cb, Cr) of the input image data, which is input in the color system represented by the YCbCr color space, into the color space signal (R, G, B) of the RGB color system.
  • In the present embodiment, the RGB conversion unit 40 is described taking as an example a configuration that converts a color space signal expressed in the YCbCr color space into a color space signal of the RGB color system; however, the present invention is not limited to this, and a configuration that converts to a color space signal of the CIE L*a*b* color system, for example, may be adopted.
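The publication does not fix a particular conversion matrix for the RGB conversion unit 40; as one common possibility, the sketch below uses the BT.601 full-range YCbCr to RGB conversion for 8-bit signals.

```python
# One common YCbCr -> RGB conversion (BT.601, full range, 8-bit), shown as an
# illustrative possibility for the RGB conversion unit 40; the publication
# does not specify which conversion matrix is used.

def ycbcr_to_rgb(y, cb, cr):
    """Convert 8-bit YCbCr to 8-bit RGB (BT.601, full range)."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))  # keep results in [0, 255]
    return clamp(r), clamp(g), clamp(b)
```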
  • FIG. 3 is a diagram showing an example of the target feature database 31 in the present embodiment.
  • FIG. 4 is a diagram showing an example of the correction content database 32 in the present embodiment.
  • The target feature database 31 is a database in which a plurality of pieces of feature information of areas to be extracted as correction target area data from the input image data are set in advance, such as face feature information referred to when extracting a human face, blue sky feature information referred to when extracting a blue sky, and lawn feature information referred to when extracting lawn.
  • In the face feature information, for example, a range value of the hue information indicating the skin color of a face, a range value of the saturation information indicating the saturation of a face, a range value of the luminance information indicating the brightness of a face, and range values of the relative distance relationships of facial features such as the eyes, nose, and mouth are set.
  • In the blue sky feature information, for example, a range value of the hue information, a range value of the saturation information, a range value of the luminance information, a range value of the high frequency component, and a range value of the position on the screen are set.
  • The range value of the high frequency component in the blue sky feature information may be set to a threshold value indicating that the ratio of high frequency components included is below a certain level (for example, a small value indicating that high frequency components other than those caused by noise are not included), and the range value of the position on the screen may be set to a value specifying a position above a predetermined position on the screen.
  • In the lawn feature information, similarly, a range value of the hue information, a range value of the saturation information, a range value of the luminance information, a range value of the high frequency component, and a range value of the position on the screen are set.
  • The range value of the high frequency component in the lawn feature information may be set to a threshold value indicating that the ratio of high frequency components included is equal to or greater than a certain value, and the range value of the position on the screen may be set to a value specifying a position below a predetermined position on the screen.
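One way to picture the target feature database 31 is as a table of range values per target, as sketched below. Every numeric range and threshold here is a made-up placeholder: the publication defines only the structure (range values plus extra cues such as a high frequency threshold and a screen position constraint), not concrete values.

```python
# Hypothetical encoding of the target feature database 31. All numbers are
# illustrative placeholders, not values from the publication.

TARGET_FEATURES = {
    "face": {"hue": (0.02, 0.12), "sat": (0.15, 0.60), "lum": (0.25, 0.90)},
    "blue_sky": {"hue": (0.55, 0.65), "sat": (0.20, 0.80), "lum": (0.40, 1.00),
                 "hf_max": 0.05, "position": "upper"},
    "lawn": {"hue": (0.25, 0.40), "sat": (0.20, 0.90), "lum": (0.10, 0.80),
             "hf_min": 0.10, "position": "lower"},
}

def matches_ranges(features, pixel):
    """True if the pixel's hue/sat/lum all fall inside the range values."""
    return all(features[k][0] <= pixel[k] <= features[k][1]
               for k in ("hue", "sat", "lum"))
```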
  • The correction content included in the correction content database includes at least one of luminance correction content indicating correction content for the luminance information, color difference correction content indicating correction content for the color difference information, and noise reduction content indicating correction content for both the luminance information and the color difference information.
  • In the correction content database 32, correction content information such as face correction content referred to when correcting a person's face in the correction target area data, blue sky correction content referred to when correcting a blue sky, and lawn correction content referred to when correcting lawn is predetermined.
  • In the face correction content, for example, the following are set: hue correction processing that corrects hue information indicating parts other than the eyes, nose, and mouth to an appropriate range of standard skin color (for example, by compression); hue correction that selects, from several predetermined colors, the color used to correct the hue information indicating the lip part; edge enhancement for the eyes, nose, and mouth; and edge enhancement for the eyelashes.
  • In the blue sky correction content, it is set that the hue information indicating the blue sky is corrected to the appropriate range of standard blue, and that noise reduction is performed on the hue information indicating the blue sky.
  • In the lawn correction content, it is set that the hue information indicating the lawn is corrected to the appropriate range of standard green, and that the high frequency components are emphasized for the hue information indicating the lawn.
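The correction content database 32 can likewise be pictured as a mapping from target name to its correction contents. The ranges and flags below are illustrative placeholders, and clamping stands in for the "correct to an appropriate range (for example, by compression)" step; the publication does not give concrete values or a clamping rule.

```python
# Hypothetical shape of the correction content database 32. Ranges and flags
# are illustrative placeholders, not values from the publication.

CORRECTION_CONTENTS = {
    "face": {"hue_range": (0.04, 0.10), "edge_enhance": True, "noise_reduce": True},
    "blue_sky": {"hue_range": (0.56, 0.62), "edge_enhance": False, "noise_reduce": True},
    "lawn": {"hue_range": (0.28, 0.36), "edge_enhance": True, "noise_reduce": False},
}

def correct_hue(target, hue):
    """Clamp the hue into the target's appropriate range (the simplest
    stand-in for the 'correct to appropriate range' step)."""
    lo, hi = CORRECTION_CONTENTS[target]["hue_range"]
    return min(max(hue, lo), hi)
```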
  • Thus, correction of the luminance information, correction of the hue information and the saturation information, or correction of the luminance information, the hue information, and the saturation information can each be performed as necessary according to the characteristics of the correction target area, and natural correction can be performed on the input image data.
  • Correction processing that is unnecessary or unsuitable given the characteristics of the correction target area need not be performed; the corresponding processing unit may simply pass the signal through.
  • Since the target area extraction unit 10 and the correction processing unit 20 can perform processing based on the luminance information and the color difference information, the luminance signal and the color difference signal constituting the input image data can be processed without being converted into a color space signal of a color system different from that represented by the luminance signal and the color difference signal.
  • The luminance signal and the color difference signal processed by the correction processing unit 20 are then converted into a color space signal of a color system different from that represented by the luminance signal and the color difference signal, and can be output as output image data.
  • FIG. 5 is a flowchart illustrating an example of the flow of the correction target area extraction process in the target area extraction unit 10 included in the image correction apparatus 1 according to the present embodiment.
  • the target region extraction unit 10 acquires the face feature information, blue sky feature information, and lawn feature information shown in FIG. 3 as feature information from the target feature database 31 as an example.
  • the feature information acquired by the target region extraction unit 10 in the present invention is not limited to this.
  • the target region extraction unit 10 may adopt a configuration in which which feature information is acquired can be set in advance by a setting unit (not shown).
  • the target area extracting unit 10 acquires feature information defined in the target feature database 31 stored in the storage unit 30 (step S1).
  • the target region extraction unit 10 compares the luminance information, the hue information, and the saturation information included in the luminance signal and the color difference signal of the supplied input image data with the feature information (step S2).
  • The target area extraction unit 10 determines whether the area data indicated by the luminance information, hue information, and saturation information included in the luminance signal and the color difference signal of the input image data is data having values within the respective range values of the face feature information (step S3).
  • When the target area extraction unit 10 determines that the area data indicated by the luminance information, hue information, and saturation information in the input image data is data having values within the respective range values of the face feature information (YES in step S3), it determines whether the features of the eyes, nose, and mouth can be extracted from the area data (step S4).
  • When the features of the eyes, nose, and mouth can be extracted (YES in step S4), the target area extraction unit 10 determines whether the relative distance relationships between the eyes, nose, and mouth are values within the range values of the face feature information (step S5).
  • When they are (YES in step S5), the target area extraction unit 10 extracts, as face area data to be corrected, the area data that has values within the respective range values of the face feature information and whose relative distance relationships between the eyes, nose, and mouth are values within the range values of the face feature information (step S6).
  • When the target area extraction unit 10 determines that the features of the eyes, nose, and mouth cannot be extracted (NO in step S4), or that the relative distance relationships between the eyes, nose, and mouth are not values within the range values of the face feature information (NO in step S5), or after extracting face area data to be corrected, it determines whether the luminance information, hue information, and saturation information of the supplied input image data have been compared with all the feature information (step S7). If it is determined that they have not been compared with all the feature information (NO in step S7), the target area extraction unit 10 repeats the processing from step S2.
  • In an area where the target area extraction unit 10 determines that the area data indicated by the luminance information, hue information, and saturation information does not have values within the respective range values of the face feature information (NO in step S3), it determines whether the area data is data having values within the respective range values of the blue sky feature information (step S8).
  • When the target area extraction unit 10 determines that the area data indicated by the luminance information, hue information, and saturation information in the input image data is data having values within the respective range values of the blue sky feature information (YES in step S8), it determines whether high frequency components are included in that area data (step S9). For example, the ratio of high frequency components included in the area data may be compared with the threshold value set as the range value of the high frequency component in the blue sky feature information, and when the ratio is equal to or less than the threshold value, it may be determined that high frequency components are not included.
  • When it is determined that high frequency components are not included, the target area extraction unit 10 determines whether the area data is data having a value within the range value indicating that it is located in the upper part of the image represented by the input image data (step S10). For example, the determination in step S10 may be performed by determining whether a predetermined percentage or more of the area data determined not to include high frequency components lies above the center of the image.
  • The target area extraction unit 10 extracts, as blue sky area data to be corrected, the area data that has values within the respective range values of the blue sky feature information, does not include high frequency components, and has a value within the range value indicating that it is located in the upper part of the image represented by the input image data (step S11).
  • When the area data is not data having a value within the range value indicating that it is located in the upper part of the image represented by the input image data (NO in step S10), or after extracting blue sky area data to be corrected, the target area extraction unit 10 again determines whether the luminance information, hue information, and saturation information of the supplied input image data have been compared with all the feature information (step S7). If it is determined that they have not been compared with all the feature information (NO in step S7), the target area extraction unit 10 repeats the processing from step S2.
  • In an area where the luminance information, hue information, and saturation information are determined not to have values within the respective range values of the face feature information (NO in step S3) and not to have values within the respective range values of the blue sky feature information (NO in step S8), the target area extraction unit 10 determines whether the area data indicated by the luminance information, hue information, and saturation information is data having values within the respective range values of the lawn feature information (step S12).
  • When the target area extraction unit 10 determines that the area data indicated by the luminance information, hue information, and saturation information in the input image data is data having values within the respective range values of the lawn feature information (YES in step S12), it determines whether high frequency components are included in that area data (step S13). For example, the ratio of high frequency components included in the area data may be compared with the threshold value set as the range value of the high frequency component in the lawn feature information, and when the ratio is equal to or greater than the threshold value, it may be determined that high frequency components are included.
• When it determines that a high frequency component is included (YES in step S13), the target region extraction unit 10 determines whether or not the region data determined to include the high frequency component has a value within the range value indicating that it is located below the image represented by the input image data (step S14). For example, the target region extraction unit 10 may make the determination in step S14 by determining whether or not a predetermined percentage or more of the region data determined to include the high frequency component lies below the center of the image represented by the input image data.
• The target region extraction unit 10 then extracts, as lawn region data to be corrected, region data whose luminance information, hue information, and saturation information have values within the respective range values of the lawn feature information, which includes a high frequency component, and which has a value within the range value indicating that it is located below the image represented by the input image data (step S15).
• When the target area extraction unit 10 determines that the input image data contains no region data whose luminance information, hue information, and saturation information fall within the respective range values of the lawn feature information (NO in step S12), determines that no high frequency component is included (NO in step S13), or determines that the region data does not have a value within the range value indicating that it is located below the image (NO in step S14), or after it has extracted the lawn region data to be corrected, it determines again whether or not the luminance information, hue information, and saturation information of the supplied input image data have been compared with all the feature information (step S7).
• If it is determined in step S7 that the luminance information, hue information, and saturation information included in the luminance signal and the color difference signal of the supplied input image data have not been compared with all the feature information (NO in step S7), the target region extraction unit 10 repeats the processing from step S2.
• The target region extraction unit 10 then ends the correction target area extraction processing.
• In the above description, the luminance information, hue information, and saturation information are first compared with the face feature information, then with the blue sky feature information, and finally with the lawn feature information. However, the order of comparison is not limited to this and may be changed as appropriate.
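The branching of steps S2 through S15 described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the labels, the range values, and the layout of the feature entries (including the hypothetical `max_y`, `min_y`, and `hf_threshold` keys) are assumptions standing in for the contents of the target feature database 31.

```python
FEATURES = {
    # Hypothetical range values; the real ones live in the target feature
    # database 31.  y positions are normalised (0 = top of the image,
    # 1 = bottom); high_freq_ratio is the share of high frequency energy.
    'face':     {'luma': (60, 200),  'hue': (10, 40),   'sat': (20, 120)},
    'blue_sky': {'luma': (120, 255), 'hue': (180, 230), 'sat': (30, 150),
                 'max_y': 0.5},
    'lawn':     {'luma': (40, 160),  'hue': (90, 150),  'sat': (30, 160),
                 'hf_threshold': 0.3, 'min_y': 0.5},
}

def in_range(value, bounds):
    lo, hi = bounds
    return lo <= value <= hi

def classify_region(luma, hue, sat, y_pos, high_freq_ratio, features):
    """Mirror steps S2-S15: test one region against each feature entry in turn."""
    for label, f in features.items():
        if not (in_range(luma, f['luma']) and in_range(hue, f['hue'])
                and in_range(sat, f['sat'])):
            continue                               # NO in step S3 / S8 / S12
        if label == 'blue_sky' and y_pos > f['max_y']:
            continue                               # NO in step S10: not above
        if label == 'lawn':
            if high_freq_ratio < f['hf_threshold']:
                continue                           # NO in step S13
            if y_pos < f['min_y']:
                continue                           # NO in step S14: not below centre
        return label                               # extract as correction target
    return None                                    # no feature matched

print(classify_region(100, 20, 60, 0.3, 0.0, FEATURES))  # -> face
```

As in the flowchart, the comparison order (face, then blue sky, then lawn) follows the order of the entries and may be changed as appropriate.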
  • FIG. 6 is a flowchart illustrating an example of the flow of target area correction processing in the correction processing unit 20 of the image correction apparatus 1 according to the present embodiment.
• Here, a target area correction process performed on the correction target face area data extracted by the target area extraction unit 10 will be described as an example.
• First, the correction processing unit 20 acquires the correction content information defined in the correction content database 32 stored in the storage unit 30 (step S21).
  • the correction processing unit 20 supplies information regarding correction for luminance information and correction target area data among the acquired correction content information to the luminance correction processing unit 21. In addition, the correction processing unit 20 supplies information regarding correction for hue information and saturation information and correction target area data among the acquired correction content information to the color difference correction processing unit 22. Further, the correction processing unit 20 supplies information regarding noise removal (noise reduction) and correction target area data among the acquired correction content information to the noise reduction processing unit 23.
• Next, the luminance correction processing unit 21 performs luminance correction processing on the luminance information included in the correction target region data extracted by the target region extraction unit 10 (step S22). For example, when the correction target area data is face area data, the luminance correction processing unit 21 refers to the face correction content information and performs thick contour emphasis correction, which emphasizes the thick contours of the eyes, nose, and mouth, and fine contour emphasis correction, which emphasizes fine contours such as the eyelashes. Note that the luminance correction processing unit 21 need not perform correction when the correction content for the luminance information is not defined in the correction content information. The luminance correction processing unit 21 supplies the input image data to the noise reduction processing unit 23 regardless of whether or not the luminance correction processing has been performed.
  • the color difference correction processing unit 22 performs a hue correction for correcting the hue information included in the correction target region data extracted by the target region extraction unit 10 (step S23).
• For example, when the correction target area data is face area data, the color difference correction processing unit 22 refers to the face correction content information and performs the correction calculation using the above-described equation (1), thereby correcting the hue of the parts other than the eyes, nose, and mouth into the appropriate range of the standard skin color.
• The color difference correction processing unit 22 also refers to the face correction content information and performs hue correction on the lip portion by selecting and applying the most natural lip color from among the candidate colors, with reference to the lip color of the original image.
• Note that the color difference correction processing unit 22 need not perform correction when the correction content for the hue information is not defined in the correction content information.
  • FIG. 7 is a diagram showing color difference correction for the hue information of the face area data in the CbCr coordinate system.
• FIG. 7A is a diagram showing the range of the skin color hue information of the face area data in the CbCr coordinate system, and FIG. 7B is a diagram showing the range of the hue information before and after the skin color correction of the face area data in the CbCr coordinate system.
• As shown in FIG. 7A, the hue θ indicated by the hue information of the face area data takes a value in the range from θ1 − Δθ1 to θ1 + Δθ2 centered on θ1, and the saturation r indicated by the saturation information takes a value in the range from r1 to r2.
• When the intersection of θ1 + Δθ2 and r1 is a, the intersection of θ1 − Δθ1 and r1 is b, the intersection of θ1 − Δθ1 and r2 is c, and the intersection of θ1 + Δθ2 and r2 is d, the value of the skin color hue information of the face area data lies within the range abcd.
• As shown in FIG. 7B, the color difference correction processing unit 22 corrects the value of the hue θ into the range from θ1 − Δθ1′ to θ1 + Δθ2′ centered on θ1 (where Δθ1 ≥ Δθ1′ and Δθ2 ≥ Δθ2′). That is, as shown in FIG. 7B, when the intersection of θ1 + Δθ2′ and r1 is e, the intersection of θ1 − Δθ1′ and r1 is f, the intersection of θ1 − Δθ1′ and r2 is g, and the intersection of θ1 + Δθ2′ and r2 is h, the color difference correction processing unit 22 corrects the value of the skin color hue information into the range efgh.
• Thereby, the hue value can be corrected within the range of the coordinate region in the CbCr coordinate system in which the input image data remains a natural image.
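The narrowing of the hue range from the region abcd to the region efgh can be sketched in polar CbCr coordinates. The linear compression toward θ1 used below is one plausible mapping satisfying Δθ1′ ≤ Δθ1 and Δθ2′ ≤ Δθ2; the correction calculation of equation (1) is not reproduced in this excerpt, so this is an assumption, not the patented formula.

```python
import math

def correct_hue(cb, cr, theta1, d1, d2, d1p, d2p, r1, r2):
    """Narrow the hue angle of one CbCr sample toward the centre hue theta1.

    A sample whose hue lies in [theta1 - d1, theta1 + d2] and whose
    saturation lies in [r1, r2] (region abcd of FIG. 7(a)) is remapped so
    that its hue lies in [theta1 - d1p, theta1 + d2p] (region efgh of
    FIG. 7(b)), with d1p <= d1 and d2p <= d2.  Saturation is unchanged.
    """
    r = math.hypot(cb, cr)                  # saturation
    theta = math.atan2(cr, cb)              # hue angle
    if not (r1 <= r <= r2 and theta1 - d1 <= theta <= theta1 + d2):
        return cb, cr                       # outside the skin colour region
    if theta >= theta1:                     # compress the upper half range
        theta = theta1 + (theta - theta1) * (d2p / d2)
    else:                                   # compress the lower half range
        theta = theta1 - (theta1 - theta) * (d1p / d1)
    return r * math.cos(theta), r * math.sin(theta)
```

A sample at hue θ1 + Δθ2 is mapped exactly onto θ1 + Δθ2′, while samples outside the region abcd pass through unchanged.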
  • the color difference correction processing unit 22 performs saturation correction on the saturation information included in the correction target region data extracted by the target region extraction unit 10 (step S24). For example, when the correction target area data is face area data, the color difference correction processing unit 22 performs correction with reference to the face correction content information. Note that the color difference correction processing unit 22 may not perform correction when the correction content for the saturation information is not defined.
• Next, the noise reduction processing unit 23 performs noise reduction, that is, correction for removing noise in the luminance information and color difference information included in the correction target region data extracted by the target region extraction unit 10 (step S25). For example, when the correction target area data is face area data, the noise reduction processing unit 23 performs correction with reference to the face correction content information. Note that the noise reduction processing unit 23 need not perform correction when the correction content for noise removal is not defined in the correction content information.
  • the correction processing unit 20 determines whether or not the target region correction processing has been performed on all the correction target region data extracted by the target region extraction unit 10 (step S26).
• If it is determined in step S26 that the target area correction processing has not been performed on all the correction target area data extracted by the target area extraction unit 10 (NO in step S26), the correction processing unit 20 executes the processing from steps S22 to S26 on the correction target area data on which the target area correction processing has not yet been performed. That is, the correction processing unit 20 repeats steps S22 to S26 until it determines that the target area correction processing has been performed on all the correction target area data extracted by the target area extraction unit 10.
  • the correction processing unit 20 ends the target area correction process.
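The loop of steps S21 through S26 can be sketched as follows. The dictionary form of the correction content database 32 and the toy correction functions are illustrative assumptions; the only behavior taken from the source is that each correction step is skipped when its content is undefined, and that the loop repeats until every extracted region has been processed.

```python
def target_area_correction(regions, correction_db):
    """Apply steps S21-S26 to every extracted correction target region.

    regions: list of dicts with 'kind' (e.g. 'face', 'blue_sky', 'lawn')
    and 'data'.  correction_db maps a kind to the correction steps defined
    for it; a step left undefined in the database is simply skipped.
    """
    for region in regions:                        # repeat until S26 is YES
        content = correction_db.get(region['kind'], {})       # step S21
        for step in ('luminance', 'hue', 'saturation', 'noise_reduction'):
            op = content.get(step)                # steps S22 to S25
            if op is not None:                    # undefined -> no correction
                region['data'] = op(region['data'])
    return regions
```

For example, with `{'face': {'luminance': lambda d: d + 1, 'hue': lambda d: d * 2}}` a face region's data of 10 becomes 22, while a region of a kind with no defined contents passes through unchanged.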
• As described above, when the correction target area data is extracted from the input image data, data matching the feature data indicating the features specific to the correction target contained in the target feature database 31 can be extracted as the correction target area data. Since only input image data matching the feature data is extracted as correction target area data, it is possible, for example when a face area is extracted as the correction target area data, to prevent erroneous detection of a part of the body having the same color as the face area or of a wall area having the same color as the face.
  • the correction processing unit 20 can perform correction based on the correction content included in the correction content database 32 on the extracted correction target region data.
• The correction content, while suited to the characteristics of the correction target area, may be unsuited to areas other than the correction target area. Since such unsuitable correction is not applied to data other than the correction target area data, an unnatural output image resulting from uniformly applying the same correction to the entire input image data can be prevented.
  • the image correction apparatus 1 can extract the correction target area data to be corrected from the input image data, and can perform correction suitable for the extracted correction target area data.
• Furthermore, since a device such as a television receiver or an information processing device equipped with the image correction apparatus 1 according to the present embodiment can display the input image data corrected by the image correction apparatus 1, it can display input image data corrected so as to give a more natural image.
  • FIG. 8 is a block diagram showing details of the configuration of the image correction apparatus 1a according to this modification.
• Except for the correction processing unit 20a and the RGB conversion unit 40a described below, the configuration is the same as that of the image correction apparatus 1.
• The correction processing unit 20a is means for performing the target area correction processing on the correction target area data. Specifically, for the correction target area data extracted by the target area extraction unit 10, the correction processing unit 20a refers to the correction content defined in the correction content database 32 stored in the storage unit 30 as most suitable for the target indicated by the correction target area data, and executes the target area correction processing based on the defined content.
  • the correction processing unit 20a includes a luminance correction processing unit 21a, a color difference correction processing unit 22a, and a noise reduction processing unit 23a.
  • the luminance correction processing unit 21a is provided in the correction processing unit 20a, and executes correction processing on luminance information included in the luminance signal of the correction target region data extracted by the target region extraction unit 10 from the input image data.
  • the luminance correction processing unit 21a supplies the input image data that has been subjected to the correction process on the correction target region data to the RGB conversion unit 40a.
• The color difference correction processing unit 22a is provided in the correction processing unit 20a and executes correction processing on the hue information and saturation information included in the color difference signal of the correction target region data extracted from the input image data by the target region extraction unit 10. Further, the color difference correction processing unit 22a supplies the input image data that has undergone the correction processing on the correction target area data to the RGB conversion unit 40a.
• The RGB conversion unit 40a converts the color space signal of the color system representing the input image data that has undergone the correction processing on the correction target area data in the luminance correction processing unit 21a and the color difference correction processing unit 22a into a color space signal of another color system.
• For example, the RGB conversion unit 40a converts the color space signal (Y, Cb, Cr) of the input image data, input in the color system represented by the YCbCr color space, into the color space signal (R, G, B) of the RGB color system. Further, the RGB conversion unit 40a supplies the input image data of the converted color space signal to the noise reduction processing unit 23a.
  • the noise reduction processing unit 23a is provided in the correction processing unit 20a, and removes noise in the luminance signal and the color difference signal included in the correction target region data from the input image data supplied from the RGB conversion unit 40a.
  • the noise reduction processing unit 23a outputs the input image data from which noise has been removed as output image data.
• (Correction target area extraction processing, target area correction processing)
• Since the correction target area extraction processing in the present modification is the same as the correction target area extraction processing in the first embodiment shown in FIG. 5, a description thereof is omitted.
• The target area correction processing in the present modification is the same as the target area correction processing according to the first embodiment, except that it includes, between the saturation correction processing (step S24) and the noise reduction processing (step S25) shown in the target area correction processing of FIG. 6, a step in which the RGB conversion unit 40a converts the color space signal of the color system representing the input image data into a color space signal of another color system.
• With the above configuration, the noise reduction processing unit 23a can perform the noise reduction processing using a color space signal of a color system converted from the color system represented by the luminance signal and the color difference signal.
  • FIG. 9 is a block diagram showing details of the configuration of the image correction apparatus 1b according to the present modification.
  • the image correction apparatus 1 b has the same configuration as the image correction apparatus 1 according to the first embodiment except that the image correction apparatus 1 b includes a correction processing unit 20 b instead of the correction processing unit 20.
• The correction processing unit 20b is means for performing the target area correction processing on the correction target area data. Specifically, for the correction target area data extracted by the target area extraction unit 10, the correction processing unit 20b refers to the correction content defined in the correction content database 32 stored in the storage unit 30 as most suitable for the target indicated by the correction target area data, and executes the target area correction processing based on the defined content.
  • the correction processing unit 20b includes a luminance correction processing unit 21b, a color difference correction processing unit 22b, and a noise reduction processing unit 23b.
• The noise reduction processing unit 23b is provided in the correction processing unit 20b and removes noise in the luminance signal and the color difference signal of the correction target region data extracted from the input image data by the target region extraction unit 10.
  • the noise reduction processing unit 23b supplies the input image data from which noise has been removed to the luminance correction processing unit 21b and the color difference correction processing unit 22b.
• The luminance correction processing unit 21b is provided in the correction processing unit 20b and executes correction processing on the luminance information included in the luminance signal of the correction target region data extracted by the target region extraction unit 10, out of the input image data supplied from the noise reduction processing unit 23b. Further, the luminance correction processing unit 21b supplies the input image data that has undergone the correction processing on the correction target area data to the RGB conversion unit 40b.
• The color difference correction processing unit 22b is provided in the correction processing unit 20b and executes correction processing on the hue information and saturation information included in the color difference signal of the correction target region data extracted by the target region extraction unit 10, out of the input image data supplied from the noise reduction processing unit 23b. Further, the color difference correction processing unit 22b supplies the input image data that has undergone the correction processing on the correction target region data to the RGB conversion unit 40b.
• (Correction target area extraction processing, target area correction processing)
• Since the correction target area extraction processing in the present modification is the same as the correction target area extraction processing in the first embodiment shown in FIG. 5, a description thereof is omitted.
• The target area correction processing in the present modification is the same as the target area correction processing according to the first embodiment, except that the luminance correction processing (step S22), the hue correction processing (step S23), and the saturation correction processing (step S24) shown in FIG. 6 are performed after the noise reduction processing (step S25).
• With the above configuration, the correction processing unit 20b can perform the correction on the luminance signal and the correction on the color difference signal after the noise reduction processing has been performed in the noise reduction processing unit 23b.
  • FIG. 10 is a block diagram showing details of the configuration of the image correction apparatus 2 according to this modification.
  • the image correction apparatus 2 has the same configuration as the image correction apparatus 1 according to the first embodiment except that the image correction apparatus 2 further includes a YCbCr conversion unit 41 (second color space signal conversion unit).
  • the YCbCr converter 41 converts the color space signal indicating the input image data into a color space signal of another color system.
  • the YCbCr conversion unit 41 converts an RGB color system color space signal representing input image data into a color system color space signal represented by a YCbCr color space.
  • the YCbCr conversion unit 41 supplies the input image data obtained by converting the color space signal to the target area extraction unit 10 and the correction processing unit 20.
• (Correction target area extraction processing, target area correction processing)
  • the correction target area extraction process and the target area correction process in the present embodiment are the same as the correction target area extraction process in the first embodiment shown in FIG. 5 and the target area correction process in the first embodiment shown in FIG. Description is omitted.
• Even when the input image data is a color space signal of a color system different from the color system represented by the YCbCr color space, that is, different from the color system represented by the luminance signal and the color difference signal, the YCbCr conversion unit 41 can convert it into a luminance signal and a color difference signal representing the YCbCr color space.
• Thereby, the input image data can be corrected by correcting the luminance signal and the color difference signal, regardless of the color system of the color space signal in which the input image data is supplied.
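A minimal sketch of the conversions performed by the YCbCr conversion unit 41 (RGB to YCbCr) and the RGB conversion unit 40 (YCbCr to RGB). The patent does not fix a particular conversion matrix, so the widely used full-range ITU-R BT.601 (JFIF) coefficients are assumed here.

```python
def rgb_to_ycbcr(r, g, b):
    """RGB -> YCbCr using full-range ITU-R BT.601 (JFIF) coefficients,
    one common choice for the conversion done by the YCbCr conversion unit 41."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    """Inverse conversion, as performed by the RGB conversion unit 40."""
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return r, g, b
```

The two functions are near-exact inverses, so a round trip through the correction pipeline returns the original RGB values up to floating-point rounding.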
  • FIG. 11 is a block diagram showing details of the configuration of the image correction apparatus 2a according to this modification.
  • the image correction apparatus 2 a includes a correction processing unit 20 a instead of the correction processing unit 20, an RGB conversion unit 40 a instead of the RGB conversion unit 40, and a YCbCr conversion unit 41. Except for this, the configuration is the same as that of the image correction apparatus 1 according to the first embodiment.
  • the YCbCr conversion unit 41 converts the color space signal of the color system indicating the input image data into a color space signal of another color system. For example, the YCbCr conversion unit 41 converts an RGB color system color space signal representing input image data into a color system color space signal represented by a YCbCr color space. Further, the YCbCr conversion unit 41 supplies the input image data obtained by converting the color space signal to the target area extraction unit 10 and the correction processing unit 20a.
• The correction processing unit 20a is means for performing the target area correction processing on the correction target area data. Specifically, for the correction target area data extracted by the target area extraction unit 10, the correction processing unit 20a refers to the correction content defined in the correction content database 32 stored in the storage unit 30 as most suitable for the target indicated by the correction target area data, and executes the target area correction processing based on the defined content.
  • the correction processing unit 20a includes a luminance correction processing unit 21a, a color difference correction processing unit 22a, and a noise reduction processing unit 23a.
  • the luminance correction processing unit 21a is provided in the correction processing unit 20a, and executes correction processing on luminance information included in the luminance signal of the correction target region data extracted by the target region extraction unit 10 from the input image data.
  • the luminance correction processing unit 21a supplies the input image data that has been subjected to the correction process on the correction target region data to the RGB conversion unit 40a.
• The color difference correction processing unit 22a is provided in the correction processing unit 20a and executes correction processing on the hue information and saturation information included in the color difference signal of the correction target region data extracted from the input image data by the target region extraction unit 10. Further, the color difference correction processing unit 22a supplies the input image data that has undergone the correction processing on the correction target area data to the RGB conversion unit 40a.
• The RGB conversion unit 40a converts the color space signal of the color system representing the input image data that has undergone the correction processing on the correction target area data in the luminance correction processing unit 21a and the color difference correction processing unit 22a into a color space signal of another color system.
• For example, the RGB conversion unit 40a converts the color space signal (Y, Cb, Cr) of the input image data, input in the color system represented by the YCbCr color space, into the color space signal (R, G, B) of the RGB color system. Further, the RGB conversion unit 40a supplies the input image data of the converted color space signal to the noise reduction processing unit 23a.
  • the noise reduction processing unit 23a is provided in the correction processing unit 20a, and removes noise in the luminance signal and the color difference signal included in the correction target region data from the input image data supplied from the RGB conversion unit 40a.
  • the noise reduction processing unit 23a outputs the input image data from which noise has been removed as output image data.
• (Correction target area extraction processing, target area correction processing)
• Since the correction target area extraction processing in the present modification is the same as the correction target area extraction processing in the first embodiment shown in FIG. 5, a description thereof is omitted.
• The target area correction processing in the present modification is the same as the target area correction processing according to the first embodiment, except that it includes, between the saturation correction processing (step S24) and the noise reduction processing (step S25) shown in the target area correction processing of FIG. 6, a step in which the RGB conversion unit 40a converts the color space signal of the color system representing the input image data into a color space signal of another color system.
  • FIG. 12 is a block diagram showing details of the configuration of the image correction apparatus 2b according to this modification.
  • the image correction apparatus 2 b is the same as the image correction apparatus 1 according to the first embodiment except that the image correction apparatus 2 b includes a correction processing unit 20 b instead of the correction processing unit 20 and further includes a YCbCr conversion unit 41. It is a configuration.
  • the YCbCr conversion unit 41 converts the color space signal of the color system indicating the input image data into a color space signal of another color system. For example, the YCbCr conversion unit 41 converts an RGB color system color space signal representing input image data into a color system color space signal represented by a YCbCr color space. Further, the YCbCr conversion unit 41 supplies the input image data obtained by converting the color space signal to the target area extraction unit 10 and the correction processing unit 20b.
• The correction processing unit 20b is means for performing the target area correction processing on the correction target area data. Specifically, for the correction target area data extracted by the target area extraction unit 10, the correction processing unit 20b refers to the correction content defined in the correction content database 32 stored in the storage unit 30 as most suitable for the target indicated by the correction target area data, and executes the target area correction processing based on the defined content.
  • the correction processing unit 20b includes a luminance correction processing unit 21b, a color difference correction processing unit 22b, and a noise reduction processing unit 23b.
• The noise reduction processing unit 23b is provided in the correction processing unit 20b and removes noise in the luminance signal and the color difference signal of the correction target region data extracted from the input image data by the target region extraction unit 10.
  • the noise reduction processing unit 23b supplies the input image data from which noise has been removed to the luminance correction processing unit 21b and the color difference correction processing unit 22b.
• The luminance correction processing unit 21b is provided in the correction processing unit 20b and executes correction processing on the luminance information included in the luminance signal of the correction target region data extracted by the target region extraction unit 10, out of the input image data supplied from the noise reduction processing unit 23b. Further, the luminance correction processing unit 21b supplies the input image data that has undergone the correction processing on the correction target area data to the RGB conversion unit 40b.
• The color difference correction processing unit 22b is provided in the correction processing unit 20b and executes correction processing on the hue information and saturation information included in the color difference signal of the correction target region data extracted by the target region extraction unit 10, out of the input image data supplied from the noise reduction processing unit 23b. Further, the color difference correction processing unit 22b supplies the input image data that has undergone the correction processing on the correction target region data to the RGB conversion unit 40b.
• (Correction target area extraction processing, target area correction processing)
• Since the correction target area extraction processing in the present modification is the same as the correction target area extraction processing in the first embodiment shown in FIG. 5, a description thereof is omitted.
• The target area correction processing in the present modification is the same as the target area correction processing according to the first embodiment, except that the luminance correction processing (step S22), the hue correction processing (step S23), and the saturation correction processing (step S24) shown in FIG. 6 are performed after the noise reduction processing (step S25).
• Each block of the image correction apparatus 1 may be realized in hardware by a logic circuit formed on an integrated circuit (IC chip), or may be realized in software using a CPU (Central Processing Unit).
• In the latter case, the image correction apparatus 1 includes a CPU that executes the instructions of a program realizing each function, a ROM (Read Only Memory) that stores the program, a RAM (Random Access Memory) into which the program is expanded, and a storage device (recording medium) such as a memory that stores the program and various data.
• An object of the present invention can also be achieved by supplying to the image correction apparatus 1 a recording medium on which the program code (executable program, intermediate code program, or source program) of the control program of the image correction apparatus 1, which is software realizing the functions described above, is recorded in a computer-readable manner, and by having the computer (or a CPU or MPU) read and execute the program code recorded on the recording medium.
• Examples of the recording medium include tapes such as magnetic tapes and cassette tapes; disks including magnetic disks such as floppy (registered trademark) disks and hard disks, and optical disks such as CD-ROM/MO/MD/DVD/CD-R; cards such as IC cards (including memory cards); semiconductor memories such as mask ROM/EPROM/EEPROM/flash ROM; and logic circuits such as PLDs (Programmable Logic Devices) and FPGAs (Field Programmable Gate Arrays).
  • the program code may be supplied to the image correction apparatus 1 via a communication network.
  • the communication network is not particularly limited as long as it can transmit the program code.
  • For example, the Internet, an intranet, an extranet, a LAN, an ISDN, a VAN, a CATV communication network, a virtual private network, a telephone line network, a mobile communication network, a satellite communication network, or the like can be used.
  • the transmission medium constituting the communication network may be any medium that can transmit the program code, and is not limited to a specific configuration or type.
  • For example, wired media such as IEEE 1394, USB, power line carrier, cable TV lines, telephone lines, and ADSL (Asymmetric Digital Subscriber Line) lines, and wireless media such as IrDA or remote-control infrared, Bluetooth (registered trademark), IEEE 802.11 wireless, HDR (High Data Rate), NFC (Near Field Communication), DLNA (Digital Living Network Alliance), mobile phone networks, satellite lines, and terrestrial digital networks can be used.
  • As described above, an image correction apparatus according to the present invention includes: target area extraction means for extracting, from input image data, target area data containing features specific to a correction target area, by referring to a target feature database that is stored in a storage unit and contains feature data indicating those specific features; and correction processing means for performing image correction on the target area data extracted by the target area extraction means from the input image data, by referring to a correction content database that is stored in the storage unit and defines the correction contents for the target area data.
  • In other words, the image correction apparatus extracts target area data containing the feature data from the input image data by referring to the target feature database, which is stored in a storage unit and contains feature data indicating features specific to the area to be extracted as target area data, and performs image correction on the extracted target area data by referring to the correction content database in which the correction contents for the target area data are stored.
  • According to the above configuration, when the target area data is extracted from the input image data, data corresponding to the feature data indicating the features specific to the correction target area included in the target feature database can be extracted as the target area data.
  • For example, when a face area is extracted as the target area data, this prevents erroneous detection of a partial area of the body or a wall area having the same color as the face area.
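As a concrete illustration of such feature-based extraction, the sketch below tests whether a YCbCr pixel falls inside a skin-tone box and builds a mask of candidate target-area pixels. The threshold values and function names are illustrative assumptions, not values taken from this publication:

```python
def is_skin_candidate(y, cb, cr):
    # Illustrative skin-tone box in YCbCr; the thresholds below are
    # assumptions for this sketch, not values from the publication.
    return 80 <= y <= 255 and 77 <= cb <= 127 and 133 <= cr <= 173

def extract_target_mask(pixels):
    # pixels: rows of (Y, Cb, Cr) tuples -> rows of booleans marking
    # pixels that belong to the candidate correction target area.
    return [[is_skin_candidate(*p) for p in row] for row in pixels]
```

A practical extractor would combine such a chroma test with the shape-based feature data described above (e.g. face geometry) precisely to avoid the same-colored-wall false detections mentioned in the text.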
  • correction based on the correction content included in the correction content database can be performed on the extracted target area data.
  • Correction content suited to the characteristics of the correction target area may be unsuitable for areas other than the correction target area. Since data other than the target area data is not subjected to such unsuitable correction, the unnatural output image that would result from uniformly applying the same correction to the entire input image data can be prevented.
  • target area data to be corrected can be extracted from the input image data, and correction suitable for the extracted target area data can be performed.
  • The input image data includes still image data and moving image data, as well as combinations of the two, such as a still image displayed on a part of a moving image or a moving image displayed on a part of a still image.
  • Preferably, the target feature database in the image correction apparatus includes a plurality of types of feature data, the correction content database defines correction contents for each of a plurality of types of target area data, the target area extraction means extracts the plurality of types of target area data corresponding to the plurality of types of feature data, and the correction processing means performs image correction corresponding to each of the plurality of types of target area data.
  • According to the above configuration, a plurality of types of target area data can be extracted, and image correction corresponding to each extracted type of target area data can be performed. That is, a plurality of correction target areas can be corrected, and more natural correction can be performed on the input image data.
  • Preferably, the target area data includes a luminance signal and a color difference signal, and the correction content includes at least one of luminance correction content indicating the correction content for the luminance signal, color difference correction content indicating the correction content for the color difference signal, and noise reduction content indicating the correction content for both the luminance signal and the color difference signal.
  • Preferably, the correction processing means in the image correction apparatus includes: luminance correction processing means for correcting the luminance signal of the target area data based on the luminance correction content; color difference correction processing means for correcting the color difference signal of the target area data based on the color difference correction content; and noise reduction processing means for correcting, based on the noise reduction content, both the luminance signal supplied from the luminance correction processing means and the color difference signal supplied from the color difference correction processing means. Preferably, the apparatus further includes first color space signal conversion means for converting the luminance signal and the color difference signal of the input image data supplied from the correction processing means into a color space signal of a color system different from the color system represented by the luminance signal and the color difference signal.
  • According to the above configuration, since the correction processing means includes the luminance correction processing means, the color difference correction processing means, and the noise reduction processing means, the correction of the luminance value indicated by the luminance signal, the correction of the hue value and the saturation value indicated by the color difference signal, and the correction of the luminance value, the hue value, and the saturation value together can each be performed according to the characteristics of the correction target area, and the input image data can be corrected naturally. Furthermore, since the target area extraction means and the correction processing means can perform their processing based on the luminance signal and the color difference signal, the luminance signal and the color difference signal constituting the input image data can be processed without being converted into a color space signal of a different color system.
  • The color system represented by the luminance signal and the color difference signal is, for example, the color system of the YCbCr color space.
  • As color space signals of a different color system, for example, color space signals of the CIE L*a*b* color system (an L* signal, an a* signal, and a b* signal) or color space signals of the RGB color system (R (red), G (green), and B (blue)) can be used.
  • According to the above configuration, the luminance signal and the color difference signal processed by the correction processing means can be converted into a color space signal of a color system different from the one they represent (for example, an RGB signal of the RGB color system) and then output as output image data.
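For the first color space signal conversion means, the standard BT.601 full-range mapping from YCbCr back to RGB can serve as a sketch. The publication does not fix a particular conversion matrix; BT.601 is an assumption here:

```python
def ycbcr_to_rgb(y, cb, cr):
    # Full-range BT.601 YCbCr -> RGB; results are clamped to [0, 255].
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)
```

A neutral gray (Y, 128, 128) maps to (Y, Y, Y), which is a quick sanity check for any implementation of this stage.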
  • Alternatively, the correction processing means in the image correction apparatus preferably includes: luminance correction processing means for correcting the luminance signal of the target area data based on the luminance correction content; color difference correction processing means for correcting the color difference signal of the target area data based on the color difference correction content; and noise reduction processing means for correcting, based on the noise reduction content, the color space signal supplied from the first color space signal conversion means. In this case, the first color space signal conversion means converts the luminance signal and the color difference signal of the input image data supplied from the correction processing means into a color space signal of a color system different from the color system represented by the luminance signal and the color difference signal.
  • According to the above configuration, the noise reduction processing means can perform the noise reduction processing on the color space signal of the color system converted from the color system represented by the luminance signal and the color difference signal.
  • Alternatively, the correction processing means preferably includes: noise reduction processing means for correcting the luminance signal and the color difference signal based on the noise reduction content; luminance correction processing means for correcting the luminance signal supplied from the noise reduction processing means based on the luminance correction content; and color difference correction processing means for correcting the color difference signal supplied from the noise reduction processing means based on the color difference correction content. In this configuration, the luminance signal and the color difference signal are corrected after the noise reduction.
  • Preferably, the image correction apparatus further includes second color space signal conversion means for converting a color space signal of input image data, input as a color space signal of a color system different from the color system represented by the luminance signal and the color difference signal, into the luminance signal and the color difference signal.
  • According to the above configuration, even when the input image data is input as a color space signal of a color system different from the YCbCr color space represented by the luminance signal and the color difference signal, that color space signal can be converted into the luminance signal and the color difference signal representing the YCbCr color space, and the input image data can then be corrected by correcting the luminance signal and the color difference signal.
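The second color space signal conversion means can likewise be sketched with the forward BT.601 mapping, converting RGB input into the luminance signal and the color difference signals. Again, the exact matrix is an assumption, since the publication does not specify one:

```python
def rgb_to_ycbcr(r, g, b):
    # Full-range BT.601 RGB -> YCbCr; Cb and Cr are centered at 128.
    # Results may need clamping to [0, 255] before 8-bit storage.
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr
```

This is the inverse of the output-side conversion: a gray RGB pixel lands on the neutral chroma point (Cb, Cr) = (128, 128).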
  • Preferably, the luminance correction processing means in the image correction apparatus according to the present invention includes a band-pass filter and a high-pass filter.
  • According to the above configuration, the band-pass filter can correct the data indicating the contour of a specific target included in the target area data, and the high-pass filter can correct the data indicating the texture of that specific target. As a result, the necessary correction can be performed on each kind of data included in the target area data, preventing the unnatural output image that would result from applying the same correction to the entire target area data.
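A minimal sketch of the high-pass branch of such luminance correction: the high-frequency residual of each sample (the sample minus a 3-tap moving average) is scaled and added back, which enhances texture detail. The filter length and gain are assumptions made for illustration:

```python
def enhance_luma_row(row, gain=0.5):
    # Boost high-frequency detail in one row of luma samples.
    # lowpass is a 3-tap mean; (v - lowpass) is the high-pass residual.
    out = []
    for i, v in enumerate(row):
        left = row[max(i - 1, 0)]              # clamp indices at row edges
        right = row[min(i + 1, len(row) - 1)]
        lowpass = (left + v + right) / 3.0
        out.append(max(0.0, min(255.0, v + gain * (v - lowpass))))
    return out
```

A band-pass stage for contours would use the same structure with a kernel that passes mid frequencies only, so contour and texture data can receive different gains.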
  • the color difference correction processing means in the image correction apparatus according to the present invention performs a correction operation on the saturation and hue in the CbCr coordinate system.
  • According to the above configuration, the saturation and hue values can be corrected within an appropriate range of the coordinate area in the CbCr coordinate system so that the input image data becomes a natural image.
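Such a correction operation can be sketched in polar form: in the CbCr plane, the chroma vector around the neutral point (128, 128) has a radius corresponding to saturation and an angle corresponding to hue, so scaling the radius and rotating the angle implements the two corrections. The default gains below are illustrative assumptions:

```python
import math

def adjust_cbcr(cb, cr, sat_gain=1.2, hue_shift_deg=0.0):
    # Saturation = length of the chroma vector around (128, 128);
    # hue = its angle. Scale the one, rotate the other, then clamp.
    dcb, dcr = cb - 128.0, cr - 128.0
    radius = math.hypot(dcb, dcr) * sat_gain
    angle = math.atan2(dcr, dcb) + math.radians(hue_shift_deg)
    clamp = lambda v: max(0.0, min(255.0, v))
    return (clamp(128.0 + radius * math.cos(angle)),
            clamp(128.0 + radius * math.sin(angle)))
```

Clamping to [0, 255] is one simple way to keep the corrected values inside the valid coordinate area mentioned above.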
  • the noise reduction processing means in the image correction apparatus according to the present invention is preferably a low-pass filter or a median filter.
  • According to the above configuration, when the noise reduction processing means is a low-pass filter, the target area data can be corrected by weighted-average processing, and when it is a median filter, correction that removes fine noise can be performed.
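The median variant can be sketched in a few lines: each sample is replaced by the median of its neighborhood, which removes fine impulse ("salt-and-pepper") noise while preserving edges. The window size and edge handling are assumptions:

```python
def median_filter_row(row, radius=1):
    # Replace each sample with the median of a (2*radius+1)-wide window;
    # the window is truncated at the row boundaries.
    out = []
    n = len(row)
    for i in range(n):
        window = sorted(row[max(0, i - radius):min(n, i + radius + 1)])
        out.append(window[len(window) // 2])
    return out
```

A low-pass counterpart would replace the sorted-window median with a weighted average of the same window, matching the weighted-average processing mentioned above.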
  • the image correction display device preferably includes the image correction device and a display unit that displays the input image data corrected by the image correction device.
  • An image correction method according to the present invention is an image correction method of an image correction apparatus that corrects an input image, the method including: a target area extraction step of extracting, from input image data, target area data containing specific features by referring to a target feature database that is stored in a storage unit and contains feature data indicating features specific to a correction target area; and a correction processing step of performing image correction on the target area data extracted in the target area extraction step from the input image data, by referring to a correction content database that is stored in the storage unit and defines the correction contents for the target area data.
  • According to the above method, input image data corresponding to the feature data indicating the features specific to the correction target area included in the target feature database can be extracted as the target area data. As a result, the same effects as those of the image correction apparatus described above can be obtained.
  • A program for causing a computer to operate as the image correction apparatus according to the present invention, the program causing the computer to function as each means of the image correction apparatus, and a computer-readable recording medium on which the program is recorded are also included in the scope of the present invention.
  • the image correction apparatus can be suitably applied to a television receiver, a personal computer, a car navigation system, a mobile phone, a digital camera, a digital video camera, and the like.
  • 10 Target area extraction unit (target area extraction means)
  • 20 Correction processing unit (correction processing means)
  • Luminance correction processing unit (luminance correction processing means)
  • Color difference correction processing unit (color difference correction processing means)
  • Noise reduction processing unit (noise reduction processing means)
  • Storage unit
  • 31 Target feature database
  • 32 Correction content database
  • 40 RGB conversion unit (first color space signal conversion means)
  • YCbCr conversion unit (second color space signal conversion means)

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to an image correction device (1) comprising: a target area extraction module (10) that refers to a target feature database (31) containing feature data indicating features unique to an area to be corrected, in order to extract target area data containing those unique features from input image data; and a correction operation execution module (20) that refers to a correction detail database (32) in which correction details corresponding to the target area data have been determined, in order to perform an image correction operation on the target area data extracted by the target area extraction module (10).
PCT/JP2012/050582 2011-01-20 2012-01-13 Image correction device, image correction display device, image correction method, program, and recording medium WO2012099013A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-010178 2011-01-20
JP2011010178 2011-01-20

Publications (1)

Publication Number Publication Date
WO2012099013A1 true WO2012099013A1 (fr) 2012-07-26

Family

ID=46515641

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/050582 WO2012099013A1 (fr) 2011-01-20 2012-01-13 Image correction device, image correction display device, image correction method, program, and recording medium

Country Status (1)

Country Link
WO (1) WO2012099013A1 (fr)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0678320A (ja) * 1992-08-25 1994-03-18 Matsushita Electric Ind Co Ltd Color adjustment device
JP2001292390A (ja) * 2000-04-04 2001-10-19 Minolta Co Ltd Image processing device
JP2003348614A (ja) * 2002-03-18 2003-12-05 Victor Co Of Japan Ltd Video correction device and method, video correction program, and recording medium on which the program is recorded
JP2007164628A (ja) * 2005-12-15 2007-06-28 Canon Inc Image processing apparatus and method
JP2010193381A (ja) * 2009-02-20 2010-09-02 Panasonic Corp Signal processing device and signal processing method


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014080613A1 (fr) * 2012-11-22 2014-05-30 NEC Corporation Color correction program, method, and device
US9462160B2 (en) 2012-11-22 2016-10-04 Nec Corporation Color correction device, method, and program
JPWO2014080613A1 (ja) * 2012-11-22 2017-01-05 NEC Corporation Color correction device, color correction method, and color correction program
JP2016173716A (ja) * 2015-03-17 2016-09-29 JMEC Co., Ltd. Skin image analysis device, image processing device, and computer program
CN110766639A (zh) * 2019-10-30 2020-02-07 Beijing Megvii Technology Co., Ltd. Image enhancement method and device, mobile device, and computer-readable storage medium
CN110766639B (zh) * 2019-10-30 2022-09-27 Beijing Megvii Technology Co., Ltd. Image enhancement method and device, mobile device, and computer-readable storage medium

Similar Documents

Publication Publication Date Title
EP1811765B1 (fr) Method and system for color enhancement, color adjustment system, and color gamut conversion method
CN105915909B (zh) Layered compression method for high dynamic range images
US7933469B2 (en) Video processing
WO2013185449A1 (fr) Image enhancement method, image enhancement device, and display device
EP3429180B1 (fr) Color gamut mapping method and system
US9449375B2 (en) Image processing apparatus, image processing method, program, and recording medium
JP2013254390A (ja) Image processing apparatus and image processing method
CN113132696A (zh) Image tone mapping method and device, electronic device, and storage medium
WO2018100950A1 (fr) Image processing device, digital camera, image processing program, and recording medium
WO2012153661A1 (fr) Image correction device, image correction display device, image correction method, and recording medium
WO2012099013A1 (fr) Image correction device, image correction display device, image correction method, program, and recording medium
CN109636739B (zh) Detail processing method and device for image saturation enhancement
JP2016177500A (ja) Image processing device, image processing system, and program
CN110298812B (zh) Image fusion processing method and device
WO2023241339A1 (fr) Color cast correction method and apparatus, device, storage medium, and program product
JP2011076302A (ja) Contour extraction device, contour extraction method, and contour extraction program
Nakajima et al. A novel color image processing scheme in HSI color space with negative image processing
JP4375580B2 (ja) Image processing device, image processing method, and image processing program
TWI531246B Color adjustment method and its system
JP4111689B2 (ja) Image processing device, computer-readable recording medium on which an image processing program is recorded, and program
CN112581390B (zh) Image color enhancement method, device, equipment, and readable storage medium
CN113781330A (zh) Image processing method, device, and electronic system
US8873862B2 (en) Product image processor, product image processing method, information recording medium, and program
CN108876800B (zh) Information processing method and device
CN110809145A (zh) Image brightness transformation method, apparatus, and device based on the Craik-O'Brien effect

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12736764

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12736764

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP