JP4655210B2 - Density correction curve generation method and density correction curve generation module


Publication number: JP4655210B2
Authority: JP (Japan)
Application number: JP2005196341A
Other versions: JP2007018073A (in Japanese)
Inventors: 俊策 利弘, 伊公子 吉田
Original assignee: ノーリツ鋼機株式会社 (Noritsu Koki Co., Ltd.)
Filed as JP2005196341A by ノーリツ鋼機株式会社; published as JP2007018073A; granted as JP4655210B2. Legal status: Active.

Description

  The present invention relates to a technique for generating a density correction curve used for density correction of an input photographed image including a face area.

  At present, in the photographic printing industry, the mainstream is digital photographic processing technology. Photographed image data (hereinafter simply referred to as a captured image) is obtained either by digitizing, with a film scanner, a photographed image formed on photographic film, or directly from a digital photographing device such as a digital camera. The captured image is subjected to image processing such as density correction and color correction, converted to print data, and printed onto a photosensitive material (photographic paper) by driving a print exposure unit based on that print data.

  In the field of digital photographic processing, a process for correcting the density of the input captured image is performed in order to print a good-quality image on photographic paper, based on, for example, the R (red), G (green), and B (blue) captured images obtained for each frame by photometry of a photographic film with an image sensor. As the density correction method, correction by a density correction curve (gamma correction) is generally used. Since the density gradation that the photographic paper develops does not match the density gradation of the input photographed image, the input image data is corrected using a correction curve set so that the density gradation developed on the photographic paper suits human visual characteristics; the density gradation finally developed on the photographic print (printing paper) based on the corrected output image data thus corresponds to human visual characteristics. As the density correction curve, a basic density correction curve created from theoretical and empirical knowledge is well known, and such a basic density correction curve is used after being adjusted by a density correction coefficient. However, the subject and the shooting environment of a captured image vary widely, and depending on the shooting scene of the input data, appropriate output image data cannot be obtained with a density correction curve adjusted in this simple way. For example, when correcting image data captured from photographic film shot with over- or under-exposure, the brightness of the pixels constituting the image data is biased toward the low (shadow) or high (highlight) side, while the slope of the shadow and highlight portions of the basic density correction curve is gentle, so the output density responds only weakly to the input density and the correction is insufficient. In particular, when a photographed image includes a person image, that is, a face area, the face area is not properly corrected, and it is difficult to improve the quality of the person photograph.
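The gamma-style density correction curve discussed above can be sketched as a simple lookup table. The function below is an illustrative reconstruction only; the function name, the 8-bit density range, and the power-law shape are our assumptions, not values taken from the patent:

```python
def make_gamma_lut(gamma, levels=256):
    """Build a density correction curve as a lookup table.

    out = max * (in / max) ** (1 / gamma); gamma > 1 brightens
    (steep slope in the shadows), gamma < 1 darkens.
    """
    scale = levels - 1
    return [round(scale * (v / scale) ** (1.0 / gamma)) for v in range(levels)]

lut = make_gamma_lut(2.2)  # an example brightening curve
```

With gamma = 2.2, a mid-shadow input such as 64 is lifted well above 64, while 0 and 255 stay fixed, which mirrors the gentle-slope behavior described for the shadow and highlight ends.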

  In order to perform appropriate density correction on a person-photographed image including such a face area, a technique has been devised that treats pixels whose hue corresponds to human skin color as likely to belong to a person, and corrects the density correction curve based on the feature values of the RGB data of the pixels in that region so that the density of the person is appropriately corrected (see Patent Document 1). Although this technique can improve a person-photographed image with a certain probability, skin-color pixels do not uniquely identify a person, so inappropriate density correction may be performed depending on the subject.

  A technique is also known in which an operator designates a face area in the captured image data and density correction is performed based on that face area (see, for example, Patent Documents 2 and 3). However, when density correction is performed by correcting the basic density correction curve to match only the face area, the balance between the face and the rest of the background is often lost and an unnatural photo print is output.

  Further, a technique is known in which a first correction value is determined from the difference between a face-dependent density characteristic curve (face-dependent density correction curve), generated based on the face average density value of the face area detected from the input photographed image, and the basic density characteristic curve (basic density correction curve), and the basic density correction curve is corrected based on this first correction value (see Patent Document 4). In this prior art, a good person photo print can be obtained when the density correction curve finally used for density correction achieves a harmonious characteristic between the basic density correction curve and the face-dependent density correction curve; however, there remains a problem in achieving that harmony with high reliability.

Patent Document 1: JP 2002-369020 A (paragraphs 0004-0008, FIG. 5)
Patent Document 2: JP 6-59353 A (paragraphs 0024 and 0048, FIG. 3)
Patent Document 3: JP 2001-218047 A (paragraphs 0009-0010 and 0024-0028, FIG. 2)
Patent Document 4: JP 2005-159387 A (paragraph 0009, FIG. 4)

  In view of the above situation, an object of the present invention is to provide a technique for generating a density correction curve that achieves harmony between the basic density correction curve and the face-dependent density correction curve, thereby making it possible to perform optimum density correction on a captured image that includes a human face.

  In order to solve the above problem, the density correction curve generation method according to the present invention, for generating a density correction curve used for density correction of an input photographed image including a face region, comprises: a step of determining a basic density correction coefficient based on a representative value obtained from a histogram of the entire input photographed image; a step of calculating a face average density value, which is the average density value of the face area; a step of determining a face-dependent density correction coefficient corresponding to the face average density value using a preset model formula representing an appropriate relationship between face average density and density correction coefficient; a step of comparing the basic density correction coefficient and the face-dependent correction coefficient to evaluate their fitness, and correcting the face-dependent correction coefficient in accordance with that fitness; and a step of generating the density correction curve by fusing the corrected face-dependent correction coefficient and the basic density correction coefficient at a predetermined fusion rate.
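The claimed steps can be sketched end to end as follows. This is a minimal illustrative pipeline, not the patent's implementation: the mean as the representative value, the piecewise model shape, the 0.3 tolerance, and all other constants are placeholder assumptions.

```python
import math

def generate_density_curve(all_pixels, face_pixels, fusion_rate=0.5,
                           levels=256, target=128, tolerance=0.3):
    """Sketch of the claimed method: histogram -> basic coefficient,
    face mean -> model formula -> face coefficient, fitness check,
    then fusion of the two coefficients into one curve."""
    scale = levels - 1
    # Step 1: basic coefficient from a representative value (here: the mean)
    mean = sum(all_pixels) / len(all_pixels)
    g_base = math.log(mean / scale) / math.log(target / scale)
    # Step 2: face average density value
    f_ave = sum(face_pixels) / len(face_pixels)
    # Step 3: face-dependent coefficient from a placeholder model formula
    g_face = 2.0 if f_ave <= 64 else 1.0 if f_ave >= 192 else 2.0 - (f_ave - 64) / 128
    # Steps 4-5: evaluate fitness; pull the face coefficient toward the basic one
    if abs(g_face - g_base) > tolerance:
        g_face = g_base + tolerance * (1 if g_face > g_base else -1)
    # Step 6: fuse the two coefficients at the fusion rate
    g_final = fusion_rate * g_face + (1 - fusion_rate) * g_base
    return [round(scale * (v / scale) ** (1.0 / g_final)) for v in range(levels)]
```

For a uniformly dark image (mean 64) the basic coefficient comes out above 1, the face coefficient agrees, and the fused curve brightens mid-tones while leaving black and white fixed.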

  In this density correction curve generation method, a face-dependent density correction coefficient corresponding to the face average density value of the face area in the input photographed image is determined using a preset model formula representing an appropriate relationship between face average density and density correction coefficient. This face-dependent density correction coefficient is compared with the basic density correction coefficient determined from the representative value obtained from the histogram of the entire input photographed image, and its fitness, that is, the degree of difference between the two, is evaluated. The face-dependent density correction coefficient is then modified in accordance with the fitness so as to maintain, as far as possible, the harmony between the entire captured image and the face area, and the corrected face-dependent correction coefficient and the original basic density correction coefficient are fused at a predetermined fusion rate to generate the final density correction curve. Since the input photographed image is density-corrected using this curve, density correction in which the face and the background are in harmony is realized.

  In a person-captured image including a face area, the appropriate way to harmonize the face and the background also differs depending on whether the scene is backlit, front-lit, top-lit, and so on. Therefore, in one preferred embodiment according to the present invention, the face-dependent correction coefficient is further corrected based on the brightness balance between the face and the background, determined by comparing the density of the face area with that of the input captured image outside the face area, and on shooting scene information obtained from the input captured image. Shooting scene information can be obtained from the density (brightness) distribution of the input captured image; if the input captured image carries information about the shooting scene as attribute information (Exif data, etc.), that can also be used.

  A more specific shooting scene in human photography is one in which spot light falls on the face area. Since such a scene can be detected from the density distribution of the face area, it is also proposed, as one of the preferred embodiments, to correct the face-dependent correction coefficient in accordance with a spot light presence degree, which expresses the certainty that spot light is present on the face.

  Naturally, even in a simple portrait, the importance of the face differs depending on its proportion of the overall image. In the case of a captured image in which the face area occupies a large proportion, density correction weighted toward the face area is required. For this reason, in one preferred embodiment of the present invention, the fusion rate is adjusted according to the ratio of the face area to the input photographed image.
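One simple way to realize such an adjustment is to make the fusion rate a saturating function of the face-area ratio. The function below is only a sketch; the base rate, the 0.9 ceiling, and the 25 % saturation point are illustrative values we chose, not figures from the patent:

```python
def fusion_rate_from_face_ratio(face_area, image_area, base=0.4, max_rate=0.9):
    """Weight the face-dependent curve more heavily as the face fills
    more of the frame, saturating once the face covers ~25 % of it."""
    ratio = face_area / image_area
    return min(max_rate, base + (max_rate - base) * min(ratio / 0.25, 1.0))
```

A tiny face keeps the rate near the base value, so the basic density correction curve dominates; a frame-filling face pushes the rate to the ceiling, so the face-dependent curve dominates.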

  In the present invention, a program for causing a computer to execute the above-described density correction curve generation method, and a medium on which the program is recorded, are also subjects of the claimed rights.

  Furthermore, a density correction curve generation module that implements the above-described density correction curve generation method is also a subject of the claimed rights. Such a density correction curve generation module comprises: a basic density correction coefficient determination unit that determines a basic density correction coefficient based on a representative value obtained from a histogram of the entire input photographed image; a face average density calculation unit that calculates a face average density value, which is the average density value of the face area included in the input captured image; a face-dependent density correction coefficient determination unit that determines a face-dependent density correction coefficient corresponding to the face average density value using a preset model formula representing an appropriate relationship between face average density and density correction coefficient; a face-dependent correction coefficient correcting unit that compares the basic density correction coefficient with the face-dependent correction coefficient, evaluates their fitness, and modifies the face-dependent correction coefficient according to that fitness; and a density correction curve generator that generates the density correction curve by fusing the face-dependent correction coefficient and the basic density correction coefficient at a predetermined fusion rate. Naturally, such a density correction curve generation module achieves all the effects described for the density correction curve generation method above, and can also incorporate the preferred embodiments described above.

Note that the expression of fusing the face-dependent correction coefficient and the original basic density correction coefficient at a predetermined fusion rate to generate the final density correction curve covers not only directly fusing the two correction coefficients and generating the final density correction curve from the result, but also merging, at the predetermined fusion rate, the face-dependent correction curve defined by the modified face-dependent correction coefficient and the basic density correction curve defined by the original basic density correction coefficient to generate the final density correction curve.
Also, the term density correction curve as used here is not limited to a mathematical correction curve; the entity may be a discrete correction curve, or it may be defined by tabulated data such as is often used in software. It means every data structure, in the sense of software and/or hardware, that specifies a density characteristic converting an input density into an output density.
Other features and advantages of the present invention will become apparent from the following description of embodiments with reference to the drawings.

An embodiment of the technique according to the present invention for generating an appropriate density correction curve, used when correcting the density of a captured image including a face area, will be described with reference to the drawings.
FIG. 1 is an external view showing a photographic printing apparatus employing the density correction curve generation technique according to the present invention. This photographic printing apparatus comprises a print station 1B, which functions as a photographic printer performing exposure and development processing on photographic paper P, and an operation station 1A, which processes captured images taken in from an image input medium such as a developed photographic film 2a or a digital camera memory card 2b and generates and transfers the print data used by the print station 1B.

  This photo printing apparatus is also called a digital minilab. As can be seen from FIG. 2, the print station 1B pulls out roll-shaped printing paper P stored in one of the two printing paper magazines 11 and cuts it to print size with the sheet cutter 12. The back print unit 13 prints processing information such as color correction information and a frame number on the back side of the photographic paper P, and the print exposure unit 14 exposes a photographed image onto the front surface of the cut photographic paper P. The exposed photographic paper P is then sent to a processing tank unit 15 having a plurality of development processing tanks. After development and drying, the photographic paper P, now a photographic print, is sent by the transverse feed conveyor 16 at the upper part of the apparatus to the sorter 17, where the prints are collected in a plurality of trays sorted by order unit (see FIG. 1).

  A photographic paper transport mechanism 18 is laid out to transport the photographic paper P at transport speeds matching the various processes described above. The photographic paper transport mechanism 18 consists of a plurality of pinch transport roller pairs, including a chucker-type photographic paper transport unit 18a disposed before and after the print exposure unit 14 in the transport direction.

  The print exposure unit 14 is provided with a line exposure head that irradiates laser beams of the three primary colors R (red), G (green), and B (blue) along the main scanning direction onto the printing paper P conveyed in the sub-scanning direction, based on print data from the operation station 1A. The processing tank unit 15 includes a color developing tank 15a storing color developing solution, a bleach-fixing tank 15b storing bleach-fixing solution, and stabilizing tanks 15c storing stabilizing solution.

  A film scanner 20 for obtaining photographed image data (hereinafter simply referred to as image data) from the photographed image frames of the photographic film 2a is disposed at the upper position of the desk-like console of the operation station 1A. A media reader 21, which acquires captured images as image data from the various semiconductor memories and CD-Rs used as captured image recording media 2b, is incorporated in a general-purpose personal computer that functions as the controller 3 of the photographic printing apparatus. The general-purpose personal computer is also connected to a monitor 23 for displaying various kinds of information, and to a keyboard 24 and a mouse 25, used as operation input devices for various settings and adjustments.

  The controller 3 of this photographic printing apparatus, with a CPU as its core component, builds functional units that perform the various operations of the apparatus in hardware and/or software. As shown in FIG. 3, the functional units particularly related to the present invention include: an image input unit 31 that takes in the captured image read by the scanner 20 or the media reader 21 and performs the preprocessing required for subsequent processing; a GUI unit 33 that constructs a graphical user interface (hereinafter abbreviated as GUI) generating control commands from user operation inputs (via the keyboard 24, mouse 25, etc.) through graphic operation screens with various windows; an image processing unit 32 that performs density correction, color correction, photo retouching, and the like on the captured image transferred from the image input unit 31 to the memory 30 in order to generate the desired print data, based on control commands sent from the GUI unit 33 or on operation commands input directly from the keyboard 24 or the like; a video control unit 35 that generates a video signal for displaying on the monitor 23 the simulated image serving as the print source, the predicted finished print image, and the graphic data sent from the GUI unit 33 during pre-judgment printing operations such as color correction; a print data generation unit 36 that generates print data suited to the print exposure unit 14 installed in the print station 1B based on the processed image data; and a formatter 37 that formats the original captured image and the processed image for writing as image data to a CD-R or the like according to customer requests.

  When the captured image recording medium is the film 2a, the image input unit 31 sends the scan data of the prescan mode and of the main scan mode separately to the memory 30, and performs preprocessing suited to each purpose. When the captured image recording medium is the memory card 2b and the captured image data includes thumbnail image data (low-resolution data), the thumbnails are sent to the memory 30 separately from the main data (high-resolution data) so that they can be used to display a list of images on the monitor 23; if no thumbnail image data is included, reduced images are created from the main data and sent to the memory 30 as low-resolution image data.

  The image processing unit 32 comprises: a face detection module 40 that, using a face detection algorithm, outputs face detection information including the position and size of the face area detected from the low-resolution captured image developed in the memory 30; a density correction module 50 that performs density correction on the high-resolution captured image; a density correction curve generation module 60 that generates, based on the characteristics of the face area included in the captured image, the density correction curve that the density correction module 50 uses in its density correction processing; and an image processing module 70 that performs image processing such as color correction and filtering (blurring, sharpening, etc.) on the density-corrected captured image.

  The face detection module 40 may be a general-purpose one. Here, based on a face detection algorithm, it detects regions considered to be faces in the photographed image and outputs face detection information giving the face position and size (for example, the vertical and horizontal dimensions of a rectangular pixel area based at the face position). Many face detection algorithms for detecting a face from image data are known; see, for example, JP-A-11-339084, JP-A-2000-99722, and JP-A-2000-22929. This face detection may also be performed manually by an operator rather than automatically; in that case, the face detection module 40 has no face detection algorithm and is simply configured to output face detection information based on the operator's input.

  As shown in FIG. 4, the density correction module 50 comprises a density correction curve table 52, which stores the appropriate density correction curve set and adjusted by the density correction curve generation module 60 based on the image characteristics of the captured image, and a density correction unit 51, which executes density correction on the high-resolution captured image using that table. In general, the density correction unit 51 is realized as a program and the density correction curve table 52 as a data structure such as a lookup table, although the present invention is not limited to this.
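The relationship between the correction unit and the curve table can be sketched in a few lines: the table is plain data, and the unit simply maps every pixel through it. Names and the example tables here are our own illustrations, not the patent's code:

```python
def apply_density_correction(pixels, lut):
    """Density correction unit: map every pixel density through the
    density correction curve table (a 256-entry lookup table)."""
    return [lut[p] for p in pixels]

identity = list(range(256))                       # curve that changes nothing
brighten = [min(255, p + 20) for p in range(256)]  # simple brightening curve
```

Swapping in a different table changes the correction without touching the correction unit, which is the point of separating unit 51 from table 52.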

  Similarly, the density correction curve generation module 60 is also realized as a program. As shown in FIG. 4, the density correction curve generation module 60 includes a basic density correction coefficient determination unit 61, a face area setting unit 62, a face average density calculation unit 63, a model formula storage unit 64, a face-dependent density correction coefficient determination unit 65, a face-dependent density correction coefficient correction unit 66, a density correction curve generation unit 67, a fusion rate setting unit 68, and a shooting scene evaluation unit 69.

  The basic density correction coefficient determination unit 61 determines the basic density correction coefficient using, as parameters, statistical representative values of the input captured image, for example the minimum, maximum, and average values obtained from a histogram of the entire image. This basic density correction coefficient defines the basic density correction curve: starting from a reference density correction curve having a predetermined density conversion characteristic (shape), the curve is modified according to the basic density correction coefficient, yielding a basic density correction curve suited to the input captured image.
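As a sketch of unit 61, the code below extracts the representative values from a whole-image histogram and derives a coefficient from the mean. The mapping from mean to coefficient (aiming the mean at a mid-gray target) is one plausible heuristic of our own, not the patent's formula:

```python
import math

def histogram_representatives(pixels, levels=256):
    """Minimum, maximum, and mean from the whole-image histogram."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    occupied = [v for v, n in enumerate(hist) if n]
    mean = sum(v * n for v, n in enumerate(hist)) / len(pixels)
    return min(occupied), max(occupied), mean

def basic_gamma(pixels, target=128.0, levels=256):
    """One possible basic coefficient: the gamma that would move the
    image mean toward a mid-gray target under out = in ** (1/gamma)."""
    _, _, mean = histogram_representatives(pixels, levels)
    return math.log(mean / (levels - 1)) / math.log(target / (levels - 1))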

  The face area setting unit 62 obtains address information from the face area position in the input photographed image defined by the face detection information output by the face detection module 40, and transfers the face area image data (pixel values) to the processing units that handle face area pixels. The face average density calculation unit 63 calculates the average density value of the face area (or areas) based on the data transferred from the face area setting unit 62. To compute an average density value that also expresses luminance, the arithmetic mean of the R, G, and B density values of each pixel is used as the averaging element, and this is further arithmetically averaged over the entire face area. Of course, other average density values may be used; for example, the face average brightness (density) may be calculated using the luminance Y obtained by converting the R, G, B density values into luminance/color-difference values Y, C, C. That is, the average density value (R + G + B) / 3 is treated here in the same way as the luminance Y of the Y, C, C signals, and the term face average density value in this application also includes an average luminance (Y) value.
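The (R + G + B) / 3 averaging described for unit 63 is direct to implement; the function name and pixel representation below are our own:

```python
def face_average_density(face_pixels):
    """Face average density value: the per-pixel mean of (R, G, B)
    densities, averaged again over all pixels of the face area."""
    per_pixel = [(r + g + b) / 3.0 for r, g, b in face_pixels]
    return sum(per_pixel) / len(per_pixel)
```

A luminance-based variant would only change the per-pixel expression (e.g. a weighted Y conversion) while the outer average stays the same.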

  The face-dependent density correction coefficient determination unit 65 determines the face-dependent density correction coefficient from the face average density value using the model formula stored in the model formula storage unit 64. This model formula is a relational expression derived from a large number of captured images acquired under various conditions as samples; by considering, for each sample, the distribution of face average density values and the distribution of density correction coefficients that yield an appropriate face area, it maps a face average density value to the corresponding face-dependent density correction coefficient. An example of the graph represented by this model formula is shown in FIG. 5. In FIG. 5 the model formula is written γf = M(Fave), where Fave is the face average density value and γf is the face-dependent density correction coefficient. The face-dependent density correction coefficient varies little at a high level in the low region of the face average density value, varies little at a low level in the high region, and decreases from the high level toward the low level in the middle region.
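The shape described for γf = M(Fave), a high plateau for dark faces, a low plateau for bright ones, and a decreasing middle region, can be mimicked with a piecewise-linear stand-in. The breakpoints and gamma levels below are illustrative assumptions; the patent's actual formula is fitted from sample images:

```python
def model_gamma(f_ave, low=64.0, high=192.0, g_high=2.0, g_low=1.0):
    """Stand-in for M(Fave): plateau at g_high below `low`, plateau at
    g_low above `high`, linear decrease in between."""
    if f_ave <= low:
        return g_high
    if f_ave >= high:
        return g_low
    t = (f_ave - low) / (high - low)
    return g_high + (g_low - g_high) * t
```

A smooth sigmoid would match FIG. 5 even more closely, but the piecewise form already reproduces the three regions the text describes.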

  If the face-dependent density correction coefficient determined by the face-dependent density correction coefficient determination unit 65 and the basic density correction coefficient determined by the basic density correction coefficient determination unit 61 are substantially equal, there is no problem, since the density correction curves created from either coefficient do not differ substantially. If there is a certain difference, however, the face-dependent density correction coefficient must be corrected. For example, if the face-dependent density correction coefficient indicates a characteristic that brightens the pixels of the face area even though the brightness of the face area is already high, it needs to be corrected in a direction that does not brighten the face area pixels. The face-dependent density correction coefficient correction unit 66 performs this correction based on the degree of matching (the difference) between the face-dependent density correction coefficient and the basic density correction coefficient.

  The density correction curve generation unit 67 generates the appropriate density correction curve finally used for density correction by fusing the face-dependent density correction coefficient output from the face-dependent density correction coefficient correction unit 66 and the basic density correction coefficient output from the basic density correction coefficient determination unit 61, using the fusion rate set by the fusion rate setting unit 68. Equivalently, this can be expressed as merging, at the fusion rate set by the fusion rate setting unit 68, the face-dependent density correction curve defined by the face-dependent density correction coefficient output from the correction unit 66 and the basic density correction curve defined by the basic density correction coefficient output from the determination unit 61, to generate the density correction curve finally used for density correction. For example, it is convenient for the fusion rate setting unit 68 to set the fusion rate so that the face-dependent density correction curve is weighted more heavily as the ratio of the face area in the input photographed image increases.
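The curve-level formulation of the fusion is a pointwise weighted blend. This sketch assumes both curves are lookup tables of the same length; the function name is ours:

```python
def fuse_curves(face_curve, basic_curve, fusion_rate):
    """Blend two density correction curves pointwise.

    fusion_rate = 1.0 reproduces the face-dependent curve,
    fusion_rate = 0.0 reproduces the basic curve.
    """
    return [round(fusion_rate * f + (1.0 - fusion_rate) * b)
            for f, b in zip(face_curve, basic_curve)]
```

Fusing the coefficients first and building one curve, as the coefficient-level formulation describes, gives a similar compromise; the text treats the two expressions as equivalent views of the same operation.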

  The shooting scene evaluation unit 69 determines the degree to which the scene is backlit by comparing the density of the face area with that of the input captured image outside the face area, and calculates a spot light presence degree, indicating the intensity with which spot light is present in the face area, from the density distribution of the face area. The face-dependent density correction coefficient correction unit 66 further improves the accuracy of the face-dependent density correction coefficient by correcting it in accordance with the shooting scene information evaluated by the shooting scene evaluation unit 69. The shooting scene evaluation unit 69 can also acquire shooting scene information from Exif data and the like.
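The two scores computed by unit 69 can be sketched as simple normalized statistics. Both the normalization span and the brightness threshold below are illustrative choices of ours, not the patent's tuned values:

```python
def backlight_degree(face_mean, background_mean, span=128.0):
    """0..1 score: how much darker the face is than the background
    (a dark face against a bright background suggests backlighting)."""
    return max(0.0, min(1.0, (background_mean - face_mean) / span))

def spotlight_degree(face_pixels, bright_thresh=230):
    """0..1 score: fraction of face pixels that are near-saturated,
    a crude proxy for spot light falling on the face."""
    bright = sum(1 for p in face_pixels if p >= bright_thresh)
    return bright / len(face_pixels)
```

A production implementation would look at the whole density distribution of the face area rather than a single threshold, but these scores show the shape of the information handed to correction unit 66.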

A typical process flow of density correction curve generation by the density correction curve generation module described above will be explained with reference to the flowchart of FIG. 6.
When a captured image is input (#01), face detection processing (#02) by the face detection module 40 and basic density correction coefficient determination processing (#03) by the basic density correction coefficient determination unit 61 are executed on the input captured image. When the input photographed image contains no face area (No branch at #04), a basic density correction curve is generated from the basic density correction coefficient as usual (#05), and density correction processing is executed using it as the density correction curve (#20).

  When the input captured image contains a face area (Yes branch at #04), the face average density calculation unit calculates the average density of the face area as the face average density value (#06). The face-dependent density correction coefficient determination unit 65 then uses the model formula (relational expression) shown in FIG. 5 to determine, as the face-dependent density correction coefficient, the density correction coefficient (gamma coefficient) matched to this face average density value (#07). Next, the basic density correction coefficient determined by the basic density correction coefficient determination unit 61 and the face-dependent density correction coefficient determined by unit 65 are compared, that is, it is checked whether there is a large difference between them, and the fitness expressing whether the face-dependent density correction coefficient matches the basic density correction coefficient is evaluated (#08). It is then determined whether the fitness lies within an allowable range and the face-dependent density correction coefficient is appropriate (#09). In the fitness evaluation, besides simple numerical comparison, the face-dependent density correction coefficient is also judged inappropriate when, for example, it indicates a correction that would further brighten the face even though the brightness of the original face area is already high. If the coefficient is judged inappropriate at step #09, it is corrected in a direction approaching the normal basic density correction coefficient (#10).
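Steps #08 to #10 can be sketched as a clamp with a veto rule. The tolerance and the bright-face threshold are illustrative assumptions; the patent does not give numeric values:

```python
def correct_face_coefficient(g_face, g_base, face_mean, tolerance=0.3,
                             bright_face=170.0):
    """Steps #08-#10: reject brightening of an already-bright face
    outright, otherwise keep the face coefficient if its difference
    from the basic one is within tolerance, else clamp it toward the
    basic coefficient."""
    if face_mean >= bright_face and g_face > g_base:
        return g_base                       # judged inappropriate outright
    diff = g_face - g_base
    if abs(diff) <= tolerance:
        return g_face                       # fitness within allowable range
    return g_base + (tolerance if diff > 0 else -tolerance)
```

The veto branch models the non-numerical part of the evaluation: a coefficient that would further brighten a bright face is rejected regardless of how close it is to the basic coefficient.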

  Subsequently, it is determined whether correction in consideration of the balance between the face and the background is necessary (#11). For example, if the scene is not a backlight scene but the background is white and bright while the face brightness is low, the face-dependent density correction coefficient must be corrected so that the face does not become excessively bright; conversely, when the scene is very likely a backlight scene, it is preferable to let the face-dependent density correction coefficient brighten the face. There are also scenes in which spot light falls on the face area; such scenes can be detected from the density distribution of the face area, so it is also determined whether the face-dependent correction coefficient needs to be corrected according to the spot light presence degree, which is the probability that spot light exists on the face. When spot light is present in the face area, it is important that the density correction not be excessively influenced by that spot light. In any case, the face-dependent density correction coefficient is further slightly corrected only when such correction considering the balance between the face and the background is necessary (#12). Information about the shooting scene, such as whether it is a backlight scene, can be obtained from the shooting scene evaluation unit 69.
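
A minimal sketch of the spot-light handling in steps #11 and #12; the near-saturation threshold and the blending rule are assumptions for illustration, since the patent does not give concrete formulas for the spot light presence degree.

```python
import numpy as np

def spotlight_presence(face_pixels, bright_thresh=240):
    """Spot light presence degree: fraction of near-saturated pixels in
    the face region, taken as evidence that spot light falls on the face."""
    face_pixels = np.asarray(face_pixels, dtype=float)
    return float((face_pixels >= bright_thresh).mean())

def balance_correct(face_gamma, basic_gamma, spot_score):
    """#12: the stronger the spot-light evidence, the more the
    face-dependent coefficient is pulled back toward the basic one, so
    the correction is not dominated by the highlight."""
    return (1.0 - spot_score) * face_gamma + spot_score * basic_gamma
```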

  When the final face-dependent density correction coefficient is determined, whether or not it was corrected, the face-dependent density correction coefficient and the basic density correction coefficient are fused at the fusion rate set by the fusion rate setting unit 68, and the final density correction curve is generated from the fused correction coefficient. Expressed from the perspective of density correction curve fusion: the face-dependent density correction curve defined by the finally determined face-dependent density correction coefficient and the basic density correction curve defined by the basic density correction coefficient are generated by the density correction curve generation unit 67 (#13), and the two curves are fused at the fusion rate set by the fusion rate setting unit 68 (#14), yielding the final density correction curve (#15).
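
The curve fusion of steps #13 through #15 can be sketched as a pointwise weighted blend; the gamma-power curve form and the linear blend are assumptions consistent with the description of gamma coefficients, not the patent's exact formulation.

```python
import numpy as np

def curve_from_gamma(gamma, levels=256):
    """Density correction curve defined by a gamma coefficient (#13)."""
    x = np.arange(levels) / (levels - 1)
    return x ** gamma * (levels - 1)

def fuse_curves(face_gamma, basic_gamma, fusion_rate):
    """#14-#15: fuse the face-dependent and basic curves pointwise;
    fusion_rate is the weight (0..1) of the face-dependent curve."""
    face_curve = curve_from_gamma(face_gamma)
    basic_curve = curve_from_gamma(basic_gamma)
    fused = fusion_rate * face_curve + (1.0 - fusion_rate) * basic_curve
    return np.clip(np.round(fused), 0, 255).astype(np.uint8)
```

Blending two monotone curves keeps the fused curve monotone, so the final lookup table remains a valid density correction curve.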

  Subsequently, the density correction process is executed using this final density correction curve (#20). It is convenient for the fusion rate setting unit 68 to set the fusion rate based on the ratio of the face area in the input photographed image, that is, so that the weight of the face-dependent density correction curve increases as the face area occupation ratio increases. Of course, the fusion rate setting unit 68 may instead set a fixed fusion rate obtained in advance.
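
The fusion-rate rule described above can be sketched as follows; the saturating-linear form and the `gain` and `cap` constants are illustrative assumptions, as the patent only states that the weight grows with the face area occupation ratio.

```python
def fusion_rate_from_face_ratio(face_area, image_area, cap=0.8, gain=4.0):
    """Fusion rate in [0, cap] that increases with the face area
    occupation ratio, giving the face-dependent curve more weight for
    larger faces."""
    ratio = face_area / float(image_area)
    return min(cap, gain * ratio)
```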

  In the above-described embodiment, the print station 1B adopts a silver halide photographic printing method in which a photographed image is exposed onto the photographic paper P by the print exposure unit 14 having an exposure engine and the exposed photographic paper P then undergoes a plurality of development processes. The print station 1B using the density correction curve determination technique according to the present invention is, of course, not limited to such a method; various other photographic printing methods can be employed, such as an ink jet printing method in which an image is formed by discharging ink onto film or paper, or a thermal transfer method using a thermal transfer sheet.

External view of a photographic printing apparatus employing the density correction curve generation technology according to the present invention
Schematic diagram schematically showing the configuration of the print station of the photographic printing apparatus
Functional block diagram explaining the functional elements built into the controller of the photographic printing apparatus
Functional block diagram showing the functional configuration of the density correction curve generation module
Explanatory drawing schematically showing the model formula
Flowchart showing the flow of processing in one embodiment of the density correction curve generation module

Explanation of symbols

30: Memory
40: Face detection module
50: Density correction module
51: Density correction unit
52: Density correction curve table
60: Density correction curve generation module
61: Basic density correction coefficient determination unit
62: Face area setting unit
63: Face average density calculation unit
64: Model formula storage unit
65: Face-dependent density correction coefficient determination unit
66: Face-dependent density correction coefficient correction unit
67: Density correction curve generation unit
68: Fusion rate setting unit
69: Shooting scene evaluation unit

Claims (6)

  1. In a density correction curve generation method for generating a density correction curve used for density correction of an input photographed image including a face area,
    Determining a basic density correction coefficient based on a representative value obtained from a histogram of the entire input photographed image;
    Calculating a face average density value which is an average density value of the face region;
    Determining a face-dependent density correction coefficient corresponding to the face average density value using a preset model formula representing an appropriate relationship between the face average density and the density correction coefficient;
    Comparing the basic density correction coefficient with the face-dependent correction coefficient and evaluating the fitness of the face-dependent correction coefficient;
    Modifying the face-dependent correction coefficient according to the fitness;
    Fusing the modified face-dependent correction coefficient and the basic density correction coefficient at a predetermined fusion rate to generate the density correction curve;
    A density correction curve generation method comprising:
  2. The density correction curve generation method according to claim 1, wherein the face-dependent correction coefficient is further corrected based on the brightness balance between the face and the background, determined by comparing the density of the face area with that of the input photographed image outside the face area, and on photographing scene information obtained from the input photographed image.
  3. The density correction curve generation method according to claim 1 or 2, wherein the face-dependent correction coefficient is further modified according to a spot light presence level, which is the probability that spot light exists on the face, determined from the density distribution of the face region.
  4. The density correction curve generation method according to claim 1, wherein the fusion rate is adjusted based on a ratio of the face area in the input photographed image.
  5. In a density correction curve generation module for generating a density correction curve used for density correction of an input captured image including a face area,
    A basic density correction coefficient determination unit that determines a basic density correction coefficient based on a representative value obtained from a histogram of the entire input photographed image;
    A face average density calculating unit that calculates a face average density value that is an average density value of the face region;
    A face-dependent density correction coefficient determination unit that determines a face-dependent density correction coefficient corresponding to the face average density value using a preset model formula that represents an appropriate relationship between the face average density and the density correction coefficient;
    A face-dependent density correction coefficient correction unit that compares the basic density correction coefficient with the face-dependent correction coefficient to evaluate the fitness and corrects the face-dependent correction coefficient according to the fitness;
    A density correction curve generation module comprising: a density correction curve generation unit that generates the density correction curve by fusing the corrected face-dependent correction coefficient and the basic density correction coefficient at a predetermined fusion rate.
  6. The density correction curve generation module according to claim 5, further comprising a fusion rate setting unit that sets the fusion rate based on a ratio of the face area in the input photographed image.
JP2005196341A 2005-07-05 2005-07-05 Density correction curve generation method and density correction curve generation module Active JP4655210B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005196341A JP4655210B2 (en) 2005-07-05 2005-07-05 Density correction curve generation method and density correction curve generation module

Publications (2)

Publication Number Publication Date
JP2007018073A JP2007018073A (en) 2007-01-25
JP4655210B2 true JP4655210B2 (en) 2011-03-23

Family

ID=37755213

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005196341A Active JP4655210B2 (en) 2005-07-05 2005-07-05 Density correction curve generation method and density correction curve generation module

Country Status (1)

Country Link
JP (1) JP4655210B2 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4934326B2 (en) * 2005-09-29 2012-05-16 富士フイルム株式会社 Image processing apparatus and processing method thereof
JP4823979B2 (en) * 2007-07-23 2011-11-24 富士フイルム株式会社 Image processing apparatus and method, and program
JP4906627B2 (en) 2007-07-31 2012-03-28 キヤノン株式会社 Image processing apparatus, image processing method, computer program, and storage medium
JP4525719B2 (en) 2007-08-31 2010-08-18 カシオ計算機株式会社 Gradation correction apparatus, gradation correction method, and program
JP4600448B2 (en) * 2007-08-31 2010-12-15 カシオ計算機株式会社 Gradation correction apparatus, gradation correction method, and program
JP5031877B2 (en) * 2010-01-06 2012-09-26 キヤノン株式会社 Image processing apparatus and image processing method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03139976A (en) * 1989-10-25 1991-06-14 Dainippon Screen Mfg Co Ltd Method for setting up tone curve
JPH04285933A (en) * 1991-03-14 1992-10-12 Fuji Photo Film Co Ltd Pictorial hard copying device
JPH06215128A (en) * 1993-01-14 1994-08-05 Sanyo Electric Co Ltd Picture processor
JP2000004393A (en) * 1998-06-16 2000-01-07 Minolta Co Ltd Back light scene determining method, computer-readable storage medium with back light scene determining method program stored therein and image processor having back light scene determining function
JP2002185793A (en) * 2000-12-14 2002-06-28 Noritsu Koki Co Ltd Image forming device, image data processing method and recording medium for recording image data processing program
JP2005159387A (en) * 2003-11-20 2005-06-16 Noritsu Koki Co Ltd Method of determining density characteristic curve and density correction management apparatus for executing this method




Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20080317

TRDD Decision of grant or rejection written
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20101117

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20101125

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20101208

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140107

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Ref document number: 4655210

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313111

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350


R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250


S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350
