JP2005192162A - Image processing method, image processing apparatus, and image recording apparatus - Google Patents

Image processing method, image processing apparatus, and image recording apparatus

Info

Publication number
JP2005192162A
Authority
JP
Japan
Prior art keywords
image data
captured image
hue
saturation
gradation conversion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2003434706A
Other languages
Japanese (ja)
Inventor
Tsukasa Ito
Jo Nakajima
Daisuke Sato
Hiroaki Takano
丈 中嶋
司 伊藤
大輔 佐藤
博明 高野
Original Assignee
Konica Minolta Photo Imaging Inc
コニカミノルタフォトイメージング株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Photo Imaging Inc
Priority to JP2003434706A
Publication of JP2005192162A
Legal status: Pending


Abstract

PROBLEM TO BE SOLVED: To provide a new image processing method for carrying out highly accurate gradation compression processing and gray balance adjustment, and to provide an image processing apparatus and an image recording apparatus employing the method.
SOLUTION: The image recording apparatus 1 disclosed herein acquires the hue value and the saturation value of input image data, generates a two-dimensional histogram representing the cumulative frequency distribution of pixels on a coordinate plane whose x-axis is the hue value (H) and whose y-axis is the saturation value (S), divides the two-dimensional histogram into regions each defined by a combination of hue and saturation, calculates the lightness deviation amount, over the entire image, of a prescribed divided region, namely the face region constituting the skin color region, determines a gradation conversion curve on the basis of the calculated lightness deviation amount, and applies the determined gradation conversion curve to the input image data to carry out gradation conversion processing.
COPYRIGHT: (C)2005,JPO&NCIPI

Description

  The present invention relates to an image processing method, an image processing apparatus, and an image recording apparatus that perform image processing on captured image data and output image data optimized for viewing on an output medium.

  Today, scanned images of color photographic film and digital image data captured with imaging devices are distributed via storage media such as CD-Rs, floppy disks, and memory cards, or via the Internet, and are displayed on display devices such as CRTs (cathode ray tubes), liquid crystal displays, plasma displays, and the small liquid crystal monitors of mobile phones, or are printed as hard-copy images using output devices such as digital printers, ink jet printers, and thermal printers. Display and printing methods have thus become highly diversified.

In response to these diverse display and printing methods, efforts have been made to increase the versatility of digital image data. As part of this effort, there have been attempts to standardize the color space represented by digital RGB signals into a color space that does not depend on the characteristics of the imaging device. At present, "sRGB" has been adopted as the standardized color space for most digital image data (see "Multimedia Systems and Equipment - Colour Measurement and Management - Part 2-1: Colour Management - Default RGB Colour Space - sRGB", IEC 61966-2-1). The sRGB color space is defined to correspond to the color reproduction gamut of a standard CRT display monitor.

  In general, scanners and digital cameras use an image sensor with a photoelectric conversion function that combines a CCD (charge-coupled device), a charge transfer mechanism, and a checkered color filter that provides color sensitivity (a CCD-type image sensor, hereinafter simply referred to as a "CCD"). The digital image data output by a scanner or digital camera is obtained by taking the original electrical signal converted via the CCD, correcting for the photoelectric conversion characteristics of the image sensor (for example, gradation correction, spectral-sensitivity crosstalk correction, dark-current noise suppression, sharpening, white balance adjustment, and saturation adjustment), converting the result into a standardized data format that can be read and displayed by image editing software, and applying file compression and similar processing.

  As such data formats, for example, "Baseline Tiff Rev.6.0 RGB Full Color Image", adopted as the uncompressed file format of the Exif (Exchangeable Image File Format) standard, and the compressed data file format compliant with the JPEG format are known. The Exif file format conforms to sRGB, and the correction of the photoelectric conversion characteristics of the image sensor is set so as to obtain the most suitable image quality on a display monitor conforming to sRGB.

  For example, as long as a digital camera has a function for writing, as metadata in the file header of the digital image data, tag information indicating that display is intended for the standard color space of a display monitor compliant with the sRGB signal (hereinafter also referred to as the "monitor profile"), together with additional information indicating model-dependent information such as the number of pixels, the pixel arrangement, and the number of bits per pixel, and as long as it adopts such a data format, the digital image data can be displayed appropriately on a display monitor. Image editing software that displays the digital image data (for example, Adobe Photoshop) can analyze the tag information and prompt the user to change the monitor profile to sRGB, or perform the change automatically. It is therefore possible to reduce differences between displays and to view digital image data captured by a digital camera on the display in a suitable state.

  Further, as additional information written in the file header of digital image data, in addition to the model-dependent information described above, tags (codes) are used that indicate, for example, information directly related to the camera type (model) such as the camera name and code number, shooting-condition settings such as exposure time, shutter speed, aperture value (F-number), ISO sensitivity, brightness value, subject distance range, light source, presence or absence of strobe emission, subject area, white balance, zoom magnification, subject composition, shooting scene type, amount of light reflected from the strobe light source, and shooting saturation, as well as information on the type of subject. Image editing software and output devices are provided with functions that read this additional information and use it to improve the quality of hard-copy images.

  For color photographic films as well, products provided with a magnetic recording layer (APS films) have been developed to supply additional information of the kind described above. However, contrary to the expectations of those skilled in the art, their spread in the market has been slow, and conventional products still account for the majority. Therefore, image processing that uses additional information cannot be expected for scanner-read images for the time being. In addition, since the characteristics of color photographic films differ from type to type, early digital minilabs prepared optimum conditions for the respective characteristics in advance, but in recent years these per-type conditions have been abolished in most models for the sake of efficiency. There is therefore an increasing demand for a highly advanced image processing technique that corrects the differences between product types and automatically achieves image quality improvements comparable to processing that uses additional information, using only the film density information.

  Among these, gray balance adjustment, which corrects for changes in the color temperature of the photographic light source, and gradation compression (gradation conversion) processing for backlit or close-up flash photography are items for which it is particularly desirable to obtain correction information at the time of shooting. With digital cameras these corrections can be made at the time of shooting, but with color photographic film this is in principle impossible, and, as noted above, additional information cannot be expected either. This is a major obstacle, and one is still forced to rely on a number of heuristic algorithms. The processing contents and problems of gradation compression processing and gray balance adjustment for backlit or close-up flash photography are described below.

  The main purpose of gradation compression processing for backlit or close-up flash photography is to reproduce a person's face with appropriate brightness. Accordingly, there has been a demand for methods that compensate for the accuracy of face-area extraction through scene discrimination between backlit and close-up flash shooting and thereby reproduce the brightness of the face area more appropriately.

  For example, Patent Document 1 describes a method for determining the position and type of the light source at the time of shooting in order to improve the accuracy of face-area extraction. As methods for extracting face candidate regions, Patent Document 1 cites the method using a two-dimensional histogram of hue and saturation described in Patent Document 2, and the pattern matching and pattern search methods described in Patent Documents 3, 4, and 5. As methods for removing background regions other than the face, it cites the methods of Patent Documents 3 and 4 that discriminate the background using the ratio of straight-line portions, line-object properties, the contact ratio with the outer edge of the screen, density contrast, and the pattern and periodicity of density changes. For determining whether a scene is backlit or shot with close-up flash, a method using a one-dimensional histogram of brightness is described. This method presupposes the empirical rule that in a backlit scene the face area is dark and the background is bright, whereas in close-up flash photography the face area is bright and the background is dark. That is, a brightness deviation amount is calculated for the extracted face candidate region; when the deviation is large, the scene is discriminated as backlit or close-up flash, and only when the empirical rule is satisfied is the allowable range of the face-candidate extraction conditions adjusted.
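
  As a rough illustration of the brightness-deviation idea described above (a hedged sketch, not taken from Patent Document 1), the following Python code compares the mean brightness of a candidate face region with that of the whole frame; the function names and the threshold value are assumptions introduced here.

    import numpy as np

    def lightness_deviation(value_plane, face_mask):
        """value_plane: per-pixel brightness; face_mask: boolean mask of the face candidate region."""
        face_mean = value_plane[face_mask].mean()
        frame_mean = value_plane.mean()
        return face_mean - frame_mean

    def discriminate_scene(deviation, threshold=30.0):
        # Empirical rule quoted above: face darker than the frame suggests backlight,
        # face brighter than the frame suggests close-up flash; a small deviation suggests neither.
        if deviation < -threshold:
            return "backlight"
        if deviation > threshold:
            return "flash close-up"
        return "normal"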

  Naturally, there is a desire to give the face area, which is the main subject, an appropriate brightness even when the scene is not so strongly backlit or flash-lit that face discrimination becomes doubtful, and many proposals have been made to this end. For example, Patent Document 6 describes a method of grouping adjacent pixels that are close in hue and saturation and calculating the print density from the simple average and the number of pixels of each group. This method adjusts the density of the entire print so as to suppress the influence of subjects other than the main subject; it does not perform gradation compression processing or weighting limited to the face area.

  Gradation compression processing consists of a step of calculating the average brightness of the area in which a specific subject such as a face is distributed, a step of defining a gradation conversion curve that converts the calculated average brightness to a desired value, and a step of applying the gradation conversion curve to the image data. When calculating the average brightness, it is desirable to adjust the weight given to the brightness of the face area (the contribution rate of the face area) according to the shooting scene, but if all that is determined is whether the scene is backlit or shot with close-up flash, the contribution-rate adjustment is quite limited.
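
  The three steps just described can be sketched as follows; the gamma-style curve and the target value of 110 (out of 255) are illustrative assumptions, since the text only requires some gradation conversion curve that maps the measured average brightness to a desired value.

    import numpy as np

    def make_tone_curve(face_mean, target=110.0, levels=256):
        # Gamma curve chosen so that the face-area average brightness maps to `target`.
        face_mean = float(np.clip(face_mean, 1.0, levels - 2))   # avoid log(0) and log(1)
        gamma = np.log(target / (levels - 1)) / np.log(face_mean / (levels - 1))
        x = np.arange(levels) / (levels - 1)
        return np.clip((x ** gamma) * (levels - 1), 0, levels - 1).astype(np.uint8)

    def apply_tone_curve(image_u8, curve):
        # Final step: apply the gradation conversion curve to the image data as a look-up table.
        return curve[image_u8]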

  Various proposals have also been made to improve the accuracy of gray balance adjustment. For example, Patent Document 6 describes a method of extracting low-saturation pixels and linearly approximating the equivalent neutral density on BGR density coordinates. In order to suppress the color density deviation known as color failure, a method is described in which, as the low-saturation pixels, pixels of the R-C hue distribution are extracted for the B-G correlation and pixels of the B-Y hue distribution are extracted for the R-G correlation. Further, a highlight point and a shadow point are calculated, and pixels below a certain saturation along the straight line connecting the two are extracted as low-saturation pixels. This relies on the rule of thumb that the highlight and shadow points are low in saturation, but there are many cases that do not fit the rule, such as scenes in which lawn or sky occupies a large area, the high-brightness skin in close-up flash scenes, tail lights floating in darkness, and aquarium tanks. As a result, the color density deviation called color failure occurs in these shooting scenes. As a countermeasure, methods have been proposed that use information held by a plurality of shooting frames instead of calculating gray balance adjustment conditions for each frame individually. For example, Patent Document 7 proposes a method in which low-saturation pixels are extracted from all shooting frames and, when the number of low-saturation pixels extracted from an individual frame is small, the gray balance adjustment conditions derived from all frames and those derived from the single frame are weighted together. In addition, Patent Document 8 proposes a method for discriminating between normal scenes and abnormal scenes. However, all of these methods mainly deal with adding to or selecting the amount of information.
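
  As a hedged sketch of the general gray-balance principle referred to above (a simplification, not a reproduction of the cited density-space line fitting), low-saturation pixels can be collected and per-channel gains derived that pull their means toward a common gray:

    import numpy as np

    def gray_balance_gains(rgb, saturation, sat_threshold=30.0):
        """rgb: (H, W, 3) float array; saturation: (H, W) array on the same 0-255 scale."""
        low_sat = saturation < sat_threshold
        means = rgb[low_sat].mean(axis=0)    # mean R, G, B of the near-neutral pixels
        gray = means.mean()
        return gray / means                  # per-channel gains; multiplying rgb by these balances the gray axis
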
JP 2000-148980 A
JP 6-67320 A
JP 8-122944 A
JP 8-184925 A
JP 9-138471 A
JP 9-191474 A
JP 11-317880 A
JP 2001-257896 A

  The problem with gradation compression processing for backlit or close-up flash photography is how to compensate for the extraction accuracy of the face area and thereby improve the accuracy of brightness correction for the face area. As described above, the method using a one-dimensional histogram of brightness can be considered to have a certain effect when accuracy compensation is the main concern. However, it must be said that it is still inadequate when what is required is to grasp the state of the shooting scene more accurately, rather than merely to test whether it fits a clear-cut definition such as backlighting or close-up flash photography. Furthermore, it goes without saying that, for the gradation compression processing itself in backlit or close-up flash scenes, correction according to the degree of backlighting or flash proximity is desired.

  The problem with these gray balance adjustment methods is that they merely propose ways of coping with cases in which extremely few low-saturation pixels can be extracted from a single frame, or with shooting scenes having special color schemes in which the highlights and shadows are not low in saturation. Therefore, if the coloration tendency of the shooting scene were discriminated and the optimum gray balance adjustment condition were calculated for each shooting scene based on the result of that discrimination, there would seem to be ample room for further improvement in accuracy. Many proposals have also been made regarding white balance adjustment methods for image data taken with digital cameras. However, all of them concern the threshold setting for extracting low-saturation pixels or the application rate of the gray balance adjustment itself, and none of them specifies a relationship with the coloration tendency of the shooting scene. Nor is any method known for analyzing the coloration tendency of the photographic scene using a histogram of two or more dimensions.

  In view of the problems described above, an object of the present invention is to provide a novel image processing method capable of performing highly accurate gradation compression processing, and an image processing apparatus and an image recording apparatus using the same. It is another object of the present invention to provide a novel image processing method capable of performing highly accurate gradation compression processing and gray balance adjustment, and an image processing apparatus and an image recording apparatus using the same.

In order to solve the above-mentioned problem, the invention described in claim 1
In an image processing method for inputting captured image data and outputting image data optimized for viewing on an output medium,
Obtaining a hue value, a saturation value, and a brightness value for each pixel of the captured image data;
Dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
Calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
Determining a gradation conversion processing condition based on the calculated brightness deviation amount;
Applying gradation conversion processing to the captured image data based on the determined gradation conversion processing conditions;
It is characterized by including.
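
A minimal sketch of the per-pixel data acquisition step of claim 1 is given below: converting 8-bit RGB data to hue (degrees), saturation, and brightness values of the HSV color system with NumPy. This is ordinary HSV conversion offered only as an illustration under that assumption, not the implementation prescribed by the patent.

    import numpy as np

    def rgb_to_hsv_planes(rgb_u8):
        """rgb_u8: (H, W, 3) uint8 -> hue in degrees (0-360), saturation and brightness on a 0-255 scale."""
        rgb = rgb_u8.astype(np.float64) / 255.0
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        v = rgb.max(axis=-1)
        c = v - rgb.min(axis=-1)                        # chroma
        c_safe = np.where(c == 0.0, 1.0, c)             # avoid division by zero for gray pixels
        s = np.where(v > 0.0, c / np.where(v == 0.0, 1.0, v), 0.0)

        hue = np.zeros_like(v)
        r_max = (v == r) & (c > 0)
        g_max = (v == g) & (c > 0) & ~r_max
        b_max = (v == b) & (c > 0) & ~r_max & ~g_max
        hue = np.where(r_max, (60.0 * (g - b) / c_safe) % 360.0, hue)
        hue = np.where(g_max, 60.0 * (b - r) / c_safe + 120.0, hue)
        hue = np.where(b_max, 60.0 * (r - g) / c_safe + 240.0, hue)
        return hue, s * 255.0, v * 255.0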

The invention described in claim 2
In an image processing method for inputting captured image data and outputting image data optimized for viewing on an output medium,
Obtaining a hue value, a saturation value, and a brightness value for each pixel of the captured image data;
Dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
Calculating an occupancy ratio indicating a ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data; and
Based on the calculated occupancy rate, calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation,
Determining a gradation conversion processing condition based on the calculated contribution rate;
Applying gradation conversion processing to the captured image data based on the determined gradation conversion processing conditions;
It is characterized by including.
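
The occupancy and contribution steps of claim 2 can be sketched as follows; the linear weighting with a floor of 0.2 is an assumption made for illustration, since the claim does not fix how the occupancy ratio is turned into a contribution rate.

    import numpy as np

    def occupancy_ratio(region_mask):
        # Fraction of the whole frame occupied by the (hue, saturation) region.
        return float(region_mask.mean())

    def contribution_rate(occupancy, floor=0.2):
        # Assumed mapping: larger occupancy -> larger influence on the tone curve.
        return float(np.clip(floor + (1.0 - floor) * occupancy, 0.0, 1.0))

    def reference_brightness(value_plane, region_mask, contribution):
        # Blend the region's mean brightness with the frame mean according to the contribution rate.
        region_mean = value_plane[region_mask].mean()
        return contribution * region_mean + (1.0 - contribution) * value_plane.mean()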

The invention according to claim 3

In an image processing method for inputting captured image data and outputting image data optimized for viewing on an output medium,
Obtaining a hue value, a saturation value, and a brightness value for each pixel of the captured image data;
Dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
Calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
Calculating an occupancy ratio indicating a ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data; and
Based on the calculated occupancy rate, calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation,
Determining a gradation conversion processing condition based on the calculated brightness deviation amount and contribution rate;
Applying gradation conversion processing to the captured image data based on the determined gradation conversion processing conditions;
It is characterized by including.

The invention according to claim 4
In an image processing method for inputting captured image data and outputting image data optimized for viewing on an output medium,
Obtaining a hue value, a saturation value, and a brightness value for each pixel of the captured image data;
Dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
Calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
Determining a gradation conversion processing condition based on the calculated brightness deviation amount;
Applying gradation conversion processing to the captured image data based on the determined gradation conversion processing conditions;
Dividing the captured image data into predetermined hue regions;
Calculating an occupancy ratio indicating a ratio of pixels for each of the divided hue regions to the entire screen of the captured image data;
Calculating a low saturation threshold value for each hue area according to the calculated occupancy ratio for each hue area;
Extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
Calculating a gray balance adjustment condition using the extracted low saturation pixel;
Applying gray balance adjustment to the captured image data based on the calculated gray balance adjustment condition;
It is characterized by including.
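
For the gray-balance half of claim 4, a hedged sketch follows: the occupancy of each hue region controls how strict the low-saturation threshold is in that region, so that gray-balance pixels are not drawn predominantly from one dominant hue (lawn, sky, and so on). The region boundaries follow the values given later in claim 10; the threshold formula is an assumption for illustration only.

    import numpy as np

    HUE_REGIONS = {"skin": (0, 69), "green": (70, 184), "sky": (185, 224), "red": (225, 360)}

    def low_saturation_mask(hue, saturation, base_threshold=40.0):
        """hue in degrees (0-360), saturation on a 0-255 scale; returns the gray-balance pixel mask."""
        mask = np.zeros(hue.shape, dtype=bool)
        for lo, hi in HUE_REGIONS.values():
            in_region = (hue >= lo) & (hue <= hi)
            occupancy = in_region.mean()
            threshold = base_threshold * (1.0 - 0.5 * occupancy)   # assumed rule: dominant hue -> stricter threshold
            mask |= in_region & (saturation < threshold)
        return mask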

The invention described in claim 5
In an image processing method for inputting captured image data and outputting image data optimized for viewing on an output medium,
Obtaining a hue value, a saturation value, and a brightness value for each pixel of the captured image data;
Dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
Calculating an occupancy ratio indicating a ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data; and
Based on the calculated occupancy rate, calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation,
Determining a gradation conversion processing condition based on the calculated contribution rate;
Applying gradation conversion processing to the captured image data based on the determined gradation conversion processing conditions;
Dividing the captured image data into predetermined hue regions;
Calculating an occupancy ratio indicating a ratio of pixels for each of the divided hue regions to the entire screen of the captured image data;
Calculating a low saturation threshold value for each hue area according to the calculated occupancy ratio for each hue area;
Extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
Calculating a gray balance adjustment condition using the extracted low saturation pixel;
Applying gray balance adjustment to the captured image data based on the calculated gray balance adjustment condition;
It is characterized by including.

The invention described in claim 6
In an image processing method for inputting captured image data and outputting image data optimized for viewing on an output medium,
Obtaining a hue value, a saturation value, and a brightness value for each pixel of the captured image data;
Dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
Calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
Calculating an occupancy ratio indicating a ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data; and
Based on the calculated occupancy rate, calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation,
Determining a gradation conversion processing condition based on the calculated brightness deviation amount and contribution rate;
Applying gradation conversion processing to the captured image data based on the determined gradation conversion processing conditions;
Dividing the captured image data into predetermined hue regions;
Calculating an occupancy ratio indicating a ratio of pixels for each of the divided hue regions to the entire screen of the captured image data;
Calculating a low saturation threshold value for each hue area according to the calculated occupancy ratio for each hue area;
Extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
Calculating a gray balance adjustment condition using the extracted low saturation pixel;
Applying gray balance adjustment to the captured image data based on the calculated gray balance adjustment condition;
It is characterized by including.

The invention according to claim 7 is the invention according to any one of claims 1 to 3,
Creating a two-dimensional histogram of the acquired hue and saturation values;
The step of dividing the captured image data into regions composed of combinations of a predetermined hue and saturation is characterized in that the captured image data is divided into regions composed of combinations of a predetermined hue and saturation based on the created two-dimensional histogram.
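
A minimal sketch of the two-dimensional hue/saturation histogram of claim 7, built with numpy.histogram2d; the bin counts are illustrative assumptions.

    import numpy as np

    def hue_saturation_histogram(hue, saturation, hue_bins=36, sat_bins=16):
        """Pixel counts over a hue (0-360 degrees) x saturation (0-255) coordinate plane."""
        hist, hue_edges, sat_edges = np.histogram2d(
            hue.ravel(), saturation.ravel(),
            bins=[hue_bins, sat_bins],
            range=[[0.0, 360.0], [0.0, 255.0]])
        return hist, hue_edges, sat_edges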

The invention according to claim 8 is the invention according to any one of claims 4 to 6,
Creating a two-dimensional histogram of the acquired hue and saturation values;
The step of dividing the captured image data into regions composed of combinations of a predetermined hue and saturation divides the captured image data into regions composed of combinations of a predetermined hue and saturation based on the created two-dimensional histogram, and
The step of dividing the captured image data into predetermined hue regions is characterized in that the captured image data is divided into predetermined hue regions based on the created two-dimensional histogram.

The invention according to claim 9 is the invention according to any one of claims 1 to 3 and 7,
The step of dividing the captured image data into regions composed of combinations of a predetermined hue and saturation is characterized in that the captured image data is divided into a skin color region having at least a hue value of 0 to 69 and a saturation value of 0 to 128 in the HSV color system.

The invention according to claim 10 is the invention according to any one of claims 4 to 6 and 8,
The step of dividing the captured image data into regions composed of combinations of a predetermined hue and saturation divides the captured image data into a skin color region having at least a hue value of 0 to 69 and a saturation value of 0 to 128 in the HSV color system, and
The step of dividing the captured image data into predetermined hue regions divides the captured image data, in terms of the hue value of the HSV color system, into a skin hue region of 0 to 69, a green hue region of 70 to 184, a sky hue region of 185 to 224, and a red hue region of 225 to 360.
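
The concrete boundaries of claim 10 (and of claims 9, 22, 23, 35, and 36) can be expressed directly as masks; hue is taken in degrees of the HSV color system and saturation on the 0-255 scale used in the description.

    import numpy as np

    def divide_regions(hue, saturation):
        # Skin-colour (hue, saturation) region of claims 9/10: hue 0-69 and saturation 0-128.
        skin_hs = (hue <= 69) & (saturation <= 128)
        # Hue-only regions of claim 10: skin 0-69, green 70-184, sky 185-224, red 225-360.
        hue_regions = {
            "skin":  hue <= 69,
            "green": (hue >= 70) & (hue <= 184),
            "sky":   (hue >= 185) & (hue <= 224),
            "red":   hue >= 225,
        }
        return skin_hs, hue_regions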

The invention according to claim 11 is the invention according to any one of claims 1 to 10,
The gradation conversion processing condition is determined by creating a gradation conversion curve, or by adjusting the gradation conversion curve through selection from a plurality of preset gradation conversion curves.

The invention according to claim 12 is the invention according to any one of claims 1 to 11,
The captured image data is scene reference image data.

The invention according to claim 13 is the invention according to any one of claims 1 to 12,
The image data optimized for viewing on the output medium is viewing image reference data.

The invention according to claim 14
In an image processing apparatus for inputting captured image data and outputting image data optimized for viewing on an output medium,
Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount;
Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
It is characterized by having.

The invention according to claim 15 is:
In an image processing apparatus for inputting captured image data and outputting image data optimized for viewing on an output medium,
Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated contribution rate;
Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
It is characterized by having.

The invention described in claim 16
In an image processing apparatus for inputting captured image data and outputting image data optimized for viewing on an output medium,
Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount and contribution rate;
Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
It is characterized by having.

The invention described in claim 17
In an image processing apparatus for inputting captured image data and outputting image data optimized for viewing on an output medium,
Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount;
Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
Hue area dividing means for dividing the captured image data into predetermined hue areas;
A hue area occupancy ratio calculating unit that calculates an occupancy ratio indicating a ratio of pixels for each of the divided hue areas to the entire screen of the captured image data;
Low saturation threshold value calculating means for calculating a low saturation threshold value for each hue region according to the calculated occupancy ratio for each hue region;
Low saturation pixel extraction means for extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
A gray balance adjustment condition calculating means for calculating a gray balance adjustment condition using the extracted low saturation pixel;
Based on the calculated gray balance adjustment conditions, gray balance adjustment means for performing gray balance adjustment on the captured image data;
It is characterized by having.

The invention described in claim 18
In an image processing apparatus for inputting captured image data and outputting image data optimized for viewing on an output medium,
Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated contribution rate;
Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
Hue area dividing means for dividing the captured image data into predetermined hue areas;
A hue area occupancy ratio calculating unit that calculates an occupancy ratio indicating a ratio of pixels for each of the divided hue areas to the entire screen of the captured image data;
Low saturation threshold value calculating means for calculating a low saturation threshold value for each hue region according to the calculated occupancy ratio for each hue region;
Low saturation pixel extraction means for extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
A gray balance adjustment condition calculating means for calculating a gray balance adjustment condition using the extracted low saturation pixel;
Based on the calculated gray balance adjustment conditions, gray balance adjustment means for performing gray balance adjustment on the captured image data;
It is characterized by having.

The invention according to claim 19 is
In an image processing apparatus for inputting captured image data and outputting image data optimized for viewing on an output medium,
Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount and contribution rate;
Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
Hue area dividing means for dividing the captured image data into predetermined hue areas;
A hue area occupancy ratio calculating unit that calculates an occupancy ratio indicating a ratio of pixels for each of the divided hue areas to the entire screen of the captured image data;
Low saturation threshold value calculating means for calculating a low saturation threshold value for each hue region according to the calculated occupancy ratio for each hue region;
Low saturation pixel extraction means for extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
A gray balance adjustment condition calculating means for calculating a gray balance adjustment condition using the extracted low saturation pixel;
Based on the calculated gray balance adjustment conditions, gray balance adjustment means for performing gray balance adjustment on the captured image data;
It is characterized by having.

The invention according to claim 20 is the invention according to any one of claims 14 to 16,
Two-dimensional histogram creation means for creating a two-dimensional histogram of the acquired hue value and saturation value;
The HS dividing means divides the captured image data into regions composed of combinations of predetermined hue and saturation based on the created two-dimensional histogram.

The invention according to claim 21 is the invention according to any one of claims 17 to 19,
Two-dimensional histogram creation means for creating a two-dimensional histogram of the acquired hue value and saturation value;
The HS dividing unit divides the captured image data into regions including combinations of a predetermined hue and saturation based on the created two-dimensional histogram,
The hue area dividing unit divides the captured image data into predetermined hue areas based on the created two-dimensional histogram.

The invention according to claim 22 is the invention according to any one of claims 14 to 16 and 20,
The HS dividing means divides the captured image data into skin color regions having at least a hue value of HSV color system of 0 to 69 and a saturation value of 0 to 128.

The invention according to claim 23 is the invention according to any one of claims 17 to 19 and 21,
The HS dividing means divides the captured image data into skin color regions consisting of at least a hue value of HSV color system of 0 to 69 and a saturation value of 0 to 128,
The hue area dividing unit divides the captured image data, in terms of the hue value of the HSV color system, into a skin hue region of 0 to 69, a green hue region of 70 to 184, a sky hue region of 185 to 224, and a red hue region of 225 to 360.

The invention according to claim 24 is the invention according to any one of claims 14 to 23,
The gradation conversion processing condition is determined by creating a gradation conversion curve, or by adjusting the gradation conversion curve through selection from a plurality of preset gradation conversion curves.

The invention according to claim 25 is the invention according to any one of claims 14 to 24,
The captured image data is scene reference image data.

The invention according to claim 26 is the invention according to any one of claims 14 to 25,
The image data optimized for viewing on the output medium is viewing image reference data.

The invention according to claim 27 provides
In an image recording apparatus that inputs captured image data, generates image data optimized for viewing on an output medium, and forms the generated image data on the output medium.
Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount;
Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
It is characterized by having.

The invention according to claim 28 provides
In an image recording apparatus that inputs captured image data, generates image data optimized for viewing on an output medium, and forms the generated image data on the output medium.
Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated contribution rate;
Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
It is characterized by having.

The invention according to claim 29 provides
In an image recording apparatus that inputs captured image data, generates image data optimized for viewing on an output medium, and forms the generated image data on the output medium.
Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount and contribution rate;
Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
It is characterized by having.

The invention according to claim 30 provides
In an image recording apparatus that inputs captured image data, generates image data optimized for viewing on an output medium, and forms the generated image data on the output medium.
Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount;
Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
Hue area dividing means for dividing the captured image data into predetermined hue areas;
A hue area occupancy ratio calculating unit that calculates an occupancy ratio indicating a ratio of pixels for each of the divided hue areas to the entire screen of the captured image data;
Low saturation threshold value calculating means for calculating a low saturation threshold value for each hue region according to the calculated occupancy ratio for each hue region;
Low saturation pixel extraction means for extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
A gray balance adjustment condition calculating means for calculating a gray balance adjustment condition using the extracted low saturation pixel;
Based on the calculated gray balance adjustment conditions, gray balance adjustment means for performing gray balance adjustment on the captured image data;
It is characterized by having.

The invention according to claim 31 provides
In an image recording apparatus that inputs captured image data, generates image data optimized for viewing on an output medium, and forms the generated image data on the output medium.
Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated contribution rate;
Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
Hue area dividing means for dividing the captured image data into predetermined hue areas;
A hue area occupancy ratio calculating unit that calculates an occupancy ratio indicating a ratio of pixels for each of the divided hue areas to the entire screen of the captured image data;
Low saturation threshold value calculating means for calculating a low saturation threshold value for each hue region according to the calculated occupancy ratio for each hue region;
Low saturation pixel extraction means for extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
A gray balance adjustment condition calculating means for calculating a gray balance adjustment condition using the extracted low saturation pixel;
Based on the calculated gray balance adjustment conditions, gray balance adjustment means for performing gray balance adjustment on the captured image data;
It is characterized by having.

The invention according to claim 32 provides
In an image recording apparatus that inputs captured image data, generates image data optimized for viewing on an output medium, and forms the generated image data on the output medium.
Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount and contribution rate;
Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
Hue area dividing means for dividing the captured image data into predetermined hue areas;
A hue area occupancy ratio calculating unit that calculates an occupancy ratio indicating a ratio of pixels of the divided hue areas to the entire screen of the captured image data;
Low saturation threshold value calculating means for calculating a low saturation threshold value for each hue region according to the calculated occupancy ratio for each hue region;
Low saturation pixel extraction means for extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
A gray balance adjustment condition calculating means for calculating a gray balance adjustment condition using the extracted low saturation pixel;
Based on the calculated gray balance adjustment conditions, gray balance adjustment means for performing gray balance adjustment on the captured image data;
It is characterized by having.

The invention according to claim 33 is the invention according to any one of claims 27 to 29,
Two-dimensional histogram creation means for creating a two-dimensional histogram of the acquired hue value and saturation value;
The HS dividing means divides the captured image data into regions composed of combinations of predetermined hue and saturation based on the created two-dimensional histogram.

The invention according to claim 34 is the invention according to any one of claims 30 to 32,
Two-dimensional histogram creation means for creating a two-dimensional histogram of the acquired hue value and saturation value;
The HS dividing unit divides the captured image data into regions including combinations of a predetermined hue and saturation based on the created two-dimensional histogram,
The hue area dividing unit divides the captured image data into predetermined hue areas based on the created two-dimensional histogram.

The invention according to claim 35 is the invention according to any one of claims 27 to 29 and 33,
The HS dividing means divides the captured image data into skin color regions having at least a hue value of HSV color system of 0 to 69 and a saturation value of 0 to 128.

The invention according to claim 36 is the invention according to any one of claims 30 to 32 and 34,
The HS dividing means divides the captured image data into skin color regions consisting of at least a hue value of HSV color system of 0 to 69 and a saturation value of 0 to 128,
The hue area dividing unit divides the captured image data, in terms of the hue value of the HSV color system, into a skin hue region of 0 to 69, a green hue region of 70 to 184, a sky hue region of 185 to 224, and a red hue region of 225 to 360.

The invention according to claim 37 is the invention according to any one of claims 27 to 36,
The gradation conversion processing condition is determined by creating a gradation conversion curve, or by adjusting the gradation conversion curve through selection from a plurality of preset gradation conversion curves.

The invention according to claim 38 is the invention according to any one of claims 27 to 37,
The captured image data is scene reference image data.

The invention according to claim 39 is the invention according to any one of claims 27 to 38,
The image data optimized for viewing on the output medium is viewing image reference data.

  Here, the "captured image data" described in this specification is digital image data in which subject information is held as electrical signal values. Any process may be used to obtain the digital image data; it may, for example, be digital image data generated by scanning, with a scanner, color image information recorded on a color photographic film, or digital image data generated by photographing with a digital camera.

  However, when digital image data is generated by reading a color negative film with a scanner, it is desirable to calibrate the unexposed area (minimum density area) of the color negative film so that the RGB values of the digital image data there are all zero, to invert the data, and to reproduce a state almost proportional to the luminance changes of the subject by converting from a scale directly proportional to the amount of transmitted light to a logarithmic (density) scale and applying gamma correction for the color negative film. Similarly, it is desirable that digital image data captured by a digital camera be in a state substantially proportional to the luminance changes of the subject. Furthermore, the digital image data is preferably "scene reference image data".
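
  A hedged sketch of the calibration just described for scanned color negatives: subtract the unexposed-area (minimum density) level on a logarithmic scale and undo an assumed film gamma so that the result is roughly proportional to subject luminance. The constants and the simple global gamma are illustrative assumptions, not values from the specification.

    import numpy as np

    def negative_to_scene_linear(scan, unexposed_level, film_gamma=0.6):
        """scan, unexposed_level: values proportional to the amount of transmitted light."""
        # Density relative to the unexposed (minimum density) area of the negative.
        density = np.log10(np.maximum(unexposed_level, 1e-6)) - np.log10(np.maximum(scan, 1e-6))
        # Invert the film's assumed gamma; the result is approximately proportional to subject luminance.
        return 10.0 ** (density / film_gamma)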

  "Scene reference image data" refers to standard colors such as RIMM RGB (Reference Input Medium Metric RGB) and ERIMM RGB (Extended Reference Input Medium Metric RGB), which are signal strengths of each color channel based on at least the spectral sensitivity of the image sensor itself. It means image data that has been mapped to a space and in which image processing for modifying the data contents is omitted in order to improve the effect at the time of image viewing such as gradation conversion, sharpness enhancement, and saturation enhancement. The scene reference image data is the photoelectric conversion characteristics of the imaging device (opto-electronic conversion function defined by ISO1452, such as Corona “Fine Imaging and Digital Photography” (published by the Japan Photographic Society Publishing Committee ◆ page 479). It is preferable that the correction is performed. The information amount (for example, the number of gradations) of the standardized scene reference image data conforms to the performance of the A / D converter, and the information amount (for example, the number of gradations) required for the “viewing image reference data” described later. It is preferable that it is equal to or higher. For example, when the number of gradations of the viewing image reference data is 8 bits per channel, the number of gradations of the scene reference image data is preferably 12 bits or more, more preferably 14 bits or more, and even more preferably 16 bits or more.

  "Optimized for viewing on an output medium" means processed so as to obtain an optimal image on a display device such as a CRT, liquid crystal display, or plasma display, or on an output medium such as silver halide photographic paper, inkjet paper, or thermal printer paper. For example, if display on a CRT display monitor compliant with the sRGB standard is assumed, processing is performed so that optimal color reproduction is obtained within the color gamut of the sRGB standard; if output to silver halide photographic paper is assumed, processing is performed so that optimal color reproduction is obtained within the color gamut of silver halide photographic paper. In addition to color gamut compression, this also includes gradation compression from 16 bits to 8 bits, reduction of the number of output pixels, and processing that accommodates the output characteristics (LUT) of the output device. Furthermore, it goes without saying that processing such as noise suppression, sharpening, gray balance adjustment, saturation adjustment, and dodging-type gradation compression is performed.
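
  As one small illustration of the operations listed above, gradation compression from a 16-bit working image to 8 bits can be carried out through an output look-up table; the simple linear LUT below stands in for a real device characterization and is an assumption of this sketch.

    import numpy as np

    def build_output_lut(levels_in=65536, levels_out=256):
        # Placeholder LUT: in practice this would encode the output device's characteristics.
        return np.round(np.linspace(0, levels_out - 1, levels_in)).astype(np.uint8)

    def compress_to_output(image_u16, lut=None):
        lut = build_output_lut() if lut is None else lut
        return lut[image_u16]       # 16-bit -> 8-bit gradation compression via the LUT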

"Image data optimized for viewing on output media" means digital image data used to generate images on output media such as CRT, liquid crystal display, plasma display, silver halide photographic paper, inkjet paper, thermal printer paper, etc. In other words, processing is performed so as to obtain an optimal image on a display device such as a CRT, a liquid crystal display, a plasma display, and an output medium such as a silver salt photographic paper, an inkjet paper, and a thermal printer paper. When the above-described “captured image data” is “scene reference image data”, “image data optimized for viewing on an output medium” is referred to as “viewing image reference data”.
The range of the skin color region was determined from the result of examining, for about 1000 film-scanned images, the range in which the detection rate of human skin color is highest.

  The hue ranges used in the description of the present invention were calculated from the result of examining, for about 1000 film-scanned images, the hue ranges in which the detection rates of human skin color, plant green, and sky color are highest. The range of the skin color region was determined by the same investigation. In practicing the present invention, it is desirable to use different numerical values for film-scanned images and digital camera images.

  According to the inventions described in claims 1, 14, and 27, the gradation conversion processing conditions are determined based on the lightness deviation amount, with respect to the entire captured image data, of a region composed of a predetermined combination of hue and saturation, and the gradation conversion processing is applied to the captured image data based on the determined conditions. It is therefore possible to improve the accuracy of lightness correction for a predetermined region in the shooting scene, for example the face region that is the main subject, and to perform highly accurate gradation conversion processing.

  According to the inventions described in claims 2, 15, and 28, the contribution rate of a region composed of a predetermined combination of hue and saturation to the gradation conversion processing is calculated based on the occupancy rate of that region in the entire captured image data, the gradation conversion processing conditions are determined based on the calculated contribution rate, and the gradation conversion processing is applied to the captured image data based on the determined conditions. Lightness correction according to the contribution rate of, for example, the face region that is the main subject in the shooting scene therefore becomes possible, and highly accurate gradation conversion processing can be performed.

  According to the inventions described in claims 3, 16, and 29, the gradation conversion processing conditions are determined based on the lightness deviation amount and the contribution rate, with respect to the entire captured image data, of a region composed of a predetermined combination of hue and saturation, and the gradation conversion processing is applied to the captured image data based on the determined conditions. Appropriate lightness correction according to the lightness deviation amount and the contribution rate of a predetermined region in the shooting scene, for example the face region, therefore becomes possible, and highly accurate gradation conversion processing can be performed.

  According to the inventions described in claims 4, 17, and 30, the gradation conversion processing conditions are determined based on the lightness deviation amount, with respect to the entire captured image data, of a region composed of a predetermined combination of hue and saturation, and the gradation conversion processing is applied to the captured image data based on the determined conditions. It is therefore possible to improve the accuracy of lightness correction for a predetermined region in the shooting scene, for example the face region that is the main subject, and to perform highly accurate gradation conversion processing. Furthermore, the color tendency of the shooting scene is determined, and the low saturation threshold used for extracting low saturation pixels is changed based on the determination result. This prevents many of the low saturation pixels used for calculating the gray balance adjustment conditions from being extracted from one specific hue, so that color failure can be suppressed even in scenes where the empirical rule that the highlight side and the shadow side of an image have low saturation does not hold, for example shooting scenes in which lawn or sky occupies a large proportion of the image.

  According to the inventions described in claims 5, 18, and 31, the contribution rate of a region composed of a predetermined combination of hue and saturation to the gradation conversion processing is calculated based on the occupancy rate of that region in the entire captured image data, the gradation conversion processing conditions are determined based on the calculated contribution rate, and the gradation conversion processing is applied to the captured image data based on the determined conditions. Lightness correction according to the contribution rate of, for example, the face region that is the main subject in the shooting scene therefore becomes possible, and highly accurate gradation conversion processing can be performed. Furthermore, the color tendency of the shooting scene is determined, and the low saturation threshold used for extracting low saturation pixels is changed based on the determination result. This prevents many of the low saturation pixels used for calculating the gray balance adjustment conditions from being extracted from one specific hue, so that color failure can be suppressed even in scenes where the empirical rule that the highlight side and the shadow side of an image have low saturation does not hold, for example shooting scenes in which lawn or sky occupies a large proportion of the image.

  According to the inventions described in claims 6, 19, and 32, the gradation conversion processing conditions are determined based on the lightness deviation amount and the contribution rate, with respect to the entire captured image data, of a region composed of a predetermined combination of hue and saturation, and the gradation conversion processing is applied to the captured image data based on the determined conditions. Appropriate lightness correction according to the lightness deviation amount and the contribution rate of a predetermined region in the shooting scene, for example the face region, therefore becomes possible, and highly accurate gradation conversion processing can be performed. Furthermore, the color tendency of the shooting scene is determined, and the low saturation threshold used for extracting low saturation pixels is changed based on the determination result. This prevents many of the low saturation pixels used for calculating the gray balance adjustment conditions from being extracted from one specific hue, so that color failure can be suppressed even in scenes where the empirical rule that the highlight side and the shadow side of an image have low saturation does not hold, for example shooting scenes in which lawn or sky occupies a large proportion of the image.

  According to the inventions described in claims 7, 20, and 33, in the invention described in any one of claims 1 to 3, a two-dimensional histogram of hue values and saturation values is created, and the captured image data is divided into regions composed of predetermined combinations of hue and saturation based on the created two-dimensional histogram, so that the processing can be made more efficient.

  According to the inventions described in claims 8, 21, and 34, in the invention described in any one of claims 4 to 6, a two-dimensional histogram of hue values and saturation values is created, and the captured image data is divided into regions composed of predetermined combinations of hue and saturation and into predetermined hue regions based on the created two-dimensional histogram, so that the processing can be made more efficient.

  According to the inventions described in claims 9, 22, and 35, appropriate lightness correction according to the skin color region, treated as the face region in the shooting scene, becomes possible, and highly accurate gradation conversion processing can be performed.

  According to the inventions described in claims 10, 23, and 36, appropriate lightness correction according to the skin color region, treated as the face region in the shooting scene, becomes possible, and highly accurate gradation conversion processing can be performed. Furthermore, gray balance adjustment according to the distribution of skin color, green, sky blue, and red in the scene becomes possible.

  According to the inventions described in claims 11, 24, and 37, the gradation conversion processing conditions can be determined either by creating a gradation conversion curve each time or by selecting from a plurality of preset gradation conversion curves.

  According to the inventions described in claims 12, 25, and 38, an optimized image can be formed on the output medium without losing information of the captured image.

  According to the inventions described in claims 13, 26, and 39, it is possible to provide optimal viewing image reference data that is appropriately lightness-corrected and free of color failure.

[First Embodiment]
Hereinafter, a first embodiment of the present invention will be described in detail with reference to the drawings.
First, the configuration will be described.

  FIG. 1 is a perspective view showing an external configuration of an image recording apparatus 1 according to an embodiment of the present invention. As shown in FIG. 1, the image recording apparatus 1 is provided with a magazine loading unit 3 for loading a photosensitive material on one side of a housing 2. Inside the housing 2 are provided an exposure processing unit 4 for exposing the photosensitive material, and a print creating unit 5 for developing and drying the exposed photosensitive material to create a print. On the other side surface of the housing 2, a tray 6 for discharging the print created by the print creation unit 5 is provided.

  In addition, a CRT (Cathode Ray Tube) 8 serving as a display device, a film scanner unit 9 serving as a device for reading a transparent original, a reflective original input device 10, and an operation unit 11 are provided on the upper portion of the housing 2. The CRT 8 constitutes display means for displaying on its screen the image represented by the image information to be printed. Further, the housing 2 includes an image reading unit 14 that can read image information recorded on various digital recording media, and an image writing unit 15 that can write (output) image signals to various digital recording media. In addition, a control unit 7 that centrally controls these units is provided inside the housing 2.

  The image reading unit 14 includes a PC card adapter 14a and a floppy (registered trademark) disk adapter 14b, into which a PC card 13a and a floppy (registered trademark) disk 13b can be inserted, respectively. The PC card 13a has, for example, a memory in which a plurality of frame image data captured by a digital camera is recorded. A plurality of frame image data captured by a digital camera is likewise recorded on the floppy (registered trademark) disk 13b. Examples of recording media on which frame image data is recorded, other than the PC card 13a and the floppy (registered trademark) disk 13b, include a multimedia card (registered trademark), a memory stick (registered trademark), MD data, and a CD-ROM.

  The image writing unit 15 is provided with a floppy (registered trademark) disk adapter 15a, an MO adapter 15b, and an optical disk adapter 15c, into which an FD 16a, an MO 16b, and an optical disk 16c can be respectively inserted. Examples of the optical disc 16c include a CD-R and a DVD-R.

  In FIG. 1, the operation unit 11, the CRT 8, the film scanner unit 9, the reflective original input device 10, and the image reading unit 14 are provided integrally with the housing 2, but one or more of them may be provided separately.

  Further, the image recording apparatus 1 shown in FIG. 1 is exemplified as an apparatus that creates prints by exposing a photosensitive material and developing it, but the print creation system is not limited to this; for example, an inkjet method, an electrophotographic method, a thermal method, or a sublimation method may be used.

<Functional Configuration of Image Recording Apparatus 1>
FIG. 2 is a block diagram showing the functional configuration of the image recording apparatus 1. Hereinafter, the functional configuration of the image recording apparatus 1 will be described with reference to FIG. 2.

  The control unit 7 is configured by a microcomputer and controls the operation of each part constituting the image recording apparatus 1 through cooperation between a CPU (Central Processing Unit) (not shown) and various control programs stored in a storage unit (not shown) such as a ROM (Read Only Memory).

  The control unit 7 includes the image processing unit 70 according to the image processing apparatus of the present invention. Based on an input signal (command information) from the operation unit 11, the image processing unit 70 performs image processing on the image signal read by the film scanner unit 9 or the reflective original input device 10, the image signal read by the image reading unit 14, and the image signal input from an external device via the communication means (input) 32, forms image information for exposure, and outputs it to the exposure processing unit 4. Further, the image processing unit 70 performs conversion processing corresponding to the output form on the image signal subjected to the image processing and outputs the result. Output destinations of the image processing unit 70 include the CRT 8, the image writing unit 15, the communication means (output) 33, and the like.

  The exposure processing unit 4 exposes an image onto the photosensitive material and outputs the photosensitive material to the print creating unit 5. The print creating unit 5 develops and dries the exposed photosensitive material to create prints P1, P2, and P3. The print P1 is a service size, high-definition size, or panoramic size print, the print P2 is an A4 size print, and the print P3 is a business card size print.

  The film scanner unit 9 reads a frame image recorded on a transparent original such as a developed negative film N or a reversal film imaged by an analog camera, and acquires a digital image signal of the frame image. The reflective original input device 10 reads an image on a print P (photo print, document, various printed materials) by a flat bed scanner, and acquires a digital image signal.

  The image reading unit 14 reads frame image information recorded on the PC card 13 a or the floppy (registered trademark) disk 13 b and transfers the frame image information to the control unit 7. The image reading unit 14 includes, as the image transfer means 30, a PC card adapter 14a, a floppy (registered trademark) disk adapter 14b, and the like. The image reading unit 14 reads frame image information recorded on the PC card 13a inserted into the PC card adapter 14a or the floppy (registered trademark) disk 13b inserted into the floppy (registered trademark) disk adapter 14b. Transfer to the control unit 7. For example, a PC card reader or a PC card slot is used as the PC card adapter 14a.

  The communication means (input) 32 receives an image signal representing a captured image and a print command signal from another computer in the facility where the image recording apparatus 1 is installed, or a distant computer via the Internet or the like.

  The image writing unit 15 includes, as the image conveying unit 31, a floppy (registered trademark) disk adapter 15a, an MO adapter 15b, and an optical disk adapter 15c. In accordance with a write signal input from the control unit 7, the image writing unit 15 includes a floppy (registered trademark) disk 16a inserted into the floppy (registered trademark) disk adapter 15a, an MO 16b inserted into the MO adapter 15b, The image signal generated by the image processing method according to the present invention is written to the optical disk 16c inserted into the optical disk adapter 15c.

  The data storage unit 71 stores image information and order information corresponding to the image information (information on how many prints are to be created from which frame images, print size information, and the like) and accumulates them sequentially.

  The template storage means 72 stores, in correspondence with the sample identification information D1, D2, and D3, sample image data such as background images and illustration images, together with at least one item of template data for setting a composition area. When a predetermined template is selected, by operator operation, from the plurality of templates stored in advance in the template storage means 72, the frame image information is composited with the selected template, the sample image data selected based on the designated sample identification information D1, D2, D3 is composited with the image data and/or character data based on the order, and a print based on the designated sample is created. This composition using a template is performed by the well-known chroma key method.

  Note that the sample identification information D1, D2, and D3 for designating a print sample is configured to be input from the operation unit 11. Since the sample identification information is recorded on a print sample or an order sheet, it can be read by reading means such as OCR, or it can be input by operator keyboard operation.

  In this way, sample image data is recorded in correspondence with the sample identification information D1 for designating a print sample; the sample identification information D1 is input, sample image data is selected based on the input sample identification information D1, and the selected sample image data is composited with the image data and/or character data based on the order to create a print based on the designated sample. The person requesting print creation can thus actually place an order against a print sample, and the diverse requirements of a wide range of users can be met.

  In addition, first sample identification information D2 designating a first sample and image data of the first sample are stored, second sample identification information D3 designating a second sample and image data of the second sample are stored, and the sample image data selected based on the designated first and second sample identification information D2 and D3 is composited with the image data and/or character data based on the order to create a print based on the designation. A wider variety of images can therefore be composited, and prints meeting an even wider variety of user requirements can be created.

  The operation unit 11 includes the information input means 12. The information input means 12 is configured by a touch panel, for example, and outputs its pressing signals to the control unit 7 as input signals. The operation unit 11 may also be configured to include a keyboard, a mouse, and the like. The CRT 8 displays image information and the like according to display control signals input from the control unit 7.

  The communication means (output) 33 transmits an image signal representing a captured image that has undergone the image processing of the present invention, together with the accompanying order information, to another computer in the facility where the image recording apparatus 1 is installed or to a distant computer via the Internet or the like.

  As shown in FIG. 2, the image recording apparatus 1 includes image input means for capturing image information obtained by dividing and photometrically reading images of various digital media and image originals, image processing means, image output means for displaying or printing the processed image or writing it to an image recording medium, and means for transmitting the image data and accompanying order information to another computer in the facility or to a distant computer via a communication line such as the Internet.

<Internal Configuration of Image Processing Unit 70>
FIG. 3 is a block diagram illustrating a functional configuration of the image processing unit 70. Hereinafter, the image processing unit 70 will be described in detail with reference to FIG.

  As shown in FIG. 3, the image processing unit 70 includes an image adjustment processing unit 701, a film scan data processing unit 702, a reflective original scan data processing unit 703, an image data format decoding processing unit 704, a template processing unit 705, a CRT specific processing unit 706, a printer specific processing unit A707, a printer specific processing unit B708, and an image data creation processing unit 709.

  The film scan data processing unit 702 performs, on the image data input from the film scanner unit 9, processing such as calibration specific to the film scanner unit 9, negative-positive reversal (in the case of a negative original), dust and scratch removal, contrast adjustment, granular noise removal, and sharpness enhancement, and outputs the processed image data to the image adjustment processing unit 701. It also outputs to the image adjustment processing unit 701 the film size, the negative/positive type, information relating to the main subject recorded optically or magnetically on the film, information relating to the shooting conditions (for example, the information content described in APS), and the like.

  The reflective original scan data processing unit 703 performs, on the image data input from the reflective original input device 10, processing such as calibration specific to the reflective original input device 10, negative-positive reversal (in the case of a negative original), dust and scratch removal, contrast adjustment, noise removal, and sharpness enhancement, and outputs the processed image data to the image adjustment processing unit 701.

  The image data format decoding processing unit 704 decompresses the compression code and converts the representation method of the color data, as necessary, according to the data format of the image data input from the image transfer means 30 and/or the communication means (input) 32, converts the data into a data format suitable for computation in the image processing unit 70, and outputs it to the image adjustment processing unit 701. When the size of the output image is designated from any of the operation unit 11, the communication means (input) 32, and the image transfer means 30, the image data format decoding processing unit 704 detects the designated information and outputs it to the image adjustment processing unit 701. Information about the size of the output image designated via the image transfer means 30 is embedded in the header information or tag information of the image data acquired by the image transfer means 30.

  Based on commands from the operation unit 11 or the control unit 7, the image adjustment processing unit 701 applies optimization processing, including the gradation conversion processing A described later, to the image data received from the film scanner unit 9, the reflective original input device 10, the image transfer means 30, the communication means (input) 32, and the template processing unit 705 so as to generate digital image data for output optimized for viewing on the output medium, and outputs the data to the CRT specific processing unit 706, the printer specific processing unit A707, the printer specific processing unit B708, the image data creation processing unit 709, and the data storage unit 71.

  In the optimization processing, for example, when display on a CRT display monitor compliant with the sRGB standard is assumed, processing is performed so that optimum color reproduction is obtained within the color gamut of the sRGB standard; when output to silver halide photographic paper is assumed, processing is performed so that optimum color reproduction is obtained within the color gamut of silver halide photographic paper. In addition to color gamut compression, gradation compression from 16 bits to 8 bits, reduction of the number of output pixels, and processing to handle the output characteristics (LUT) of the output device are also included. Needless to say, image processing such as noise suppression, sharpening, saturation adjustment, and dodging processing is also performed.

  The template processing unit 705 reads out predetermined image data (template) from the template storage unit 72 based on a command from the image adjustment processing unit 701, and performs template processing for combining the image data to be processed with the template. The image data after the template processing is output to the image adjustment processing unit 701.

  The CRT specific processing unit 706 performs processing such as changing the number of pixels and color matching on the image data input from the image adjustment processing unit 701 as necessary, combines the result with information that needs to be displayed, such as control information, and outputs the resulting display image data to the CRT 8.

  The printer-specific processing unit A707 performs printer-specific calibration processing, color matching, pixel number change processing, and the like as necessary, and outputs processed image data to the exposure processing unit 4.

  When an external printer 51 such as a large-format ink jet printer can be connected to the image recording apparatus 1 of the present invention, a printer specific processing unit B708 is provided for each printer apparatus to be connected. The printer-specific processing unit B708 performs printer-specific calibration processing, color matching, pixel number change, and the like, and outputs processed image data to the external printer 51.

  The image data format creation processing unit 709 converts the image data input from the image adjustment processing unit 701 into various general-purpose image formats typified by JPEG, TIFF, Exif, and the like as necessary. The completed image data is output to the image transport unit 31 and the communication means (output) 33.

  The divisions into the film scan data processing unit 702, the reflective original scan data processing unit 703, the image data format decoding processing unit 704, the image adjustment processing unit 701, the CRT specific processing unit 706, the printer specific processing unit A707, the printer specific processing unit B708, and the image data creation processing unit 709 shown in FIG. 3 are provided to assist understanding of the functions of the image processing unit 70, and these units are not necessarily realized as physically independent devices; for example, they may be realized as types of software processing performed by a single CPU.

Next, the operation of the present invention will be described.
FIG. 4 is a flowchart showing the gradation conversion processing A executed by the image adjustment processing unit 701. This processing is realized by software processing in cooperation between the CPU and a gradation conversion processing A program stored in a storage unit (not shown) such as a ROM, and is started when image data (an image signal) is input from the film scan data processing unit 702, the reflective original scan data processing unit 703, or the image data format decoding processing unit 704. By executing this gradation conversion processing A, the data acquisition means, HS division means, lightness deviation amount calculation means, gradation conversion processing condition determination means, and gradation conversion processing means according to claims 14, 16, 17, 19, 27, 29, 30, and 32 of the present invention, and the two-dimensional histogram creation means according to claims 20 and 33, are realized.
Hereinafter, the gradation conversion processing A will be described with reference to FIG.

  When image data is input from the film scan data processing unit 702, the reflective original scan data processing unit 703, or the image data format decoding processing unit 704, the input image data is converted from the RGB color system into a color system such as L*a*b* or HSV, and the hue value, saturation value, and lightness value of each pixel of the input image data are acquired and stored in a RAM (not shown) (step S1).

Hereinafter, specific examples of calculation formulas for acquiring the hue value, the saturation value, and the brightness value from the RGB values of each pixel of the input image data will be shown.
First, an example of obtaining a hue value, a saturation value, and a lightness value by converting from RGB to the HSV color system will be described in detail with reference to [Expression 1]. This conversion program is hereinafter referred to as an HSV conversion program. The HSV color system is devised based on the color system proposed by Munsell, which expresses colors with three elements of hue, saturation, and value (brightness).

The values of the digital image data that is the input image data are defined as InR, InG, and InB. The calculated hue value is defined as OutH with a scale of 0 to 360, and the saturation value and lightness value are defined as OutS and OutV, respectively, each with a scale of 0 to 255.
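
Since [Expression 1] itself is not reproduced here, the following is a minimal sketch of such a conversion using the common hexcone formula; it yields OutH on a 0 to 360 scale and OutS, OutV on a 0 to 255 scale, but it is not necessarily identical to the HSV conversion program of this specification.

def hsv_convert(in_r, in_g, in_b):
    # InR, InG, InB: 0-255 values of one pixel of the input image data
    mx = max(in_r, in_g, in_b)
    mn = min(in_r, in_g, in_b)
    out_v = mx                                        # lightness (value), 0-255
    out_s = 0 if mx == 0 else 255 * (mx - mn) // mx   # saturation, 0-255
    if mx == mn:
        out_h = 0                                     # achromatic: hue undefined, use 0
    elif mx == in_r:
        out_h = (60 * (in_g - in_b) / (mx - mn)) % 360
    elif mx == in_g:
        out_h = 60 * (in_b - in_r) / (mx - mn) + 120
    else:
        out_h = 60 * (in_r - in_g) / (mx - mn) + 240
    return out_h, out_s, out_v                        # OutH 0-360, OutS/OutV 0-255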

  Any color system, such as L*a*b*, L*u*v*, Hunter L*a*b*, YCC, YUV, or YIQ, may be used instead of HSV, but it is desirable to use HSV, from which the hue value, saturation value, and lightness value can be obtained directly.

As a reference example using a color system other than HSV, an example using L * a * b * will be described below.
The L*a*b* color system (CIE 1976) is one of the uniform color spaces established by the CIE (International Commission on Illumination) in 1976. The tristimulus values are obtained from the RGB values by [Equation 2] defined in IEC 61966-2-1, and the L*a*b* values are then obtained by applying [Equation 3] defined in JIS Z8729. From the obtained a* and b*, the hue value (H') and the saturation value (S') are obtained by [Equation 4]. Note that the hue value (H') and saturation value (S') obtained here differ from the hue value (H) and saturation value (S) of the HSV color system described above.

The above [Equation 2] shows the conversion from 8-bit input image data (RsRGB(8), GsRGB(8), BsRGB(8)) to the tristimulus values (X, Y, Z) of the color matching functions. Here, a color matching function is a function indicating the spectral sensitivity distribution of the human eye. The subscript sRGB of the input image data (RsRGB(8), GsRGB(8), BsRGB(8)) in [Equation 2] indicates that the RGB values of the input image data conform to the sRGB standard, and (8) indicates 8-bit (0 to 255) image data.

Further, the above [Equation 3] shows the conversion from the tristimulus values (X, Y, Z) to L*a*b*. In [Equation 3], Xn, Yn, and Zn denote the X, Y, and Z stimulus values of the standard white plate, here the stimulus values obtained when the standard white plate is illuminated with D65 light having a color temperature of 6500 K. In [Equation 3], Xn = 0.95, Yn = 1.00, and Zn = 1.09.
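
As a reference, the following sketch chains the commonly published sRGB-to-XYZ and XYZ-to-L*a*b* definitions and then derives the hue value (H') and saturation value (S') from a* and b*; the matrix coefficients and white point handling follow the usual published forms and are not copied from [Equation 2] to [Equation 4] themselves.

import math

def srgb_to_lab_hue_sat(r8, g8, b8, white=(0.95, 1.00, 1.09)):
    # r8, g8, b8: 8-bit sRGB values of one pixel; white: (Xn, Yn, Zn)
    def lin(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = lin(r8), lin(g8), lin(b8)
    # sRGB (D65) to XYZ, standard matrix coefficients
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    def f(t):
        return t ** (1.0 / 3.0) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / white[0]), f(y / white[1]), f(z / white[2])
    l_star = 116 * fy - 16
    a_star = 500 * (fx - fy)
    b_star = 200 * (fy - fz)
    hue = math.degrees(math.atan2(b_star, a_star)) % 360   # H'
    sat = math.hypot(a_star, b_star)                        # S' (chroma)
    return l_star, a_star, b_star, hue, sat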

  When the hue value and saturation value of each pixel of the input image data have been acquired, a two-dimensional histogram showing the cumulative frequency distribution of the pixels in the coordinate plane with the hue value (H) on the X axis and the saturation value (S) on the Y axis is created (step S2).

  FIG. 5 shows an example of a two-dimensional histogram. The two-dimensional histogram shown in FIG. 5 is represented by grid points, each holding a value of the cumulative frequency distribution of pixels in the coordinate plane with the hue value (H) on the X axis and the saturation value (S) on the Y axis. The grid points at the edges of the coordinate plane hold the cumulative frequency of the pixels distributed in a range of 18 in hue value (H) and about 13 in saturation value (S), and the remaining grid points hold the cumulative frequency of the pixels distributed in a range of 36 in hue value (H) and about 25 in saturation value (S). For example, the region A represents a green hue region with hue values (H) of 70 to 184.

  Next, based on the created two-dimensional histogram, the input image data is divided into regions composed of predetermined combinations of hue and saturation (step S3). Specifically, the created two-dimensional histogram is divided into at least four planes with at least one predefined hue value and one predefined saturation value as boundaries, whereby the input image data is divided into regions composed of predetermined combinations of hue and saturation. In the present invention, it is desirable to divide the histogram into four planes using at least one hue value and one saturation value. The hue value used as the boundary is preferably defined as 70 as calculated by the HSV conversion program, and the saturation value used as the boundary is preferably defined as 128 as calculated by the HSV conversion program. In the present embodiment as well, the two-dimensional histogram (input image data) is divided at a hue value of 70 and a saturation value of 128. As a result, the two-dimensional histogram (input image data) can be divided so as to include at least a skin color region (hue values 0 to 69, saturation values 0 to 128) that contains skin color.
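
A minimal sketch of steps S2 and S3 might look as follows; the bin counts and the names given to three of the four planes are assumptions for illustration, since only the skin color region (hue 0 to 69, saturation 0 to 128) is named in the text.

import numpy as np

def hs_histogram_and_regions(hue, sat, hue_boundary=70, sat_boundary=128):
    # hue: per-pixel hue values (0-360); sat: per-pixel saturation values (0-255)
    hue = np.asarray(hue, dtype=np.float64).ravel()
    sat = np.asarray(sat, dtype=np.float64).ravel()
    # Cumulative frequency distribution of pixels on the hue/saturation plane
    # (20 x 20 bins gives widths of 18 in hue and about 13 in saturation)
    hist2d, h_edges, s_edges = np.histogram2d(
        hue, sat, bins=[20, 20], range=[[0, 360], [0, 256]])
    # Four planes bounded by one hue value (70) and one saturation value (128);
    # only the "skin" plane is named in the text, the other labels are placeholders
    low_sat = sat <= sat_boundary
    warm = hue < hue_boundary
    regions = {
        "skin": warm & low_sat,             # hue 0-69, saturation 0-128
        "other_low_sat": ~warm & low_sat,
        "warm_high_sat": warm & ~low_sat,
        "other_high_sat": ~warm & ~low_sat,
    }
    return hist2d, regions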

  The range of the skin color region was determined from the result of examining, for about 1000 film-scanned images, the range in which the detection rate of human skin color is highest.

When the input image data has been divided into regions composed of predetermined combinations of hue and saturation, the lightness deviation amount, with respect to the entire image, of a predetermined divided region, that is, the aforementioned skin color region, is calculated (step S4). The lightness deviation amount is calculated as follows.
First, the average lightness value (A) of the skin color region is calculated from the cumulative frequency (the total number of pixels) distributed in the skin color region and the sum of the lightness values of all the pixels in the skin color region. Next, the maximum lightness (B) and the minimum lightness (C) of the entire image are acquired. The lightness deviation amount (D) of the skin color region with respect to the entire image is then calculated by the following [Formula 1].
[Formula 1]
D = (A - B) / (C - B)

  When the lightness deviation amount is greater than 0.5, the average lightness value of the skin color region is lower than the average lightness value of the entire image; when it is less than 0.5, the average lightness value of the skin color region is higher than the average lightness value of the entire image; and when it is 0.5, the average lightness value of the skin color region is equal to the average lightness value of the entire image.
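
Assuming [Formula 1] reads D = (A - B) / (C - B) with A, B, and C as defined above, a sketch of step S4 is:

import numpy as np

def lightness_deviation(lightness, skin_mask):
    # lightness: per-pixel lightness values of the whole image
    # skin_mask: boolean mask of the skin color (face) region
    v = np.asarray(lightness, dtype=np.float64)
    a = v[skin_mask].mean()      # A: average lightness of the skin color region
    b = v.max()                  # B: maximum lightness of the entire image
    c = v.min()                  # C: minimum lightness of the entire image
    return (a - b) / (c - b)     # D > 0.5: face on the dark side, D < 0.5: bright side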

  When the brightness deviation amount is calculated, gradation conversion processing conditions are determined based on the calculated brightness deviation amount (step S5).

  Here, the gradation conversion processing in the present invention is intended to be processing that adjusts the lightness of a specific region to be corrected in the shooting scene, for example the face region, to an appropriate value. In general, the average lightness value of the entire image is used as the index for determining the target value after gradation conversion that is needed when performing gradation conversion processing. In backlit scenes or flash shooting scenes, however, extremely bright areas and dark areas are mixed in the image, and the lightness of the face region, which is the important subject, is biased toward either the bright side or the dark side. It is therefore desirable to adjust according to the lightness deviation amount of the face region.

  Various methods are known for extracting the face region. In the present invention, the above-described skin color region (hue values 0 to 69 and saturation values 0 to 128 as calculated by the HSV conversion program) is extracted as the face region. The lightness deviation amount obtained in step S4 is the lightness deviation amount of the face region with respect to the entire image.

  In addition to the above method of extracting the skin color region, it is desirable to separately perform image processing for extracting the face region in order to improve the extraction accuracy. Any known processing may be used as the image processing for extracting the face region; one example is the simple region expansion method. In the simple region expansion method, when pixels meeting the skin color definition (skin color pixels) have been extracted discretely, the difference between each surrounding pixel and a discretely extracted skin color pixel is obtained, and if the difference is smaller than a predetermined threshold value, the surrounding pixel is also judged to be a skin color pixel; the region is expanded step by step in this way so that it can be extracted as the face region, as sketched below. It is also possible to extract the face region from the skin color region using a learning function based on a neural network.
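
A minimal sketch of such region expansion, assuming the comparison is made on a single per-pixel value (for example lightness) and using an illustrative threshold, is:

import numpy as np
from collections import deque

def grow_face_region(value, skin_mask, diff_threshold=10.0):
    # value: per-pixel values used for the comparison (e.g. lightness), shape (H, W)
    # skin_mask: boolean mask of discretely extracted skin color pixels, shape (H, W)
    h, w = value.shape
    face = skin_mask.copy()
    queue = deque(zip(*np.nonzero(skin_mask)))
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not face[ny, nx]:
                # Absorb the neighbour if it differs from the already accepted
                # skin color pixel by less than the threshold
                if abs(float(value[ny, nx]) - float(value[y, x])) < diff_threshold:
                    face[ny, nx] = True
                    queue.append((ny, nx))
    return face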

Determining the gradation conversion processing conditions based on the calculated lightness deviation amount means determining the gradation conversion curve to be applied to the input image data based on the calculated lightness deviation amount.
The gradation conversion curve may be determined by creating a curve each time based on the calculated lightness deviation amount, or a plurality of gradation conversion curves may be prepared in advance and a curve selected and applied according to the lightness deviation amount.

  FIG. 6 illustrates an example in which a gradation conversion curve is selected and applied according to the lightness deviation amount. In FIG. 6, the gradation conversion curve L1 is selected when the lightness deviation amount (D) satisfies 0.65 ≤ D < 0.75, L2 when 0.55 ≤ D < 0.65, L3 when 0.45 ≤ D < 0.55, L4 when 0.35 ≤ D < 0.45, and L5 when 0.25 ≤ D < 0.35.
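
A sketch of this selection, with the curves L1 to L5 assumed to be given as lookup functions and with an assumed fallback outside the illustrated range, is:

def select_gradation_curve(d, curves):
    # curves: dict such as {"L1": f1, ..., "L5": f5}; each f maps an input level
    # to an output level (the shapes of the curves themselves are not shown here)
    if 0.65 <= d < 0.75:
        return curves["L1"]
    if 0.55 <= d < 0.65:
        return curves["L2"]
    if 0.45 <= d < 0.55:
        return curves["L3"]
    if 0.35 <= d < 0.45:
        return curves["L4"]
    if 0.25 <= d < 0.35:
        return curves["L5"]
    return curves["L3"]   # fallback for D outside the range shown in FIG. 6 (assumption)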

  When the gradation conversion curve has been determined, it is applied to the input image data to perform the gradation conversion processing (step S6), and this processing ends.

  As described above, according to the image recording apparatus 1, the hue value and the saturation value of the input image data are acquired, a two-dimensional histogram showing the cumulative frequency distribution of the pixels is created in the coordinate plane with the hue value (H) on the x axis and the saturation value (S) on the y axis, the two-dimensional histogram is divided into regions composed of predetermined combinations of hue and saturation, the lightness deviation amount with respect to the entire image of a predetermined divided region, that is, the skin color region treated as the face region, is calculated, a gradation conversion curve is determined based on the calculated lightness deviation amount, and the determined gradation conversion curve is applied to the input image data to perform the gradation conversion processing. Therefore, highly accurate gradation conversion processing with high skin color lightness correction accuracy is possible. This is particularly effective when, as in backlit scenes and close-range flash photography, extremely bright areas and dark areas are mixed in the image and the lightness of the face region, which is the important subject, is biased toward either the bright side or the dark side.

[Second Embodiment]
Next, a second embodiment of the present invention will be described.
The configuration of the image recording apparatus 1 according to the present embodiment is the same as that shown in FIG.

The operation of the second embodiment will be described below.
FIG. 7 is a flowchart showing the gradation conversion processing B executed by the image adjustment processing unit 701. This processing is realized by software processing in cooperation between the CPU and a gradation conversion processing B program stored in a storage unit (not shown) such as a ROM, and is started when image data (an image signal) is input from the film scan data processing unit 702, the reflective original scan data processing unit 703, or the image data format decoding processing unit 704. By executing this gradation conversion processing B, the data acquisition means, HS division means, HS occupancy rate calculation means, contribution rate calculation means, gradation conversion processing condition determination means, and gradation conversion processing means according to claims 15, 16, 18, 19, 28, 29, 31, and 32 of the present invention, and the two-dimensional histogram creation means according to claims 20 and 33, are realized.

Hereinafter, the gradation conversion process B will be described with reference to FIG.
When image data is input from the film scan data processing unit 702, the reflective original scan data processing unit 703, or the image data format decoding processing unit 704, the input image data is converted from the RGB color system into a color system such as L*a*b* or HSV, and the hue value and saturation value of each pixel of the input image data are calculated and stored in a RAM (not shown) (step S11). As the calculation formulas for calculating the hue value and the saturation value from the RGB values of each pixel, for example, the HSV conversion program and [Equation 2] to [Equation 4] described in the first embodiment are used.

  When the hue value and saturation value of each pixel of the image data have been calculated, a two-dimensional histogram showing the cumulative frequency distribution of the pixels in the coordinate plane with the hue value (H) on the X axis and the saturation value (S) on the Y axis is created (step S12). The two-dimensional histogram created here is, for example, the same as that described with reference to FIG. 5.

  Next, based on the created two-dimensional histogram, the input image data is divided into regions composed of predetermined combinations of hue and saturation (step S13). Specifically, the created two-dimensional histogram is divided into at least four planes with at least one predefined hue value and one predefined saturation value as boundaries, whereby the input image data is divided into regions composed of predetermined combinations of hue and saturation. In the present invention, it is desirable to divide the histogram into four planes using at least one hue value and one saturation value. The hue value used as the boundary is preferably defined as 70 as calculated by the HSV conversion program, and the saturation value used as the boundary is preferably defined as 128 as calculated by the HSV conversion program. In the present embodiment as well, the two-dimensional histogram (input image data) is divided at a hue value of 70 and a saturation value of 128. As a result, the two-dimensional histogram (input image data) can be divided so as to include at least a skin color region (hue values 0 to 69, saturation values 0 to 128) that contains skin color.

  The range of the skin color region was determined from the result of examining, for about 1000 film-scanned images, the range in which the detection rate of human skin color is highest.

  When the input image data has been divided into regions composed of predetermined combinations of hue and saturation, the ratio of the pixels of a predetermined divided region, that is, the aforementioned skin color region, to the entire screen of the input image data, in other words the occupancy rate of the skin color region, is calculated by dividing the total cumulative frequency in the skin color region by the total number of pixels of the input image data (step S14).

When the occupancy rate of the predetermined divided region has been calculated, the contribution rate of that region, that is, the skin color region, to the gradation conversion processing is calculated based on the calculated occupancy rate (step S15). The contribution rate (Rsk) of the skin color region is obtained from the occupancy rate (e) of the skin color region by the following [Formula 2].
[Formula 2]
Rsk = e

  Next, tone conversion processing conditions are determined based on the calculated contribution rate (step S16).

  Here, the gradation conversion processing referred to in the present invention is intended to be processing that adjusts the lightness of a specific region to be corrected in the shooting scene, for example the face region as an important subject, to an appropriate value. However, the importance of the face region as the main subject naturally differs between the case where the face region is small and the case where it is large; when the face region is small, correction is performed so that subjects other than the face region (for example, the landscape) have appropriate lightness. Therefore, when the occupancy rate of the face region, that is, its contribution rate, is high, the average lightness input value, which is the index for determining the target value after gradation conversion needed for the gradation conversion processing, is calculated mainly from the average lightness value of the face region; when the contribution rate of the face region is low, the average lightness input value is calculated mainly from the regions other than the face region. In other words, the average lightness input value is calculated according to the contribution rate, and the gradation conversion conditions are determined based on the calculated average lightness input value.

  Various methods are known for extracting the face region. In the present invention, the above-described skin color region (hue values 0 to 69 and saturation values 0 to 128 as calculated by the HSV conversion program) is extracted as the face region.

  In addition to the above method of extracting the skin color region, it is desirable to separately perform image processing for extracting the face region in order to improve the extraction accuracy. Any known processing may be used as the image processing for extracting the face region; one example is the simple region expansion method. In the simple region expansion method, when pixels meeting the skin color definition (skin color pixels) have been extracted discretely, the difference between each surrounding pixel and a discretely extracted skin color pixel is obtained, and if the difference is smaller than a predetermined threshold value, the surrounding pixel is also judged to be a skin color pixel; the region is expanded step by step in this way so that it can be extracted as the face region. It is also possible to extract the face region from the skin color region using a learning function based on a neural network.

Determining the gradation conversion processing conditions means determining the gradation conversion curve to be applied to the input image data based on the calculated contribution rate of the skin color region, that is, the face region. Specifically, the average lightness value of the face region and the average lightness value of the entire image are calculated, the average lightness input value for the gradation conversion processing is calculated based on the contribution rate of the face region calculated in step S15, the average lightness value of the face region, and the average lightness value of the entire image, and the gradation conversion curve is determined so that the average lightness input value is converted into a preset average lightness value conversion target value, as shown in FIG. 8. The average lightness input value (c) is obtained by the following [Formula 3], where a is the average lightness value of the entire image, b is the average lightness value of the face region, and Rsk is the contribution rate of the face region.
[Formula 3]
c = a × (1-(Rsk × 0.01)) + (b × Rsk × 0.01)

  In FIG. 8, C1 to C5 indicate average lightness input values. The gradation conversion curve is adjusted and determined so that the average lightness input value is converted into the predetermined average lightness value conversion target value. The gradation conversion curve may be determined by creating a curve each time based on the calculated average lightness input value, or a plurality of gradation conversion curves may be prepared in advance and a curve selected and applied according to the average lightness input value.
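
A sketch of [Formula 3], treating Rsk as a percentage, is shown below; the numeric usage example is hypothetical and only illustrates the weighting.

def average_lightness_input(a, b, rsk):
    # a:   average lightness value of the entire image
    # b:   average lightness value of the face region
    # rsk: contribution rate of the face region, in percent (here Rsk = e)
    return a * (1.0 - rsk * 0.01) + b * rsk * 0.01

# Example: with a = 120, b = 80 and Rsk = 25 (%), the input value c becomes
# 120 * 0.75 + 80 * 0.25 = 110; the gradation conversion curve is then chosen so
# that c maps to the preset average lightness value conversion target value.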

  When the gradation conversion curve is adjusted, the gradation conversion curve is applied to the input image data, the gradation conversion process is performed (step S17), and this process ends.

  As described above, according to the image recording apparatus 1, the hue value and the saturation value of the input image data are acquired, a two-dimensional histogram showing the cumulative frequency distribution of the pixels is created in the coordinate plane with the hue value (H) on the x axis and the saturation value (S) on the y axis, the two-dimensional histogram is divided into regions composed of predetermined combinations of hue and saturation, the occupancy rate of a predetermined divided region, that is, the skin color region treated as the face region, is calculated, the contribution rate of the face region to the gradation conversion processing is calculated based on the occupancy rate, a gradation conversion curve is determined based on the calculated contribution rate, and the determined gradation conversion curve is applied to the input image data to perform the gradation conversion processing. Therefore, highly accurate gradation conversion processing with lightness correction accuracy according to the importance of the face region in the shooting scene is possible.

  Note that the gradation conversion processing A and the gradation conversion processing B described in the first and second embodiments can also be used in combination to further improve the lightness correction accuracy of the specific region, that is, the face region. Specifically, steps S1 to S4 of the gradation conversion processing A are executed to divide the input image data into regions composed of predetermined combinations of hue and saturation and to calculate the lightness deviation amount, with respect to the entire image, of a predetermined divided region, that is, the face region; the contribution rate of the predetermined divided region, that is, the face region, to the gradation conversion processing is also calculated; and the gradation conversion curve is determined based on both the calculated lightness deviation amount and the contribution rate. In this way, the lightness correction accuracy of the face region can be further improved.

  Further, in addition to performing the gradation conversion processing on the input image data by either of the gradation conversion processings A and B or by their combination, it is preferable to apply gray balance adjustment that takes the color tendency of the shooting scene into consideration.

  FIG. 9 is a flowchart showing the gray balance adjustment processing executed by the image adjustment processing unit 701. This processing is realized by software processing in cooperation between the CPU and a gray balance adjustment processing program stored in a storage unit (not shown) such as a ROM, and is started when image data (an image signal) is input from the film scan data processing unit 702, the reflective original scan data processing unit 703, or the image data format decoding processing unit 704. By executing this gray balance adjustment processing, the hue region division means, hue region occupancy rate calculation means, low saturation threshold calculation means, low saturation pixel extraction means, gray balance adjustment condition calculation means, and gray balance adjustment means according to claims 17 to 19 and 30 to 32 of the present invention, and the two-dimensional histogram creation means according to claims 21 and 34, are realized.

  Hereinafter, the gray balance adjustment processing will be described with reference to FIG. 9. In the present embodiment, the gray balance adjustment processing is performed after the gradation conversion processing described above; therefore, the description assumes that the acquisition of the hue value and saturation value of each pixel of the input image data and the creation of the two-dimensional histogram of hue values and saturation values have already been performed.

  First, the input image data is divided into predetermined hue regions based on the two-dimensional histogram of hue values and saturation values (step S21). Specifically, the created two-dimensional histogram is divided into two or more planes with at least one predefined hue value as a boundary, whereby the input image data is divided into predetermined hue regions. In the present invention, it is desirable to divide the input image data into four planes based on at least three hue values, and it is desirable to define the boundary hue values as 70, 185, and 225 as calculated by the HSV conversion program. The boundary values for dividing the hue regions were calculated from the result of examining, for about 1000 film-scanned images, the hue ranges in which the detection rates of human skin color, plant green, and sky color are highest. By defining the boundaries in this way, as shown in FIG. 10, the two-dimensional histogram (input image data) can be divided into four hue regions: a skin hue region (hue values 0 to 69), a green hue region (hue values 70 to 184), a sky hue region (hue values 185 to 224), and a red hue region (hue values 225 to 360). In the present embodiment as well, the two-dimensional histogram (input image data) is divided into four hue regions at hue values of 70, 185, and 225.

  Next, for each of the hue regions divided in step S21, the ratio of the pixels of that hue region to the entire screen of the input image data, that is, the occupancy rate of each hue region, is calculated by dividing the total cumulative frequency in that region by the total number of pixels of the input image data (step S22).
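
A sketch of steps S21 and S22 under the four-region division described above is:

import numpy as np

def hue_region_occupancy(hue):
    # hue: per-pixel hue values (0-360) of the input image data
    hue = np.asarray(hue, dtype=np.float64).ravel()
    regions = {"skin": (0, 70), "green": (70, 185), "sky": (185, 225), "red": (225, 361)}
    total = hue.size
    # Occupancy rate of each hue region in percent of the total pixel count
    return {name: 100.0 * np.count_nonzero((hue >= lo) & (hue < hi)) / total
            for name, (lo, hi) in regions.items()}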

For example, suppose the input image data has a total of 5 million pixels and the shooting scene is one in which a woman wearing a red cardigan, sitting on a lawn, occupies a relatively large portion of the image (referred to as image data α). When steps S21 and S22 described above are executed, the occupancy rates take the values shown in [Table 1] below.

  Next, a low saturation threshold for each hue region is calculated according to the occupancy rate of each hue region obtained in step S22 (step S23). In this step, a saturation threshold for extracting the low saturation pixels used in the gray balance adjustment condition calculation processing described later (hereinafter referred to as the "low saturation threshold") is set for each hue region according to the occupancy rate of that hue region.

A relational expression between the occupancy rate (RC) of each hue region and the low saturation threshold (LC) is illustrated in [Formula 3] below. Note that the relational expression between the occupancy rate (RC) and the low saturation threshold (LC) of each hue region is not limited to this.
[Formula 3]
LC(S) = 30 - ((RC(%) / 100)^0.25 × 30)

When the low saturation threshold for each hue region of the image data α described above is obtained in step S23, the results are as shown in [Table 2] below.
FIG. 10 shows, as the boundary line B, the low saturation thresholds obtained for the image data α plotted on the two-dimensional histogram of hue values and saturation values.

  Next, for each hue region, the pixels used for gray balance adjustment are extracted based on the low saturation threshold calculated in step S23 (step S24). That is, the low saturation pixels to be used in the gray balance adjustment condition calculation process described later are determined using the low saturation threshold calculated for each hue region, and intermediate image data composed only of these low saturation pixels (with the remaining pixels masked with a value of 0 or 255) is generated.
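  A minimal sketch of step S24 follows; it is hypothetical Python/NumPy code, and the helper name, the mask value, and the threshold data structure are assumptions. Pixels whose saturation falls below their hue region's low saturation threshold are kept, and all remaining pixels are masked.

import numpy as np

def make_intermediate_image(rgb, hue, sat, thresholds, mask_value=255):
    # thresholds: {(hue_lo, hue_hi): low_saturation_threshold} per hue region
    keep = np.zeros(hue.shape, dtype=bool)
    for (lo, hi), lc in thresholds.items():
        keep |= (hue >= lo) & (hue < hi) & (sat < lc)
    out = np.full_like(rgb, mask_value)   # pixels other than the kept ones are masked
    out[keep] = rgb[keep]                 # low-saturation pixels survive unchanged
    return out, keep                      # intermediate image data and the pixel mask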

  By executing steps S21 to S24 described above, it is possible to prevent hue bias in the gray balance adjustment caused by extracting many low-saturation pixels from a region that occupies a large proportion of the image, such as lawn or sky.

  In this process, a relational expression between the occupation ratio of each hue region and an extraction rate of low saturation pixels (a "low saturation pixel extraction rate") may also be defined in advance in addition to the low saturation threshold, and the low saturation pixels may then be extracted for each hue region based on both the low saturation threshold and the low saturation pixel extraction rate obtained from its occupation ratio.

  Next, the gray balance adjustment condition calculation process is executed with the pixels extracted in step S24 as the target pixels (step S25).

  FIG. 11 is a flowchart showing the gray balance adjustment condition calculation process executed by the image adjustment processing unit 701 in step S25. Hereinafter, the gray balance adjustment condition calculation processing will be described with reference to the flowchart of FIG. 11 and FIGS. 12 to 14.

  First, a cumulative histogram of each channel of R, G, and B is created for the target pixel for gray balance adjustment condition calculation (step S31). That is, the cumulative histogram shown in FIG. 12 is created for each of the R, G, and B channels.

  Next, as shown in FIG. 12, in the cumulative histogram of each of the R, G, and B channels, pixels are accumulated from the highlight side (255 for 8 bits) toward the shadow side (0) until the cumulative frequency reaches a predetermined percentage of the total number of low saturation pixels; the average value of the accumulated pixels is calculated and determined as the highlight point (Hp) of each channel (step S32). Similarly, as shown in FIG. 12, pixels are accumulated from the shadow side (0) toward the highlight side (255 for 8 bits) until the cumulative frequency reaches a predetermined percentage of the total number of low saturation pixels; the average value of the accumulated pixels is calculated and determined as the shadow point (Sp) of each channel (step S33).
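  The determination of Hp and Sp in steps S31 to S33 can be pictured with the following sketch; it is hypothetical Python/NumPy code, and the 5% figure merely stands in for the unspecified "predetermined percentage". Accumulating pixels from one end of the cumulative histogram until that percentage is reached and averaging them is equivalent to averaging the top or bottom slice of the sorted channel values.

import numpy as np

def highlight_and_shadow_points(channel_values, percent=5.0):
    # channel_values: 8-bit values of one channel (R, G or B) of the
    # low-saturation target pixels
    values = np.sort(np.asarray(channel_values, dtype=float).ravel())
    n = max(1, int(round(len(values) * percent / 100.0)))
    hp = values[-n:].mean()   # accumulate from the highlight (255) side
    sp = values[:n].mean()    # accumulate from the shadow (0) side
    return hp, sp

# applied once per channel, giving (HpR, SpR), (HpG, SpG) and (HpB, SpB)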

  Next, END (equivalent neutral density) conversion pixels are extracted based on the determined highlight point Hp and shadow point Sp of each channel (step S34). Specifically, BG and RG are calculated for all the target pixels, and a chromaticity diagram with BG on the X axis and RG on the Y axis is created; the highlight point Hp and the shadow point Sp are plotted on the chromaticity diagram (in FIG. 13, point A is Hp and point B is Sp), and the pixels distributed within a predetermined distance from the straight line connecting the two points (the pixels in region Q in FIG. 13) are extracted as END conversion pixels. This step rests on the empirical rule that the highlight point Hp and the shadow point Sp have low saturation, so the low saturation pixels distributed near that line on the chromaticity diagram are extracted as END conversion pixels. In the present embodiment, however, the low saturation pixels extracted based on the occupation ratio of each hue in the entire image are used as the target pixels, so even for a shooting scene in which the above empirical rule does not hold, such as one in which a particular hue has a high occupation ratio, an adjustment condition capable of performing gray balance adjustment without hue deviation can be set.
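  The following sketch illustrates step S34; it is hypothetical Python/NumPy code that assumes "BG" and "RG" denote the B-G and R-G differences and that the "predetermined distance" is a fixed constant. Each target pixel and the points Hp and Sp are placed on the (BG, RG) plane, and pixels within the distance of the straight line through Hp and Sp are taken as END conversion pixels.

import numpy as np

def extract_end_pixels(r, g, b, hp_rgb, sp_rgb, max_dist=5.0):
    # chromaticity coordinates of every target pixel (assumed: B-G on X, R-G on Y)
    x = b.astype(float) - g.astype(float)
    y = r.astype(float) - g.astype(float)
    ax, ay = hp_rgb[2] - hp_rgb[1], hp_rgb[0] - hp_rgb[1]   # point A = Hp
    bx, by = sp_rgb[2] - sp_rgb[1], sp_rgb[0] - sp_rgb[1]   # point B = Sp
    dx, dy = bx - ax, by - ay
    # perpendicular distance of each pixel from the straight line through A and B
    # (assumes Hp and Sp do not coincide on the chromaticity plane)
    dist = np.abs(dy * (x - ax) - dx * (y - ay)) / np.hypot(dx, dy)
    return dist <= max_dist   # mask of the pixels in region Q (END conversion pixels)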

  When the END conversion pixels have been extracted, as shown in FIG. 14, a correlation diagram with x axis = G value and y axis = R value and a correlation diagram with x axis = G value and y axis = B value are created, and the extracted END conversion pixel values are plotted on each correlation diagram (step S35); the least squares approximation line l is then set (step S36). Then, an END conversion formula that corrects the set least squares approximation line l to slope = 1 and intercept = 0 is calculated (step S37), and the process proceeds to step S26 in FIG. 9.
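  Steps S35 to S37 and the application of the END conversion formula in step S26 can be sketched as follows; this is hypothetical Python/NumPy code, and the function name and the commented usage are assumptions. A least squares line is fitted to the END conversion pixels in each correlation diagram, and the channel is then transformed so that the fitted line becomes slope = 1 and intercept = 0.

import numpy as np

def end_conversion_formula(g, channel):
    # least squares approximation line l through the END conversion pixel values
    slope, intercept = np.polyfit(np.asarray(g, float), np.asarray(channel, float), deg=1)
    # END conversion that maps the fitted line onto slope = 1, intercept = 0
    return lambda v: (np.asarray(v, float) - intercept) / slope

# sketch of step S26: derive the formula from the END conversion pixels and apply
# it to the R and B channels of the tone-converted image so that neutral pixels
# come out with R = G = B, e.g.
#   to_r = end_conversion_formula(end_g, end_r)
#   r_adjusted = to_r(image_r)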

  Returning to FIG. 9, when the END conversion formula serving as the gray balance adjustment condition has been calculated, the calculated END conversion formula is applied to the image data after the gradation conversion processing, thereby performing the gray balance adjustment (step S26), and this process ends.

  As described above, according to the image recording apparatus 1, after gradation conversion by either of the above-described gradation conversion processes A and B or a combination thereof, the two-dimensional histogram is divided into predetermined hue regions; for each divided hue region, the ratio (occupation ratio) of the pixels in that region to the entire input image data is calculated; a low saturation threshold is calculated for each hue region based on the calculated occupation ratio; and the low saturation pixels used for the gray balance adjustment condition calculation are extracted based on the calculated thresholds. A gray balance adjustment condition is then calculated using the extracted low saturation pixels, and gray balance adjustment is performed on the image data after the gradation conversion processing under the calculated condition.

  Therefore, in addition to the effects of the above-described gradation conversion processes A and B or a combination thereof, the coloration tendency of the shooting scene is determined and the threshold for extracting low saturation pixels is changed based on the determination result; since this prevents hue bias caused by extracting a large number of low saturation pixels from regions with a high occupation ratio, such as lawn and sky, the effect of suppressing color failure can be provided.

  The first and second embodiments have been described above; the image data used in these embodiments is preferably scene reference image data in the case of an image taken with a digital camera. Scene reference image data means image data in which the signal intensity of each color channel, based on at least the spectral sensitivity of the image sensor itself, has already been mapped to a standard color space such as RIMM RGB or ERIMM RGB, and in which image processing that modifies the data content to improve the effect at the time of image viewing, such as tone conversion, sharpness enhancement, and saturation enhancement, is omitted. Accordingly, when scene reference image data is input to the image recording apparatus 1, the image adjustment processing unit 701 converts it into viewing image reference data by performing the optimization processing including the gradation conversion processing and gray balance adjustment described above, based on the output destination information input from the operation unit 11, so that an optimal viewing image is obtained on the output medium (CRT, liquid crystal display, plasma display, silver salt printing paper, ink jet paper, thermal printer paper, etc.). It is thereby possible to form optimized viewing image reference data on the output medium without causing loss of the captured image information.

The contents described in each of the above embodiments are preferred examples of the present invention, and the present invention is not limited to them.
For example, in the above embodiments, an image recording apparatus having a function of performing image processing on input image data and recording the result on an output medium has been described as an example; needless to say, however, the present invention can also be applied to an image processing apparatus that performs image processing on input image data and outputs the result to an image recording apparatus.

  Further, in each of the above embodiments, the captured image data is divided into hue regions and into regions composed of combinations of hue value and saturation value using a two-dimensional histogram; however, the captured image data may instead be divided based on the hue value and saturation value acquired for each pixel of the captured image data, without using a two-dimensional histogram. Using a two-dimensional histogram makes the processing more efficient.

  In addition, the boundary values used when dividing the captured image data and the ranges of the hue value and saturation value of the skin color region are based on the results of examining about 1000 frames of film scan images; these numerical limits may be changed for each image as appropriate.

  In addition, the detailed configuration and detailed operation of the image recording apparatus 1 can be changed as appropriate without departing from the spirit of the present invention.

Brief description of the drawings
FIG. 1 is a perspective view showing an external configuration of an image recording apparatus 1 according to the present invention.
FIG. 2 is a block diagram illustrating an internal configuration of the image recording apparatus 1 in FIG. 1.
FIG. 3 is a block diagram showing the functional structure of the image processing unit 70 of FIG. 2.
FIG. 4 is a flowchart showing the gradation conversion process A executed by the image adjustment processing unit 701 in FIG. 3.
FIG. 5 is a diagram showing an example of a two-dimensional histogram.
FIG. 6 is a diagram showing an example of the gradation conversion curve determined in the first embodiment of the present invention.
FIG. 7 is a flowchart illustrating the gradation conversion process B executed by the image adjustment processing unit 701 in FIG. 3.
FIG. 8 is a diagram showing an example of the gradation conversion curve determined in the second embodiment of the present invention.
FIG. 9 is a flowchart showing the gray balance adjustment processing executed by the image adjustment processing unit 701 in FIG. 3.
FIG. 10 is a diagram showing, as a boundary line B, the low saturation threshold obtained for the image data α on the two-dimensional histogram of hue values and saturation values.
FIG. 11 is a flowchart showing the gray balance adjustment condition calculation processing executed by the image adjustment processing unit 701 in FIG. 3.
FIG. 12 is a diagram showing an example of the cumulative histogram created in step S31 of FIG. 11.
FIG. 13 is a diagram showing an example of the chromaticity diagram created in step S34 of FIG. 11.
FIG. 14 is a diagram showing an example of the G-R correlation diagram and the G-B correlation diagram created in step S35 of FIG. 11.

Explanation of symbols

1 Image recording apparatus
2 Case
3 Magazine loading unit
4 Exposure processing unit
5 Print preparation unit
7 Control unit
8 CRT
9 Film scanner unit
10 Reflected original input device
11 Operation unit
12 Information input means
14 Image reading unit
15 Image writing unit
30 Image transfer means
31 Image conveying unit
32 Communication means (input)
33 Communication means (output)
51 External printer
70 Image processing unit
701 Image adjustment processing unit
702 Film scan data processing unit
703 Reflected original scan data processing unit
704 Image data format decoding processing unit
705 Template processing unit
706 CRT specific processing unit
707 Print specific processing unit A
708 Print specific processing unit B
709 Image data creation processing unit
71 Data storage unit
72 Template storage unit

Claims (39)

  1. In an image processing method for inputting captured image data and outputting image data optimized for viewing on an output medium,
    Obtaining a hue value, a saturation value, and a brightness value for each pixel of the captured image data;
    Dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    Calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
    Determining a gradation conversion processing condition based on the calculated brightness deviation amount;
    Applying gradation conversion processing to the captured image data based on the determined gradation conversion processing conditions;
    An image processing method comprising:
  2. In an image processing method for inputting captured image data and outputting image data optimized for viewing on an output medium,
    Obtaining a hue value, a saturation value, and a brightness value for each pixel of the captured image data;
    Dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    Calculating an occupancy ratio indicating a ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data; and
    Based on the calculated occupancy rate, calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation,
    Determining a gradation conversion processing condition based on the calculated contribution rate;
    Applying gradation conversion processing to the captured image data based on the determined gradation conversion processing conditions;
    An image processing method comprising:
  3. In an image processing method for inputting captured image data and outputting image data optimized for viewing on an output medium,
    Obtaining a hue value, a saturation value, and a brightness value for each pixel of the captured image data;
    Dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    Calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
    Calculating an occupancy ratio indicating a ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data; and
    Based on the calculated occupancy rate, calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation,
    Determining a gradation conversion processing condition based on the calculated brightness deviation amount and contribution rate;
    Applying gradation conversion processing to the captured image data based on the determined gradation conversion processing conditions;
    An image processing method comprising:
  4. In an image processing method for inputting captured image data and outputting image data optimized for viewing on an output medium,
    Obtaining a hue value, a saturation value, and a brightness value for each pixel of the captured image data;
    Dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    Calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
    Determining a gradation conversion processing condition based on the calculated brightness deviation amount;
    Applying gradation conversion processing to the captured image data based on the determined gradation conversion processing conditions;
    Dividing the captured image data into predetermined hue regions;
    Calculating an occupancy ratio indicating a ratio of pixels for each of the divided hue regions to the entire screen of the captured image data;
    Calculating a low saturation threshold value for each hue area according to the calculated occupancy ratio for each hue area;
    Extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
    Calculating a gray balance adjustment condition using the extracted low saturation pixel;
    Applying gray balance adjustment to the captured image data based on the calculated gray balance adjustment condition;
    An image processing method comprising:
  5. In an image processing method for inputting captured image data and outputting image data optimized for viewing on an output medium,
    Obtaining a hue value, a saturation value, and a brightness value for each pixel of the captured image data;
    Dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    Calculating an occupancy ratio indicating a ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data; and
    Based on the calculated occupancy rate, calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation,
    Determining a gradation conversion processing condition based on the calculated contribution rate;
    Applying gradation conversion processing to the captured image data based on the determined gradation conversion processing conditions;
    Dividing the captured image data into predetermined hue regions;
    Calculating an occupancy ratio indicating a ratio of pixels for each of the divided hue regions to the entire screen of the captured image data;
    Calculating a low saturation threshold value for each hue area according to the calculated occupancy ratio for each hue area;
    Extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
    Calculating a gray balance adjustment condition using the extracted low saturation pixel;
    Applying gray balance adjustment to the captured image data based on the calculated gray balance adjustment condition;
    An image processing method comprising:
  6. In an image processing method for inputting captured image data and outputting image data optimized for viewing on an output medium,
    Obtaining a hue value, a saturation value, and a brightness value for each pixel of the captured image data;
    Dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    Calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
    Calculating an occupancy ratio indicating a ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data; and
    Based on the calculated occupancy rate, calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation,
    Determining a gradation conversion processing condition based on the calculated brightness deviation amount and contribution rate;
    Applying gradation conversion processing to the captured image data based on the determined gradation conversion processing conditions;
    Dividing the captured image data into predetermined hue regions;
    Calculating an occupancy ratio indicating a ratio of pixels for each of the divided hue regions to the entire screen of the captured image data;
    Calculating a low saturation threshold value for each hue area according to the calculated occupancy ratio for each hue area;
    Extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
    Calculating a gray balance adjustment condition using the extracted low saturation pixel;
    Applying gray balance adjustment to the captured image data based on the calculated gray balance adjustment condition;
    An image processing method comprising:
  7. Creating a two-dimensional histogram of the acquired hue and saturation values;
    The step of dividing the captured image data into regions composed of a predetermined hue and saturation combination is based on the created two-dimensional histogram, and the captured image data is divided into regions composed of a predetermined hue and saturation combination. The image processing method according to claim 1, wherein the image processing method is divided.
  8. Creating a two-dimensional histogram of the acquired hue and saturation values;
    The step of dividing the captured image data into a region composed of a predetermined hue and saturation combination is based on the created two-dimensional histogram, and the captured image data is divided into a region composed of a predetermined hue and saturation combination. Split and
    7. The step of dividing the captured image data into predetermined hue regions divides the captured image data into predetermined hue regions based on the created two-dimensional histogram. An image processing method according to claim 1.
  9.   The step of dividing the captured image data into regions composed of a combination of a predetermined hue and saturation divides the captured image data into a skin color region having at least a hue value of 0 to 69 and a saturation value of 0 to 128 in the HSV color system. The image processing method according to claim 1.
  10. The step of dividing the captured image data into regions composed of a combination of a predetermined hue and saturation divides the captured image data into a skin color region having at least a hue value of 0 to 69 and a saturation value of 0 to 128 in the HSV color system, and
    the step of dividing the captured image data into predetermined hue regions divides the captured image data into a skin hue region of 0 to 69, a green hue region of 70 to 184, a sky hue region of 185 to 224, and a red hue region of 225 to 360 in the hue value of the HSV color system. The image processing method according to claim 4.
  11.   The determination of the gradation processing condition is performed by adjusting the gradation conversion curve by creating a gradation conversion curve or selecting from a plurality of preset gradation conversion curves. The image processing method as described in any one of Claims 1-10.
  12.   The image processing method according to claim 1, wherein the captured image data is scene reference image data.
  13.   The image processing method according to claim 1, wherein the image data optimized for viewing on the output medium is viewing image reference data.
  14. In an image processing apparatus for inputting captured image data and outputting image data optimized for viewing on an output medium,
    Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
    HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
    Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount;
    Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
    An image processing apparatus comprising:
  15. In an image processing apparatus for inputting captured image data and outputting image data optimized for viewing on an output medium,
    Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
    HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
    Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
    Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated contribution rate;
    Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
    An image processing apparatus comprising:
  16. In an image processing apparatus for inputting captured image data and outputting image data optimized for viewing on an output medium,
    Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
    HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
    HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
    Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
    Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount and contribution rate;
    Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
    An image processing apparatus comprising:
  17. In an image processing apparatus for inputting captured image data and outputting image data optimized for viewing on an output medium,
    Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
    HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
    Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount;
    Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
    Hue area dividing means for dividing the captured image data into predetermined hue areas;
    A hue area occupancy ratio calculating unit that calculates an occupancy ratio indicating a ratio of pixels for each of the divided hue areas to the entire screen of the captured image data;
    Low saturation threshold value calculating means for calculating a low saturation threshold value for each hue region according to the calculated occupancy ratio for each hue region;
    Low saturation pixel extraction means for extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
    A gray balance adjustment condition calculating means for calculating a gray balance adjustment condition using the extracted low saturation pixel;
    Based on the calculated gray balance adjustment conditions, gray balance adjustment means for performing gray balance adjustment on the captured image data;
    An image processing apparatus comprising:
  18. In an image processing apparatus for inputting captured image data and outputting image data optimized for viewing on an output medium,
    Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
    HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
    Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
    Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated contribution rate;
    Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
    Hue area dividing means for dividing the captured image data into predetermined hue areas;
    A hue area occupancy ratio calculating unit that calculates an occupancy ratio indicating a ratio of pixels of the divided hue areas to the entire screen of the captured image data;
    Low saturation threshold value calculating means for calculating a low saturation threshold value for each hue region according to the calculated occupancy ratio for each hue region;
    Low saturation pixel extraction means for extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
    A gray balance adjustment condition calculating means for calculating a gray balance adjustment condition using the extracted low saturation pixel;
    Based on the calculated gray balance adjustment conditions, gray balance adjustment means for performing gray balance adjustment on the captured image data;
    An image processing apparatus comprising:
  19. In an image processing apparatus for inputting captured image data and outputting image data optimized for viewing on an output medium,
    Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
    HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
    HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
    Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
    Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount and contribution rate;
    Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
    Hue area dividing means for dividing the captured image data into predetermined hue areas;
    A hue area occupancy ratio calculating unit that calculates an occupancy ratio indicating a ratio of pixels for each of the divided hue areas to the entire screen of the captured image data;
    Low saturation threshold value calculating means for calculating a low saturation threshold value for each hue region according to the calculated occupancy ratio for each hue region;
    Low saturation pixel extraction means for extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
    A gray balance adjustment condition calculating means for calculating a gray balance adjustment condition using the extracted low saturation pixel;
    Based on the calculated gray balance adjustment conditions, gray balance adjustment means for performing gray balance adjustment on the captured image data;
    An image processing apparatus comprising:
  20. Two-dimensional histogram creation means for creating a two-dimensional histogram of the acquired hue value and saturation value;
    The HS dividing unit divides the captured image data into regions composed of combinations of predetermined hues and saturations based on the created two-dimensional histogram. The image processing apparatus according to item.
  21. Two-dimensional histogram creation means for creating a two-dimensional histogram of the acquired hue value and saturation value;
    The HS dividing unit divides the captured image data into regions including combinations of a predetermined hue and saturation based on the created two-dimensional histogram,
    The image processing according to any one of claims 17 to 19, wherein the hue area dividing unit divides the captured image data into a predetermined hue area based on the created two-dimensional histogram. apparatus.
  22.   The HS dividing unit divides the captured image data into skin color regions each having a hue value of at least HSV color system of 0 to 69 and a saturation value of 0 to 128. The image processing apparatus according to any one of 20.
  23. The HS dividing means divides the captured image data into skin color regions consisting of at least a hue value of HSV color system of 0 to 69 and a saturation value of 0 to 128,
    The hue area dividing unit converts the captured image data into a skin hue area of 0 to 69, a green hue area of 70 to 184, a sky hue area of 185 to 224, and a red hue of 225 to 360 in the hue value of the HSV color system. The image processing device according to claim 17, wherein the image processing device is divided into phase regions.
  24.   The determination of the gradation processing condition is performed by adjusting the gradation conversion curve by creating a gradation conversion curve or selecting from a plurality of preset gradation conversion curves. The image processing apparatus according to any one of claims 14 to 23.
  25.   The image processing apparatus according to any one of claims 14 to 24, wherein the captured image data is scene reference image data.
  26.   The image processing apparatus according to any one of claims 14 to 25, wherein the image data optimized for viewing on the output medium is viewing image reference data.
  27. In an image recording apparatus that inputs captured image data, generates image data optimized for viewing on an output medium, and forms the generated image data on the output medium.
    Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
    HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
    Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount;
    Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
    An image recording apparatus comprising:
  28. In an image recording apparatus that inputs captured image data, generates image data optimized for viewing on an output medium, and forms the generated image data on the output medium.
    Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
    HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
    Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
    Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated contribution rate;
    Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
    An image recording apparatus comprising:
  29. In an image recording apparatus that inputs captured image data, generates image data optimized for viewing on an output medium, and forms the generated image data on the output medium.
    Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
    HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
    HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
    Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of a combination of the predetermined hue and saturation divided,
    Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount and contribution rate;
    Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
    An image recording apparatus comprising:
  30. In an image recording apparatus that inputs captured image data, generates image data optimized for viewing on an output medium, and forms the generated image data on the output medium.
    Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
    HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
    Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount;
    Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
    Hue area dividing means for dividing the captured image data into predetermined hue areas;
    A hue area occupancy ratio calculating unit that calculates an occupancy ratio indicating a ratio of pixels of the divided hue areas to the entire screen of the captured image data;
    Low saturation threshold value calculating means for calculating a low saturation threshold value for each hue region according to the calculated occupancy ratio for each hue region;
    Low saturation pixel extraction means for extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
    A gray balance adjustment condition calculating means for calculating a gray balance adjustment condition using the extracted low saturation pixel;
    Based on the calculated gray balance adjustment conditions, gray balance adjustment means for performing gray balance adjustment on the captured image data;
    An image recording apparatus comprising:
  31. In an image recording apparatus that inputs captured image data, generates image data optimized for viewing on an output medium, and forms the generated image data on the output medium.
    Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
    HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    HS occupancy ratio calculating means for calculating an occupancy ratio indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
    Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of the combination of the predetermined hue and saturation divided,
    Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated contribution rate;
    Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
    Hue area dividing means for dividing the captured image data into predetermined hue areas;
    A hue area occupancy ratio calculating unit that calculates an occupancy ratio indicating a ratio of pixels for each of the divided hue areas to the entire screen of the captured image data;
    Low saturation threshold value calculating means for calculating a low saturation threshold value for each hue region according to the calculated occupancy ratio for each hue region;
    Low saturation pixel extraction means for extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
    A gray balance adjustment condition calculating means for calculating a gray balance adjustment condition using the extracted low saturation pixel;
    Based on the calculated gray balance adjustment conditions, gray balance adjustment means for performing gray balance adjustment on the captured image data;
    An image recording apparatus comprising:
  32. In an image recording apparatus that inputs captured image data, generates image data optimized for viewing on an output medium, and forms the generated image data on the output medium.
    Data acquisition means for obtaining a hue value, a saturation value and a brightness value for each pixel of the captured image data;
    HS dividing means for dividing the captured image data into regions composed of combinations of predetermined hue and saturation;
    A lightness deviation amount calculating means for calculating a lightness deviation amount of an area composed of a combination of the predetermined hue and saturation divided in the entire captured image data;
    HS occupancy rate calculating means for calculating an occupancy rate indicating the ratio of pixels of the divided predetermined hue and saturation combination to the entire screen of the captured image data;
    Based on the calculated occupancy rate, contribution rate calculating means for calculating a contribution rate to the gradation conversion processing of the region composed of the combination of the predetermined hue and saturation divided,
    Gradation conversion processing condition determining means for determining gradation conversion processing conditions based on the calculated brightness deviation amount and contribution rate;
    Gradation conversion processing means for performing gradation conversion processing on the captured image data based on the determined gradation conversion processing conditions;
    Hue area dividing means for dividing the captured image data into predetermined hue areas;
    A hue area occupancy ratio calculating unit that calculates an occupancy ratio indicating a ratio of pixels for each of the divided hue areas to the entire screen of the captured image data;
    Low saturation threshold value calculation means for calculating a low saturation threshold value for each hue area according to the calculated occupancy ratio for each hue area;
    Low saturation pixel extraction means for extracting low saturation pixels used for gray balance adjustment based on the calculated low saturation threshold;
    A gray balance adjustment condition calculating means for calculating a gray balance adjustment condition using the extracted low saturation pixel;
    Based on the calculated gray balance adjustment conditions, gray balance adjustment means for performing gray balance adjustment on the captured image data;
    An image recording apparatus comprising:
  33. Two-dimensional histogram creation means for creating a two-dimensional histogram of the acquired hue value and saturation value;
    30. The HS dividing unit according to any one of claims 27 to 29, wherein the HS dividing unit divides the captured image data into regions including combinations of a predetermined hue and saturation based on the created two-dimensional histogram. The image recording apparatus described in the item.
  34. Two-dimensional histogram creation means for creating a two-dimensional histogram of the acquired hue value and saturation value;
    The HS dividing unit divides the captured image data into regions including combinations of a predetermined hue and saturation based on the created two-dimensional histogram,
    The image recording according to any one of claims 30 to 32, wherein the hue area dividing unit divides the captured image data into predetermined hue areas based on the created two-dimensional histogram. apparatus.
  35.   30. The HS dividing means divides the captured image data into skin color regions having at least a hue value of HSV color system of 0 to 69 and a saturation value of 0 to 128, respectively. 34. The image recording device according to any one of 33.
  36. The HS dividing means divides the captured image data into skin color regions consisting of at least a hue value of HSV color system of 0 to 69 and a saturation value of 0 to 128,
    The hue area dividing unit converts the captured image data into a skin hue area of 0 to 69, a green hue area of 70 to 184, a sky hue area of 185 to 224, and a red hue of 225 to 360 in the hue value of the HSV color system. 35. The image recording apparatus according to claim 30, wherein the image recording apparatus is divided into phase regions.
  37.   The determination of the gradation processing condition is performed by adjusting the gradation conversion curve by creating a gradation conversion curve or selecting from a plurality of preset gradation conversion curves. The image recording apparatus according to any one of claims 27 to 36.
  38.   38. The image recording apparatus according to claim 27, wherein the captured image data is scene reference image data.
  39.   39. The image recording apparatus according to claim 27, wherein the image data optimized for viewing on the output medium is viewing image reference data.
JP2003434706A 2003-12-26 2003-12-26 Image processing method, image processing apparatus, and image recording apparatus Pending JP2005192162A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003434706A JP2005192162A (en) 2003-12-26 2003-12-26 Image processing method, image processing apparatus, and image recording apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2003434706A JP2005192162A (en) 2003-12-26 2003-12-26 Image processing method, image processing apparatus, and image recording apparatus

Publications (1)

Publication Number Publication Date
JP2005192162A true JP2005192162A (en) 2005-07-14

Family

ID=34791683

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003434706A Pending JP2005192162A (en) 2003-12-26 2003-12-26 Image processing method, image processing apparatus, and image recording apparatus

Country Status (1)

Country Link
JP (1) JP2005192162A (en)


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007124604A (en) * 2005-09-29 2007-05-17 Fujifilm Corp Image processing apparatus and processing method therefor
US8036455B2 (en) 2006-02-13 2011-10-11 Seiko Epson Corporation Method and apparatus of analyzing and generating image data
JP2008054297A (en) * 2006-07-25 2008-03-06 Fujifilm Corp System for and method of photographing image, and computer program
US8797423B2 (en) 2006-07-25 2014-08-05 Fujifilm Corporation System for and method of controlling a parameter used for detecting an objective body in an image and computer program
JP2010049312A (en) * 2008-08-19 2010-03-04 Casio Comput Co Ltd Image processing device, image processing method, and program
JP2009077431A (en) * 2008-12-04 2009-04-09 Seiko Epson Corp Image processing apparatus, and image processing method
KR20110071704A (en) * 2009-12-21 2011-06-29 삼성전자주식회사 Method and apparatus for image scanning
KR101628238B1 (en) 2009-12-21 2016-06-21 삼성전자주식회사 Method and apparatus for image scanning
JP2014120919A (en) * 2012-12-17 2014-06-30 Samsung Display Co Ltd Image processor, image processing method, and program

Similar Documents

Publication Publication Date Title
US8374429B2 (en) Image processing method, apparatus and memory medium therefor
US8743272B2 (en) Image processing apparatus and method of controlling the apparatus and program thereof
JP4263890B2 (en) System and method for determining when to correct a particular image defect based on camera, scene, display, and demographic data
US7609908B2 (en) Method for adjusting the brightness of a digital image utilizing belief values
US6097470A (en) Digital photofinishing system including scene balance, contrast normalization, and image sharpening digital image processing
KR100667663B1 (en) Image processing apparatus, image processing method and computer readable recording medium which records program therefore
KR100524565B1 (en) Method and apparatus for processing image data, and storage medium
US5978519A (en) Automatic image cropping
JP3492202B2 (en) Image processing method, apparatus and recording medium
CN100389592C (en) Image processing apparatus for print process of photographed image
US8280188B2 (en) System and method for making a correction to a plurality of images
US6469805B1 (en) Post raster-image processing controls for digital color image printing
DE69913534T2 (en) Method and device for image generation
US7453598B2 (en) Image processing method and apparatus, and color conversion table generation method and apparatus
US7555140B2 (en) Image processing using object information
US7289664B2 (en) Method of detecting and correcting the red eye
US7720279B2 (en) Specifying flesh area on image
US7751644B2 (en) Generation of image quality adjustment information and image quality adjustment with image quality adjustment information
US6975437B2 (en) Method, apparatus and recording medium for color correction
US7330195B2 (en) Graphic pieces for a border image
JP4160335B2 (en) Convert color image to gray value image
JP5743384B2 (en) Image processing apparatus, image processing method, and computer program
JP4324043B2 (en) Image processing apparatus and method
US7715050B2 (en) Tonescales for geographically localized digital rendition of people
KR100724869B1 (en) Image processing apparatus, image processing method, and computer-readable recording medium for storing image processing program