WO2005112428A1 - 画像処理方法、画像処理装置、画像記録装置及び画像処理プログラム - Google Patents
- Publication number
- WO2005112428A1 (PCT/JP2005/008412)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- area
- index
- hue
- image processing
- lightness
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/6083—Colour correction or control controlled by factors external to the apparatus
- H04N1/6086—Colour correction or control controlled by factors external to the apparatus by scene illuminant, i.e. conditions at the time of picture capture, e.g. flash, optical filter used, evening, cloud, daylight, artificial lighting, white point measurement, colour temperature
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/6027—Correction or control of colour gradation or colour contrast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/62—Retouching, i.e. modification of isolated colours only or in isolated picture areas only
- H04N1/628—Memory colours, e.g. skin or sky
Definitions
- Image processing method, image processing device, image recording device, and image processing program
- the present invention relates to an image processing method, an image processing device, an image recording device for forming an image on an output medium, and an image processing program.
- Patent Document 1 discloses a method of calculating an additional correction value instead of the discriminant regression analysis method.
- the method described in Patent Document 1 removes the high-luminance and low-luminance regions from a luminance histogram indicating the cumulative number of pixels per luminance (frequency), further limits the frequency counts, calculates the average luminance from the restricted histogram, and obtains the difference between this average value and a reference luminance as the correction value.
- Patent Document 2 discloses a method of determining the light-source condition at the time of shooting in order to compensate for the extraction accuracy of a face region. The method described in Patent Document 2 first extracts face candidate regions and calculates the deviation of their average luminance from that of the entire image; when the deviation is large, it judges the shooting scene (whether it is backlight or close-up flash photography) and adjusts the allowable range of the criterion for identifying the face region. As methods for extracting face candidate regions, Patent Document 2 cites the methods using a two-dimensional histogram of hue and saturation described in JP-A-6-67320 and JP-A-8-122944.
- Patent Document 2 also cites, as methods of removing background regions other than the face, the techniques described in JP-A-8-122944 and JP-A-8-184925, which discriminate the background using the ratio of straight-line portions, line symmetry, the contact ratio with the outer edge of the image, density contrast, and density change patterns and their periodicity.
- for discriminating the shooting scene, a method using a one-dimensional density histogram is described. This method is based on the empirical rule that in backlight the face region is dark and the background is bright, whereas in close-up flash photography the face region is bright and the background is dark.
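The empirical rule above can be sketched as a small classifier. This is an illustrative sketch, not the patent's method: the face region is assumed to be already extracted, and the deviation threshold of 60 (on a 0-255 luminance scale) is an assumed value.

```python
# Classify a scene from the mean luminance of a (hypothetical, already
# extracted) face region versus the background, per the empirical rule:
# face dark / background bright -> backlight; face bright / background
# dark -> close-up flash. The threshold is an illustrative assumption.

def classify_scene(face_mean: float, background_mean: float,
                   deviation_threshold: float = 60.0) -> str:
    """Return 'backlight', 'flash_closeup', or 'direct' (0-255 luminance)."""
    deviation = face_mean - background_mean
    if deviation < -deviation_threshold:
        return "backlight"      # face much darker than background
    if deviation > deviation_threshold:
        return "flash_closeup"  # face much brighter than background
    return "direct"

print(classify_scene(40.0, 180.0))   # -> backlight
print(classify_scene(200.0, 50.0))   # -> flash_closeup
print(classify_scene(120.0, 110.0))  # -> direct
```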
- Patent Document 1 JP-A-2002-247393
- Patent Document 2 Japanese Patent Application Laid-Open No. 2000-148980
- although the technique described in Patent Document 1 reduces the influence of regions with a strongly biased luminance in backlit or strobe scenes, it also weakens the contribution of the face region in shooting scenes in which a person is the main subject. There was therefore a problem that the brightness of the face in the image could be inappropriate.
- the technique described in Patent Document 2 can compensate for the identification of the face area, but when the face candidate region itself is extracted incorrectly there was a problem that the compensation effect could not be obtained.
- An object of the present invention is to calculate an index that quantitatively represents the shooting scene (light-source condition) of captured image data and to determine image processing conditions based on the calculated index, thereby improving the brightness reproducibility of the image to be processed.
- the captured image data is divided into regions each defined by a combination of at least one of a predetermined lightness and hue, the distance from the outer edge of the screen of the captured image data, and lightness.
- FIG. 1 is a perspective view showing an external configuration of an image recording apparatus according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing an internal configuration of the image recording device of the present embodiment.
- FIG. 3 is a block diagram showing a main part configuration of an image processing unit in FIG. 2.
- FIG. 4 is a block diagram showing an internal configuration of a scene determination unit and an internal configuration of a ratio calculation unit.
- FIG. 5 is a flowchart showing a scene determination process executed in an image adjustment processing unit.
- FIG. 6 is a flow chart showing an occupancy calculation process for calculating a first occupancy for each brightness / hue area.
- FIG. 7 is a diagram showing an example of a program for converting from RGB to HSV color system.
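FIG. 7 shows the patent's own conversion program, which is not reproduced here. The following is a generic textbook RGB-to-HSV conversion written in the convention the text uses (hue H in degrees 0-359, saturation S and lightness V in 0-255); it is a sketch, not the program from FIG. 7.

```python
# Minimal RGB -> HSV conversion: H in degrees [0, 359],
# S and V scaled to [0, 255], matching the ranges used in the text.

def rgb_to_hsv(r: int, g: int, b: int) -> tuple[int, int, int]:
    mx, mn = max(r, g, b), min(r, g, b)
    v = mx
    s = 0 if mx == 0 else round(255 * (mx - mn) / mx)
    if mx == mn:
        h = 0.0                                   # achromatic: hue undefined, use 0
    elif mx == r:
        h = (60 * (g - b) / (mx - mn)) % 360
    elif mx == g:
        h = 60 * (b - r) / (mx - mn) + 120
    else:
        h = 60 * (r - g) / (mx - mn) + 240
    return int(round(h)) % 360, s, v

print(rgb_to_hsv(255, 0, 0))   # -> (0, 255, 255)
print(rgb_to_hsv(0, 0, 255))   # -> (240, 255, 255)
```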
- FIG. 8 is a diagram showing a lightness (V) -hue (H) plane, and a region rl and a region r2 on a VH plane.
- FIG. 9 is a diagram showing a lightness (V) -hue (H) plane and a region r3 and a region r4 on a VH plane.
- FIG. 10 is a view showing a curve representing a first coefficient to be multiplied by a first occupancy for calculating the index 1;
- FIG. 11 is a view showing a curve representing a second coefficient for multiplying the first occupancy for calculating the index 2;
- FIG. 12 is a flowchart showing occupancy calculation processing for calculating a second occupancy based on the composition of captured image data.
- FIG. 13 is a diagram showing the regions n1 to n4 determined according to the distance from the outer edge of the screen of the captured image data (FIGS. 13(a) to 13(d)).
- FIG. 14 is a diagram showing, for each region (n1 to n4), a curve representing a third coefficient for multiplying the second occupancy for calculating the index 3;
- FIG. 15 A plot of index 4 and index 5 calculated for each shooting scene (direct light, strobe, backlight).
- FIG. 16 is a diagram showing a frequency distribution (histogram) of luminance (a), a normalized histogram (b), and a histogram divided into blocks (c).
- FIG. 17 Diagrams explaining the deletion of low-luminance and high-luminance regions from the luminance histogram (FIGS. 17(a) and 17(b)) and the restriction of luminance frequency counts (FIGS. 17(c) and 17(d)).
- FIG. 18 is a diagram showing a gradation conversion curve representing an image processing condition (gradation conversion condition) when a shooting scene is backlit.
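A gradation conversion for a backlit scene typically lifts the shadows. The following lookup-table sketch uses a gamma-style curve as an illustrative assumption; the text states only that the image processing condition is a gradation conversion curve, not its exact shape.

```python
# Illustrative tone-conversion LUT for a backlit scene: gamma < 1
# brightens dark pixels most while leaving black and white fixed.
# The curve shape (plain gamma) is an assumption, not FIG. 18 itself.

def make_backlight_lut(gamma: float = 0.6) -> list[int]:
    """256-entry lookup table mapping input luminance to output luminance."""
    return [min(255, round(255 * (i / 255) ** gamma)) for i in range(256)]

lut = make_backlight_lut()
# Applying the LUT to an image is a per-pixel table lookup: out = lut[in].
print(lut[0], lut[255])   # endpoints stay fixed: 0 255
```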
- the mode described in item 2 is the image processing method according to item 1,
- the captured image data is divided into a plurality of regions each defined by a predetermined combination of lightness and hue, and an occupancy indicating the ratio of each of the divided regions to the entire captured image data is calculated, and
- the index is calculated using at least one of: coefficients whose signs differ between the coefficient used for a predetermined high-lightness skin color hue area and the coefficient used for hue areas other than the high-lightness skin color hue area, and coefficients whose signs differ between the coefficient used for the intermediate-lightness area of the skin color hue area and the coefficient used for lightness areas other than the intermediate-lightness area.
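The index computation described above is a weighted sum of area occupancies, with opposite-sign coefficients for the high-lightness skin-tone area versus other areas. The area names and coefficient values below are illustrative assumptions; the text states that the actual coefficients are preset per shooting condition (by discriminant analysis, per Items 34-35).

```python
# Sketch of the index step: multiply each area's occupancy by a preset
# coefficient and sum. Positive weight on the high-lightness skin area,
# negative weights elsewhere (assumed illustrative values).

def compute_index(occupancy: dict, coefficients: dict) -> float:
    """occupancy and coefficients are keyed by area name; unknown areas get 0."""
    return sum(coefficients.get(area, 0.0) * ratio
               for area, ratio in occupancy.items())

coeffs = {"skin_high": +8.0, "blue_high": -3.0, "green_high": -2.0}
occ = {"skin_high": 0.30, "blue_high": 0.40, "green_high": 0.10}
print(round(compute_index(occ, coeffs), 6))   # 8*0.3 - 3*0.4 - 2*0.1 = 1.0
```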
- the mode described in Item 3 is the image processing method according to Item 2,
- the sign of the coefficient used for a predetermined high-lightness skin color hue region and the sign of the coefficient used for hue regions other than the high-lightness skin color hue region are different.
- the mode described in item 4 is the image processing method according to item 2,
- the sign of a coefficient used for an intermediate lightness area of the skin color hue area and a sign of a coefficient used for a lightness area other than the intermediate lightness area are different.
- a first index is calculated using coefficients of different signs for a predetermined high-lightness skin color hue area and for hue areas other than the high-lightness skin color hue area, and
- a second index is calculated using coefficients of different signs for the intermediate-lightness area of the skin color hue area and for lightness areas other than the intermediate-lightness area; in the image processing condition determination step, the image processing condition is determined based on the calculated first index and second index.
- the mode described in Item 6 is the image processing method according to any one of Items 2 to 5, further including a histogram creation step of creating a two-dimensional histogram by calculating the cumulative number of pixels for each predetermined hue and lightness of the captured image data;
- in the occupancy calculation step, the occupancy is calculated based on the created two-dimensional histogram.
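The histogram-creation and occupancy steps can be sketched as follows: count pixels into a two-dimensional (hue area, lightness area) histogram, then express each cell as a fraction of all pixels. The bin boundaries follow the HSV ranges stated later in the text (skin 0-39 and 330-359, green 40-160, blue 161-250; shadow 26-84, intermediate 85-169, high 170-224); the grouping into named areas is a simplification.

```python
# 2-D (hue, lightness) histogram over named areas, then occupancy ratios.
from collections import Counter

HUE_BINS = {"skin": [(0, 39), (330, 359)], "green": [(40, 160)], "blue": [(161, 250)]}
V_BINS = {"shadow": (26, 84), "mid": (85, 169), "high": (170, 224)}

def hue_name(h):
    for name, ranges in HUE_BINS.items():
        if any(lo <= h <= hi for lo, hi in ranges):
            return name
    return "other"

def v_name(v):
    for name, (lo, hi) in V_BINS.items():
        if lo <= v <= hi:
            return name
    return "other"

def first_occupancy(hsv_pixels):
    """hsv_pixels: iterable of (h, s, v). Returns {(hue_area, v_area): ratio}."""
    hist = Counter((hue_name(h), v_name(v)) for h, _s, v in hsv_pixels)
    n = sum(hist.values())
    return {k: c / n for k, c in hist.items()}

pixels = [(10, 100, 200), (10, 100, 200), (200, 100, 50), (100, 100, 120)]
occ = first_occupancy(pixels)
print(occ[("skin", "high")])   # -> 0.5
```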
- the mode according to Item 7 is the image processing method according to any one of Items 2, 3, 5, and 6, wherein the lightness area of the hue areas other than the high-lightness skin color hue area, which are given a coefficient whose sign differs from that of the predetermined high-lightness skin color hue region, is a predetermined high-lightness area.
- the mode according to Item 8 is the image processing method according to any one of Items 2, 4, 5, and 6, wherein the hue area of the lightness areas other than the intermediate-lightness area, which are given a coefficient whose sign differs from that of the intermediate-lightness area of the skin color hue area, is a hue area within the skin color hue area.
- the mode described in Item 9 is the image processing method according to any one of Items 2, 3, 5 to 7,
- the high lightness skin color hue region includes a lightness value range of 170 to 224 in the HSV color system.
- the mode described in Item 10 is the image processing method according to any one of Items 2, 4 to 6, and 8, wherein the intermediate lightness region includes the range of lightness values 85 to 169 in the HSV color system.
- the mode described in Item 11 is the image processing method according to any one of Items 2, 3, 5 to 7, and 9, wherein the hue region other than the high-lightness skin color hue region includes at least one of a blue hue area and a green hue area.
- the lightness region other than the intermediate lightness region is a shadow region.
- the form described in Item 13 is the image processing method according to Item 11, wherein the hue value of the blue hue region is in the range of 161 to 250 in the HSV color system, and the hue value of the green hue region is in the range of 40 to 160 in the HSV color system.
- Item 14 is the image processing method according to item 12, wherein the lightness value of the shadow area is in the range of 26 to 84 in the lightness value of the HSV color system.
- the mode described in Item 15 is the image processing method according to any one of Items 2 to 14,
- the hue value of the skin color hue region is in the range of 0 to 39 and 330 to 359 in the HSV color system.
- the mode described in Item 16 is the image processing method according to any one of Items 2 to 15, wherein the skin color hue area is divided into two areas by a predetermined conditional expression based on lightness and saturation.
- the captured image data is divided into a plurality of predetermined regions each defined by a combination of the distance from the outer edge of the screen of the captured image data and lightness, and for each of the divided regions
- an occupancy indicating its ratio to the entire captured image data is calculated, and
- the index is calculated using coefficients whose values differ according to the distance from the outer edge.
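The composition-based occupancy above partitions pixels by their distance from the screen's outer edge. The sketch below uses four concentric regions, matching n1 to n4 in FIG. 13; the equal-width boundaries are an illustrative assumption, and the coefficients that would then weight each region are omitted.

```python
# Partition an image into ring-shaped regions by distance from the outer
# edge (0 = outermost, n_regions-1 = innermost), then compute each
# region's occupancy as a fraction of all pixels.

def edge_region(x, y, w, h, n_regions=4):
    d = min(x, y, w - 1 - x, h - 1 - y)      # distance to the nearest edge
    max_d = (min(w, h) - 1) // 2
    step = max(1, (max_d + 1) // n_regions)  # assumed equal-width rings
    return min(d // step, n_regions - 1)

w, h = 8, 8
counts = [0] * 4
for y in range(h):
    for x in range(w):
        counts[edge_region(x, y, w, h)] += 1
occupancy = [c / (w * h) for c in counts]
print(occupancy)   # -> [0.4375, 0.3125, 0.1875, 0.0625]
```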
- the captured image data is divided into a plurality of regions each defined by a predetermined combination of lightness and hue, and a
- first occupancy indicating the ratio of each divided region to the entire captured image data is calculated; the captured image data is also divided into a plurality of predetermined regions each defined by a combination of the distance from the outer edge of the screen of the captured image data and lightness, and for each of these regions a second occupancy indicating its ratio to the whole is calculated,
- the calculated first occupancy and second occupancy are each multiplied by coefficients preset according to the shooting conditions to calculate
- indices that specify the shooting scene; in the index calculation step,
- a first index is calculated using coefficients of different signs for a predetermined high-lightness skin color hue area and for hue areas other than the high-lightness skin color hue area,
- a second index is calculated using coefficients of different signs for the intermediate-lightness area of the skin color hue area and for lightness areas other than the intermediate-lightness area, and
- a third index is calculated using coefficients whose values differ according to the distance from the outer edge, and
- the image processing condition is determined based on the calculated first index, second index, and third index.
- the mode described in Item 20 is the image processing method according to Item 19,
- in the index calculation step, the first index, the second index, and the third index are each multiplied by a coefficient preset according to the shooting conditions and combined, thereby calculating a fourth index and
- a fifth index; in the image processing condition determining step, the image processing condition is determined based on the calculated fourth and fifth indices.
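Deriving the fourth and fifth indices as weighted combinations of indices 1 to 3 can be sketched as below. The weight values are illustrative assumptions; in the text they are preset per shooting condition.

```python
# Combine indices 1-3 into indices 4 and 5 via per-condition weights
# (the two weight triples here are assumed, not from the patent).

def combine_indices(i1: float, i2: float, i3: float,
                    w4=(0.5, 0.3, 0.2), w5=(0.2, 0.3, 0.5)):
    index4 = w4[0] * i1 + w4[1] * i2 + w4[2] * i3
    index5 = w5[0] * i1 + w5[1] * i2 + w5[2] * i3
    return index4, index5

i4, i5 = combine_indices(1.0, -0.5, 2.0)
print(round(i4, 6), round(i5, 6))   # 0.75 1.05
```

A scene such as backlight versus strobe can then be judged from where (index4, index5) falls, as plotted per shooting scene in FIG. 15.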
- the mode described in Item 21 is the image processing method according to Item 19 or 20,
- Item 23 is the image processing method according to any one of Items 19 to 22, wherein the lightness area of the hue areas other than the high-lightness skin color hue area, which are given a coefficient whose sign differs from that of the predetermined high-lightness skin color hue region, is a predetermined high-lightness area.
- the mode described in Item 24 is the image processing method according to any one of Items 19 to 22, wherein the hue area of the lightness areas other than the intermediate-lightness area, which are given a coefficient whose sign differs from that of the intermediate-lightness area of the skin color hue area, is a hue area within the skin color hue area.
- the mode described in Item 25 is the image processing method according to any one of Items 19 to 23,
- the high lightness skin color hue region includes a lightness value range of 170 to 224 in the HSV color system.
- Item 26 is the image processing method according to any one of Items 19 to 22 and 24, wherein the intermediate lightness region includes the range of lightness values 85 to 169 in the HSV color system.
- the mode described in Item 27 is the image processing method according to any one of Items 19 to 23 and 25.
- the hue area other than the high lightness skin color hue area includes at least one of a blue hue area and a green hue area.
- a lightness area other than the intermediate lightness area is a shadow area.
- the form described in Item 29 is the image processing method according to Item 27, wherein the hue value of the blue hue region is in the range of 161 to 250 in the HSV color system, and the hue value of the green hue region is in the range of 40 to 160 in the HSV color system.
- Item 30 is the image processing method according to item 28, wherein the lightness value of the shadow region is in the range of 26 to 84 in the lightness value of the HSV color system.
- the form described in Item 31 is the image processing method according to any one of Items 19 to 30, wherein the hue value of the skin color hue region is in the ranges of 0 to 39 and 330 to 359 in the HSV color system.
- Item 32 is the image processing method according to any one of Items 19 to 31, wherein the skin color hue region is divided into two regions by a predetermined conditional expression based on lightness and saturation.
- Item 33 is the image processing method according to any one of Items 1 to 32, wherein in the image processing condition determining step, an image processing condition for performing a gradation conversion process on the captured image is determined.
- the mode described in Item 34 is the image processing method according to any one of Items 1 to 33, wherein the coefficient preset according to the shooting condition is a coefficient obtained using a discriminant analysis method.
- the mode described in Item 35 is the image processing method according to Item 34, wherein the coefficient preset according to the shooting condition is the value of a discrimination coefficient of a discriminant function determined for a plurality of sample images prepared for each shooting condition and adjusted so as to satisfy a predetermined condition.
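Obtaining coefficients by discriminant analysis over sample images can be sketched with a two-class Fisher discriminant: find the weight vector that best separates, for example, backlit from normally lit samples in occupancy-feature space. The sample data, the feature choice (high-lightness skin and blue occupancies), and the use of plain Fisher LDA are all assumptions; the patent states only that a discriminant analysis method is used.

```python
# Two-feature Fisher discriminant: w is proportional to Sw^-1 (mu1 - mu0),
# where Sw is the pooled within-class scatter matrix (2x2, inverted by hand).

def fisher_weights(class0, class1):
    def mean(xs):
        n = len(xs)
        return [sum(v[0] for v in xs) / n, sum(v[1] for v in xs) / n]
    def scatter(xs, mu):
        s = [[0.0, 0.0], [0.0, 0.0]]
        for x, y in xs:
            dx, dy = x - mu[0], y - mu[1]
            s[0][0] += dx * dx; s[0][1] += dx * dy
            s[1][0] += dy * dx; s[1][1] += dy * dy
        return s
    m0, m1 = mean(class0), mean(class1)
    s0, s1 = scatter(class0, m0), scatter(class1, m1)
    sw = [[s0[i][j] + s1[i][j] for j in range(2)] for i in range(2)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    d = [m1[0] - m0[0], m1[1] - m0[1]]
    return [( sw[1][1] * d[0] - sw[0][1] * d[1]) / det,
            (-sw[1][0] * d[0] + sw[0][0] * d[1]) / det]

# Hypothetical (skin_high, blue_high) occupancy features for two scene classes.
backlit = [(0.05, 0.40), (0.08, 0.35), (0.06, 0.45)]
direct  = [(0.30, 0.10), (0.25, 0.15), (0.35, 0.12)]
w = fisher_weights(backlit, direct)
print(w[0] > 0, w[1] < 0)   # skin weight positive, blue weight negative
```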
- the captured image data is divided into a plurality of areas each defined by a combination of at least one of a predetermined lightness and hue, the distance from the outer edge of the screen of the captured image data, and lightness; and
- an occupancy calculation unit that calculates, for each divided area, an occupancy indicating its ratio to the entire captured image data;
- An index calculation unit that calculates an index that specifies a shooting scene by multiplying the calculated occupancy of each area by a coefficient that is set in advance according to shooting conditions;
- An image processing condition determining unit that determines an image processing condition for the captured image data based on the calculated index.
- the occupancy calculating unit divides the photographed image data into a plurality of regions each defined by a predetermined combination of lightness and hue and calculates, for each of the divided regions, an occupancy indicating its ratio to the entire photographed image data, and
- the index is calculated using at least one of the coefficients whose signs differ between the coefficient used for the intermediate-lightness area of the skin color hue area and the coefficient used for lightness areas other than the intermediate-lightness area.
- Item 38 is the image processing device according to Item 37, wherein in the index calculation unit, a coefficient used for a predetermined high lightness skin color hue area and a coefficient other than the high lightness skin color hue area are used. The signs of the coefficients used for the hue area are different.
- the form described in Item 39 is the image processing device according to Item 37, wherein the index calculation unit uses a coefficient used for an intermediate lightness area of the skin color hue area and a coefficient used for a lightness area other than the intermediate lightness area. The signs of the coefficients are different.
- the mode described in Item 40 is the image processing device according to Item 37, wherein
- a first index is calculated using coefficients of different signs for a predetermined high-lightness skin color hue area and for hue areas other than the high-lightness skin color hue area, and
- a second index is calculated using coefficients of different signs for the intermediate-lightness area of the skin color hue area and for lightness areas other than the intermediate-lightness area, and
- the image processing condition is determined based on the calculated first index and second index.
- the mode described in Item 41 is the image processing device according to any one of Items 37 to 40, further including a histogram creation unit that creates a two-dimensional histogram by calculating the cumulative number of pixels for each predetermined hue and lightness of the captured image data,
- the occupancy calculation unit calculates the occupancy based on the created two-dimensional histogram.
- the mode described in Item 42 is the image processing device according to any one of Items 37, 38, 40, and 41, wherein the lightness area of the hue areas other than the high-lightness skin color hue area, which are given a coefficient whose sign differs from that of the predetermined high-lightness skin color hue region, is a predetermined high-lightness area.
- the form according to Item 43 is the image processing device according to any one of Items 37, 39, 40, and 41, wherein the hue area of the lightness areas other than the intermediate-lightness area, which are given a coefficient whose sign differs from that of the intermediate-lightness area of the skin color hue area, is a hue area within the skin color hue area.
- the form described in Item 44 is the image processing apparatus according to any one of Items 37, 38, 40 to 42, wherein the high-lightness skin color hue region includes the range of lightness values 170 to 224 in the HSV color system.
- Item 45 is the image processing apparatus according to any one of Items 37, 39 to 41, and 43, wherein the intermediate lightness area has a lightness value of 85 to 169 in the HSV color system. The range of the range is included.
- the mode described in Item 46 is the image processing device according to any one of Items 37, 39, 40 to 42, and 44, wherein the hue region other than the high-lightness skin color hue region includes at least one of a blue hue region and a green hue region.
- Item 47 is the image processing device according to any one of Items 37, 39 to 41, 43, and 45, wherein the lightness area other than the intermediate lightness area is a shadow area.
- Item 48 is the image processing device according to Item 46, wherein the hue value of the blue hue region is in the range of 161 to 250 in the HSV color system, and the hue value of the green hue region is in the range of 40 to 160 in the HSV color system.
- the form described in Item 49 is the image processing device according to Item 47, wherein the lightness value of the shadow region is in the range of 26 to 84 in the lightness value of the HSV color system.
- the mode described in Item 50 is the image processing device according to any one of Items 37 to 49.
- the hue value of the skin color hue region is in the range of 0 to 39 and 330 to 359 in the HSV color system.
- the mode described in Item 51 is the image processing device according to any one of Items 37 to 50, wherein the skin color hue area is divided into two areas by a predetermined conditional expression based on lightness and saturation.
- the mode described in Item 52 is the image processing device according to Item 36,
- the occupancy calculating unit divides the photographed image data into a plurality of predetermined regions each defined by a combination of the distance from the outer edge of the screen of the photographed image data and lightness, and calculates, for each of the divided regions,
- an occupancy indicating its ratio to the entire image data; the index calculation unit uses coefficients whose values differ according to the distance from the outer edge.
- the form described in Item 53 is the image processing device according to Item 52, further including a histogram creation unit that creates a two-dimensional histogram by calculating the cumulative number of pixels for each distance from the outer edge of the screen and lightness of the captured image data, and
- the occupancy is calculated based on the created two-dimensional histogram in the occupancy calculation unit.
- the mode described in Item 54 is the image processing device according to Item 36, wherein
- the occupancy calculating unit divides the captured image data into a plurality of regions each defined by a predetermined combination of lightness and hue and calculates, for each of the divided regions, a first occupancy indicating its ratio to the entire captured image data;
- it also divides the captured image data into a plurality of predetermined regions each defined by a combination of the distance from the outer edge of the screen of the captured image data and lightness, and calculates a second occupancy indicating the ratio of each of these regions to the whole,
- the calculated first occupancy rate and the second occupancy rate are multiplied by a coefficient set in advance according to shooting conditions to calculate an index for specifying a shooting scene.
- a first index is calculated using coefficients of different signs for a predetermined high-lightness skin color hue region and for hue regions other than the high-lightness skin color hue region, a second index is calculated using coefficients of different signs for the intermediate-lightness area of the skin color hue area and for lightness areas other than the intermediate-lightness area,
- and a third index is calculated using coefficients whose values differ according to the distance from the outer edge,
- the image processing condition determining unit determines the image processing condition based on the calculated first index, second index, and third index.
- the mode described in Item 55 is the image processing device according to Item 54,
- the index calculating unit multiplies each of the first index, the second index, and the third index by a coefficient preset according to the shooting conditions and combines them, thereby calculating a fourth index and a fifth index, and
- the image processing condition is determined based on the calculated fourth and fifth indices.
- the mode described in Item 56 is the image processing device according to Item 54 or 55, further including a histogram creation unit that creates a two-dimensional histogram by calculating the cumulative number of pixels for each distance from the outer edge of the screen and lightness of the captured image data, and an occupancy calculation unit that calculates the second occupancy based on the created two-dimensional histogram.
- the form described in Item 57 is the image processing device according to Item 54 or 55, wherein the two-dimensional histogram is created by calculating the cumulative number of pixels for each predetermined hue and brightness of the captured image data.
- the first occupancy ratio is calculated based on the generated two-dimensional histogram in the occupancy ratio calculation unit.
- the form described in Item 58 is the image processing device according to any one of Items 54 to 57, wherein the lightness area of the hue areas other than the high-lightness skin color hue area, which are given a coefficient whose sign differs from that of the predetermined high-lightness skin color hue region, is a predetermined high-lightness area.
- the mode according to Item 59 is the image processing device according to any one of Items 54 to 57, wherein the hue area of the lightness areas other than the intermediate-lightness area, which are given a coefficient whose sign differs from that of the intermediate-lightness area of the skin color hue area, is a hue area within the skin color hue area.
- the mode described in Item 60 is the image processing device according to any one of Items 54 to 58,
- the high lightness skin color hue region includes a lightness value range of 170 to 224 in the HSV color system.
- the form described in Item 61 is the image processing device according to any one of Items 54 to 57 and 59, wherein the intermediate lightness region includes the range of lightness values 85 to 169 in the HSV color system.
- the mode described in Item 62 is the image processing device according to any one of Items 54 to 58 and 60, wherein the hue area other than the high-lightness skin color hue area includes at least one of a blue hue area and a green hue area.
- Item 63 is the image processing device according to any one of Items 54 to 57, 59, and 61, wherein the lightness area other than the intermediate lightness area is a shadow area.
- the form described in Item 64 is the image processing device according to Item 62, wherein the hue value of the blue hue region is in the range of 161 to 250 in the HSV color system, and the hue value of the green hue region is in the range of 40 to 160 in the HSV color system.
- the form described in Item 65 is the image processing device according to Item 63, wherein the lightness value of the shadow region is in the range of 26 to 84 in the HSV color system.
- the form according to Item 66 is the image processing device according to any one of Items 54 to 65, wherein the hue value of the skin color hue region is in the ranges of 0 to 39 and 330 to 359 in the HSV color system.
- Item 67 is the image processing device according to any one of Items 54 to 66, wherein the skin color hue region is divided into two regions by a predetermined conditional expression based on lightness and saturation.
- the form described in Item 68 is the image processing device according to any one of Items 36 to 67,
- the image processing condition determining unit determines image processing conditions for performing a gradation conversion process on the captured image.
- the coefficient preset according to the shooting condition is a discrimination coefficient obtained using a discriminant analysis method.
- the form described in Item 70 is the image processing device according to Item 69, wherein the coefficient preset according to the shooting condition is the value of a discrimination coefficient of a discriminant function determined for a plurality of sample images prepared for each shooting condition and adjusted so as to satisfy a predetermined condition.
- the form described in Item 71 is
- the captured image data is divided into a plurality of regions each defined by a combination of at least one of a predetermined lightness and hue, the distance from the outer edge of the screen of the captured image data, and lightness; and
- an occupancy calculating unit that calculates, for each of the divided regions, an occupancy indicating its ratio to the entire image data;
- An index calculation unit that calculates an index that specifies a shooting scene by multiplying the calculated occupancy of each area by a coefficient that is set in advance according to shooting conditions;
- An image processing condition determining unit that determines an image processing condition for the captured image data based on the calculated index
- An image processing unit that performs image processing on the captured image data according to the determined image processing conditions
- An image data forming unit that forms the image data subjected to the image processing on an output medium.
- the mode described in Item 72 is the image recording device according to Item 71,
- the occupancy calculating unit divides the captured image data into a plurality of regions, each a predetermined combination of brightness and hue, and calculates, for each of the divided regions, an occupancy indicating the ratio of that region to the entire captured image data, and
- the index is calculated using at least one of (a) coefficients whose signs differ between a predetermined high lightness skin color hue area and hue areas other than the high lightness skin color hue area, and (b) coefficients whose signs differ between the intermediate lightness area of the skin color hue area and lightness areas other than the intermediate lightness area.
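The index computation these items describe, a coefficient-weighted sum of per-region occupancies in which some coefficients carry opposite signs, can be sketched as follows. The coefficient values here are hypothetical placeholders; the text derives the actual values by discriminant analysis.

```python
def scene_index(occupancy, coeff):
    """Weighted sum of per-region occupancy ratios. Regions whose
    coefficients carry opposite signs pull the index in opposite
    directions, which is the mechanism the items above describe."""
    return sum(coeff.get(region, 0.0) * ratio
               for region, ratio in occupancy.items())

# Hypothetical coefficients keyed by (hue region, lightness region);
# note the sign difference between the skin-tone intermediate-lightness
# cell and the other cells, as required by the items above.
coeff = {
    ("skin", "intermediate"): +8.0,
    ("skin", "shadow"):       -3.0,
    ("blue", "high"):         -5.0,
}
```

With occupancies summing to the fractions of the image in each cell, a large positive index indicates a scene dominated by mid-tone skin areas.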
- the form described in Item 73 is the image recording device according to Item 72, wherein
- the sign of a coefficient used for a predetermined high lightness skin color hue region and the sign of a coefficient used for a hue region other than the high lightness skin color hue region are different.
- the mode described in Item 74 is the image recording device according to Item 72, wherein
- the sign of the coefficient used for the intermediate lightness area of the skin color hue area and the sign of the coefficient used for lightness areas other than the intermediate lightness area are different.
- the mode described in Item 75 is the image recording device according to Item 72, wherein
- a first index is calculated using coefficients of different signs for a predetermined high lightness skin color hue area and hue areas other than the high lightness skin color hue area,
- a second index is calculated using coefficients of different signs for the intermediate lightness area of the skin color hue area and lightness areas other than the intermediate lightness area, and
- the image processing condition is determined based on the calculated first index and second index.
- the mode described in Item 76 is the image recording device according to any one of Items 72 to 75,
- further comprising a histogram creating unit that creates a two-dimensional histogram by calculating the cumulative number of pixels for each predetermined hue and brightness of the captured image data, wherein the occupancy calculating unit calculates the occupancy based on the created two-dimensional histogram.
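The histogram creating unit's role, accumulating pixel counts over a hue x lightness grid and normalizing by the total pixel count to obtain occupancy ratios, might be sketched with NumPy (an implementation choice assumed here, not stated in the text):

```python
import numpy as np

def occupancy_from_histogram(hues, values, hue_bins, value_bins):
    """Build a 2-D (hue x lightness) histogram of cumulative pixel
    counts, then divide by the total so each cell holds the occupancy
    ratio of that hue/lightness region within the whole image."""
    hist, _, _ = np.histogram2d(hues, values, bins=[hue_bins, value_bins])
    return hist / hist.sum()
```

The bin edges would be chosen to match the region boundaries the items quote (e.g. lightness edges at 26, 85, 170, 225).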
- the form according to Item 77 is the image recording device according to any one of Items 72, 73, 75, and 76, wherein
- the lightness area of the hue areas other than the high lightness skin color hue area, for which a coefficient having a sign different from that used for the predetermined high lightness skin color hue area is used, is a predetermined high lightness area.
- the form according to Item 78 is the image recording device according to any one of Items 72, 74, 75, and 76, wherein
- the hue area of the lightness areas other than the intermediate lightness area, for which a coefficient having a sign different from that used for the intermediate lightness area of the skin color hue area is used, is a hue area within the skin color hue area.
- the form according to Item 79 is the image recording device according to any one of Items 72, 73, and 75 to 77, wherein the high lightness skin color hue region includes an area whose lightness value is in the range of 170 to 224 in the HSV color system.
- the mode described in Item 80 is the image recording device according to any one of Items 72, 74 to 76, and 78, wherein the intermediate lightness area includes an area whose lightness value is in the range of 85 to 169 in the HSV color system.
- the mode according to Item 81 is the image recording device according to any one of Items 72, 73, 75 to 77, and 79, wherein the hue regions other than the high lightness skin color hue region include at least one of a blue hue region and a green hue region.
- the form described in Item 82 is the image recording device according to any one of Items 72, 74 to 76, 78, and 80, wherein a lightness area other than the intermediate lightness area is a shadow area.
- the form described in Item 83 is the image recording device wherein the hue value of the blue hue region is in the range of 161 to 250 in the HSV color system, and the hue value of the green hue region is in the range of 40 to 160 in the HSV color system.
- the form described in Item 84 is the image recording device according to Item 82, wherein the lightness value of the shadow area is in the range of 26 to 84 as a lightness value of the HSV color system.
- the form described in Item 85 is the image recording apparatus according to any one of Items 72 to 84, wherein the hue value of the skin color hue region is in the range of 0 to 39 and 330 to 359 as a hue value of the HSV color system.
- the form described in Item 86 is the image recording device according to any one of Items 72 to 85, wherein the skin color hue region is divided into two regions by a predetermined conditional expression based on lightness and saturation.
- the form described in Item 87 is the image recording device according to Item 71, wherein
- the occupancy calculating unit divides the captured image data into a plurality of predetermined regions, each a combination of the distance from the outer edge of the screen of the captured image data and brightness, and calculates, for each of the divided regions, an occupancy indicating the ratio of that region to the entire captured image data, and
- in the index calculation, coefficients having different values according to the distance from the outer edge are used.
- the form described in Item 88 is the image recording device according to Item 87, further comprising
- a histogram creation unit that creates a two-dimensional histogram by calculating the cumulative number of pixels for each of the distance and brightness from the outer edge of the screen of the captured image data
- the occupancy calculation unit calculates the occupancy based on the created two-dimensional histogram.
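One plausible reading of "regions each combining the distance from the outer edge of the screen and brightness" is a partition of the frame into concentric rings, each then subdivided by brightness. The ring labelling below is a hypothetical sketch of that reading, not the patent's stated partition:

```python
import numpy as np

def edge_distance_regions(h, w, n_rings=3):
    """Label each pixel of an h x w frame with a ring index based on its
    distance from the nearest screen edge: 0 = outermost ring,
    n_rings - 1 = centre. A 2-D histogram over (ring, brightness) then
    yields the occupancy ratios Item 88 computes."""
    ys, xs = np.mgrid[0:h, 0:w]
    # Distance to the nearest of the four edges.
    d = np.minimum.reduce([ys, xs, h - 1 - ys, w - 1 - xs])
    max_d = (min(h, w) - 1) // 2
    # Quantize the distance into n_rings bands, clamped at the centre.
    return np.minimum(d * n_rings // max(max_d, 1), n_rings - 1)
```

Coefficients that differ by ring weight central regions (where the main subject typically sits) differently from the periphery.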
- the form described in Item 89 is the image recording device according to Item 71, wherein
- the captured image data is divided into a plurality of regions, each a predetermined combination of brightness and hue, and a first occupancy indicating the ratio of each divided region to the entire captured image data is calculated; the captured image data is also divided into a plurality of predetermined regions, each a combination of the distance from the outer edge of the screen and brightness, and a second occupancy indicating the ratio of each divided region to the entire captured image data is calculated,
- an index for specifying the shooting scene is calculated by multiplying the calculated first occupancy and second occupancy by coefficients set in advance according to shooting conditions,
- a first index is calculated using a coefficient of a different sign between a predetermined high lightness skin color hue region and a hue region other than the high lightness skin color hue region
- a second index is calculated using coefficients of different signs for the intermediate lightness area of the skin color hue area and lightness areas other than the intermediate lightness area,
- a third index is calculated using coefficients having different values according to the distance from the outer edge, and
- the image processing condition determining unit determines the image processing condition based on the calculated first index, second index, and third index.
- the form according to Item 90 is the image recording device according to Item 89,
- the index calculating unit calculates a fourth index and a fifth index by multiplying each of the first index, the second index, and the third index by a coefficient set in advance according to shooting conditions and combining the results, and
- the image processing condition is determined based on the calculated fourth index and fifth index.
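The fourth and fifth indices of Item 90, obtained by multiplying the first three indices by preset coefficients and combining the results, reduce to two weighted sums. The weight tuples below stand in for the patent's shooting-condition-dependent coefficients, which the text does not enumerate:

```python
def combined_indices(idx1, idx2, idx3, w4, w5):
    """Combine the first three scene indices into a fourth and fifth
    index as weighted sums. w4 and w5 are 3-tuples of hypothetical
    coefficients preset per shooting condition."""
    idx4 = w4[0] * idx1 + w4[1] * idx2 + w4[2] * idx3
    idx5 = w5[0] * idx1 + w5[1] * idx2 + w5[2] * idx3
    return idx4, idx5
```

The image processing condition (e.g. the gradation conversion) is then chosen from the (idx4, idx5) pair rather than from any single index.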
- the form according to Item 91 is the image recording device according to Item 89 or 90,
- a histogram creation unit that creates a two-dimensional histogram by calculating the cumulative number of pixels for each of the distance and brightness from the outer edge of the screen of the captured image data
- the occupancy calculation unit calculates the second occupancy based on the created two-dimensional histogram.
- the form according to Item 92 is the image recording device according to Item 89 or 90, further comprising a histogram creating unit that creates a two-dimensional histogram by calculating the cumulative number of pixels for each predetermined hue and brightness of the captured image data, wherein the first occupancy is calculated based on the created two-dimensional histogram.
- the mode according to Item 93 is the image recording device according to any one of Items 89 to 92, wherein
- the lightness area of the hue areas other than the high lightness skin color hue area, for which a coefficient having a sign different from that used for the predetermined high lightness skin color hue area is used, is a predetermined high lightness area.
- the form according to Item 94 is the image recording device according to any one of Items 89 to 93, wherein
- the hue area of the lightness areas other than the intermediate lightness area, for which a coefficient having a sign different from that used for the intermediate lightness area of the skin color hue area is used, is a hue area within the skin color hue area.
- the form described in Item 95 is the image recording device according to any one of Items 89 to 93, wherein
- the high lightness skin color hue region includes a lightness value range of 170 to 224 in the HSV color system.
- the form described in Item 96 is the image recording device according to any one of Items 89 to 92 and 94, wherein the intermediate lightness region includes an area whose lightness value is in the range of 85 to 169 in the HSV color system.
- the form described in Item 97 is the image recording apparatus according to any one of Items 89 to 93 and 95, wherein the hue areas other than the high lightness skin color hue area include at least one of a blue hue region and a green hue region.
- the form described in Item 98 is the image recording apparatus according to any one of Items 89 to 92, 94, and 96, wherein the lightness area other than the intermediate lightness area is a shadow area.
- the form described in Item 99 is the image recording device according to Item 97, wherein the hue value of the blue hue region is in the range of 161 to 250 in the HSV color system, and the hue value of the green hue region is in the range of 40 to 160 in the HSV color system.
- the form described in Item 101 is the image recording device according to any one of Items 89 to 100, wherein the hue value of the skin color hue region is in the range of 0 to 39 and 330 to 359 as a hue value of the HSV color system.
- the mode described in Item 102 is the image recording device according to any one of Items 89 to 101, wherein the skin color hue area is divided into two areas by a predetermined conditional expression based on lightness and saturation.
- the mode according to Item 103 is the image recording device according to any one of Items 71 to 102, wherein the image processing condition determining unit determines an image processing condition for performing a gradation conversion process on the captured image.
- the form described in Item 104 is the image recording device wherein the coefficient preset in accordance with the imaging conditions is a discrimination coefficient obtained using a discriminant analysis method.
- the form described in Item 105 is the image recording device according to Item 104, wherein
- the coefficient preset according to the shooting conditions is a value of a discrimination coefficient adjusted so that a discriminant function satisfies a predetermined condition for a plurality of sample images prepared for each shooting condition.
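A discrimination coefficient "adjusted so that a discriminant function satisfies a predetermined condition for a plurality of sample images" is consistent with two-class linear discriminant analysis; the sketch below shows one standard form (Fisher's criterion, with a small regularizer added as an implementation assumption, not from the text). Occupancy vectors from sample images of one shooting condition form one class, the remaining samples the other, and the resulting weights naturally take opposite signs on features typical of opposite classes, matching the sign-differing coefficients described in the items above.

```python
import numpy as np

def fisher_discriminant(X_pos, X_neg):
    """Two-class linear discriminant analysis. Rows of X_pos / X_neg are
    per-image occupancy vectors for the two shooting-condition classes.
    Returns weights w maximizing between-class separation relative to
    within-class scatter: w = Sw^-1 (mu_pos - mu_neg)."""
    mu_p, mu_n = X_pos.mean(axis=0), X_neg.mean(axis=0)
    # Pooled within-class scatter matrix.
    Sw = (np.cov(X_pos.T) * (len(X_pos) - 1)
          + np.cov(X_neg.T) * (len(X_neg) - 1))
    # Small ridge term keeps Sw invertible (an assumption for stability).
    return np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), mu_p - mu_n)
```

The sign pattern of the returned weights is exactly what Items 72 to 75 describe: occupancy cells typical of one class receive positive coefficients, cells typical of the other receive negative ones.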
- the form described in Item 106 is an image processing program for causing a computer to execute: an occupancy calculating step of dividing the captured image data into a plurality of regions, each a predetermined combination of at least one of (a) brightness and hue and (b) the distance from the outer edge of the screen of the captured image data and brightness, and calculating, for each divided region, an occupancy indicating the ratio of that region to the entire captured image data; an index calculation step of calculating an index specifying the shooting scene by multiplying each calculated occupancy by a coefficient set in advance according to shooting conditions; and an image processing condition determining step of determining an image processing condition for the captured image data based on the calculated index.
- the mode described in Item 107 is an image processing program according to Item 106,
- the captured image data is divided into a plurality of regions each having a predetermined combination of lightness and hue, and a ratio of the divided plurality of regions to the entire captured image data is indicated. Calculate the occupancy,
- the index is calculated using at least one of (a) coefficients whose signs differ between a coefficient used for a predetermined high lightness skin color hue area and a coefficient used for hue areas other than the high lightness skin color hue area, and (b) coefficients whose signs differ between a coefficient used for the intermediate lightness area of the skin color hue area and a coefficient used for lightness areas other than the intermediate lightness area.
- the form described in Item 108 is the image processing program according to Item 107, wherein in the index calculation step, the sign of the coefficient used for a predetermined high lightness skin color hue region and the sign of the coefficient used for hue regions other than the high lightness skin color hue region are different.
- the form described in Item 109 is the image processing program according to Item 107, wherein in the index calculation step, the sign of the coefficient used for the intermediate lightness area of the skin color hue area and the sign of the coefficient used for lightness areas other than the intermediate lightness area are different.
- the form described in Item 110 is the image processing program according to Item 107, wherein
- a first index is calculated using coefficients of different signs for a predetermined high lightness skin color hue area and hue areas other than the high lightness skin color hue area,
- a second index is calculated using coefficients of different signs for the intermediate lightness area of the skin color hue area and lightness areas other than the intermediate lightness area, and
- the image processing condition is determined based on the calculated first index and second index.
- the form described in Item 111 is the image processing program according to any one of Items 107 to 110, including a histogram creation step of creating a two-dimensional histogram by calculating the cumulative number of pixels for each predetermined hue and brightness of the captured image data, wherein
- the occupancy is calculated based on the created two-dimensional histogram.
- the form described in Item 112 is the image processing program according to any one of Items 107, 108, 110, and 111, wherein
- the lightness area of the hue areas other than the high lightness skin color hue area, for which a coefficient having a sign different from that used for the predetermined high lightness skin color hue area is used, is a predetermined high lightness area.
- the form described in Item 114 is the image processing program according to any one of Items 107, 108, and 110 to 112, wherein the high lightness skin color hue region includes an area whose lightness value is in the range of 170 to 224 in the HSV color system.
- the form described in Item 115 is the image processing program according to any one of Items 107, 109 to 111, and 113, wherein the intermediate lightness region includes an area whose lightness value is in the range of 85 to 169 in the HSV color system.
- the form described in Item 116 is the image processing program according to any one of Items 107, 108, 110 to 112, and 114, wherein the hue areas other than the high lightness skin color hue area include at least one of a blue hue region and a green hue region.
- the form described in Item 117 is the image processing program according to any one of Items 107, 109 to 111, 113, and 115, wherein a lightness region other than the intermediate lightness region is a shadow region.
- the form described in Item 118 is the image processing program according to Item 116, wherein the hue value of the blue hue region is in the range of 161 to 250 in the HSV color system, and the hue value of the green hue region is in the range of 40 to 160 in the HSV color system.
- the form described in Item 119 is the image processing program according to Item 117, wherein the lightness value of the shadow region is in the range of 26 to 84 as a lightness value of the HSV color system.
- the form described in Item 120 is the image processing program according to any one of Items 107 to 119, wherein the hue value of the skin color hue region is in the range of 0 to 39 and 330 to 359 as a hue value of the HSV color system.
- the form described in Item 121 is the image processing program according to any one of Items 107 to 120, wherein the skin color hue area is divided into two areas by a predetermined conditional expression based on lightness and saturation.
- the form described in Item 122 is the image processing program according to Item 106, wherein in the occupancy calculating step, the captured image data is divided into a plurality of predetermined regions, each a combination of the distance from the outer edge of the screen of the captured image data and brightness, and an occupancy indicating the ratio of each divided region to the entire captured image data is calculated, and in the index calculation step, coefficients having different values according to the distance from the outer edge are used.
- the form described in Item 123 is the image processing program according to Item 122, including a histogram creation step of creating a two-dimensional histogram by calculating the cumulative number of pixels for each of the distance from the outer edge of the screen and brightness of the captured image data, wherein
- the occupancy is calculated based on the created two-dimensional histogram.
- the mode described in Item 124 is the image processing program described in Item 106,
- the captured image data is divided into a plurality of regions, each a predetermined combination of brightness and hue, and a first occupancy indicating the ratio of each divided region to the entire captured image data is calculated, and the captured image data is also divided into a plurality of predetermined regions, each a combination of the distance from the outer edge of the screen of the captured image data and brightness, and a second occupancy indicating the ratio of each divided region to the entire captured image data is calculated,
- an index for specifying a shooting scene is calculated by multiplying the calculated first occupancy and second occupancy by a coefficient set in advance according to shooting conditions.
- a first index is calculated using coefficients of different signs for a predetermined high lightness skin color hue area and hue areas other than the high lightness skin color hue area,
- a second index is calculated using coefficients of different signs for the intermediate lightness area of the skin color hue area and lightness areas other than the intermediate lightness area, and a third index is calculated using coefficients having different values according to the distance from the outer edge, and
- in the image processing condition determining step, the image processing condition is determined based on the calculated first index, second index, and third index.
- the mode described in Item 125 is the image processing program according to Item 124, wherein
- in the index calculation step, a fourth index and a fifth index are calculated by multiplying each of the first index, the second index, and the third index by a coefficient set in advance according to shooting conditions and combining the results, and
- in the image processing condition determining step, the image processing condition is determined based on the calculated fourth index and fifth index.
- the form described in Item 126 is the image processing program according to Item 124 or 125, including a histogram creation step of creating a two-dimensional histogram by calculating the cumulative number of pixels for each of the distance from the outer edge of the screen and brightness of the captured image data, wherein the second occupancy is calculated based on the created two-dimensional histogram.
- the form described in Item 127 is the image processing program according to Item 124 or 125, including a histogram creation step of creating a two-dimensional histogram by calculating the cumulative number of pixels for each predetermined hue and brightness of the captured image data, wherein
- the first occupancy is calculated based on the created two-dimensional histogram.
- the form described in Item 128 is the image processing program according to any one of Items 124 to 127, wherein
- the lightness area of the hue areas other than the high lightness skin color hue area, for which a coefficient having a sign different from that used for the predetermined high lightness skin color hue area is used, is a predetermined high lightness area.
- the form described in Item 129 is the image processing program according to any one of Items 124 to 127, wherein
- the hue area of the lightness areas other than the intermediate lightness area, for which a coefficient having a sign different from that used for the intermediate lightness area of the skin color hue area is used, is a hue area within the skin color hue area.
- the form according to Item 130 is the image processing program according to any one of Items 124 to 128, wherein the high lightness skin color hue region includes an area whose lightness value is in the range of 170 to 224 in the HSV color system.
- the form according to Item 131 is the image processing program according to any one of Items 124 to 127 and 129, wherein the intermediate lightness region includes an area whose lightness value is in the range of 85 to 169 in the HSV color system.
- the mode described in Item 132 is the image processing program according to any one of Items 124 to 128 and 130, wherein
- the hue areas other than the high lightness skin color hue area include at least one of a blue hue region and a green hue region.
- the form described in Item 133 is the image processing program according to any one of Items 124 to 127, 129, and 131, wherein a lightness area other than the intermediate lightness area is a shadow area.
- the form described in Item 134 is the image processing program according to Item 132, wherein the hue value of the blue hue region is in the range of 161 to 250 in the HSV color system, and the hue value of the green hue region is in the range of 40 to 160 in the HSV color system.
- the form described in Item 135 is the image processing program according to Item 133, wherein the lightness value of the shadow area is in the range of 26 to 84 as a lightness value of the HSV color system.
- the form described in Item 136 is the image processing program according to any one of Items 124 to 135, wherein the hue value of the skin color hue region is in the range of 0 to 39 and 330 to 359 as a hue value of the HSV color system.
- the form described in Item 137 is the image processing program according to any one of Items 124 to 136, wherein the skin color hue area is divided into two areas by a predetermined conditional expression based on lightness and saturation.
- the form described in Item 138 is the image processing program according to any one of Items 106 to 137, wherein in the image processing condition determining step, an image processing condition for performing a gradation conversion process on the captured image is determined.
- the form described in Item 139 is the image processing program wherein the coefficient preset according to the imaging condition is a discrimination coefficient obtained using a discriminant analysis method.
- the mode described in Item 140 is the image processing program described in Item 139,
- the coefficient preset according to the shooting conditions is a value of a discrimination coefficient adjusted so that a discriminant function satisfies a predetermined condition for a plurality of sample images prepared for each shooting condition.
- FIG. 1 is a perspective view showing an external configuration of an image recording apparatus 1 according to an embodiment of the present invention.
- the image recording apparatus 1 has a magazine loading section 3 for loading a photosensitive material on one side surface of a housing 2. Inside the housing 2, there are provided an exposure processing section 4 for exposing the photosensitive material, and a print making section 5 for developing and drying the exposed photosensitive material to make a print.
- the other side of the housing 2 is provided with a tray 6 for discharging the print created by the print creation unit 5.
- a CRT (Cathode Ray Tube) 8 as a display device, a film scanner unit 9 for reading a transparent original, a reflective original input device 10, and an operation unit 11 are provided on the upper part of the housing 2.
- the CRT 8 constitutes display means for displaying an image of the image information to be printed on the screen.
- the housing 2 is provided with an image reading unit 14 capable of reading image information recorded on various digital recording media, and an image writing unit 15 capable of writing (outputting) image signals on various digital recording media.
- a control unit 7 for centrally controlling these units is provided inside the housing 2.
- the image reading unit 14 is provided with a PC card adapter 14a and a floppy (registered trademark) disk adapter 14b, so that the PC card 13a and the floppy (registered trademark) disk 13b can be inserted.
- the PC card 13a has, for example, a memory in which a plurality of frame image data captured by a digital camera is recorded.
- a plurality of frame image data captured by a digital camera is recorded on the floppy disk 13b.
- recording media on which frame image data is recorded, other than the PC card 13a and the floppy (registered trademark) disk 13b, include, for example, a multimedia card (registered trademark), a memory stick (registered trademark), MD data, and CD-ROM.
- the image writing unit 15 is provided with a floppy (registered trademark) disk adapter 15a, an MO adapter 15b, and an optical disk adapter 15c, so that a floppy (registered trademark) disk 16a, an MO 16b, and an optical disk 16c can be inserted.
- the optical disk 16c includes a CD-R, a DVD-R and the like.
- in the present embodiment, the operation unit 11, the CRT 8, the film scanner unit 9, the reflective original input device 10, and the image reading unit 14 are integrally provided in the housing 2, but any one or more of them may be provided separately.
- a photosensitive material is exposed and developed to produce a print.
- the print creation method is not limited to this, and for example, a method such as an ink jet method, an electrophotographic method, a heat-sensitive method, or a sublimation method may be used.
- FIG. 2 shows a main configuration of the image recording apparatus 1.
- the image recording apparatus 1 includes a control unit 7, an exposure processing unit 4, a print generation unit 5, a film scanner unit 9, a reflection document input unit 10, an image reading unit 14, a communication unit (input) 32, It comprises an image writing unit 15, a data storage unit 71, a template storage unit 72, an operation unit 11, a CRT 8, and a communication unit (output) 33.
- the control unit 7 is configured by a microcomputer, and includes various control programs stored in a storage unit (not shown) such as a ROM (Read Only Memory), a CPU (Central Processing Unit) (not shown), The operation of each unit constituting the image recording apparatus 1 is controlled by the cooperation of the above.
- the control section 7 has an image processing section 70 according to the image processing apparatus of the present invention. Based on an input signal (command information) from the operation section 11, the image processing section 70 performs image processing on the image signal read by the film scanner section 9 or the reflective original input device 10, the image signal read by the image reading unit 14, and the image signal input from an external device via the communication means (input) 32, forms image information for exposure, and outputs it to the exposure processing unit 4.
- the image processing unit 70 performs conversion processing according to the output form on the image signal that has been subjected to the image processing, and outputs the result.
- the output destination of the image processing unit 70 includes the CRT 8, the image writing unit 15, the communication means (output) 33, and the like.
- the exposure processing section 4 exposes the photosensitive material to an image, and outputs the photosensitive material to the print creating section 5.
- the print creating section 5 develops the exposed photosensitive material and dries it to create prints Pl, P2 and P3.
- Print P1 is a print of service size, high-definition size, panorama size, etc.
- Print P2 is an A4 size print
- print P3 is a business card size print.
- the exposure processing section 4 and the print creation section 5 may be combined into an image data creation section.
- the film scanner section 9 reads a frame image recorded on a transparent original such as a developed negative film N or a reversal film captured by an analog camera, and acquires a digital image signal of the frame image.
- the reflective original input device 10 reads an image on a printout (photographic prints, documents, various printed materials) and acquires a digital image signal.
- the image reading section 14 reads out the frame image information recorded on the PC card 13a or the floppy (registered trademark) disk 13b and transfers it to the control section 7.
- the image reading unit 14 has a PC card adapter 14a, a floppy (registered trademark) disk adapter 14b, and the like as image transfer means 30.
- the image reading section 14 reads frame image information recorded on the PC card 13a inserted into the PC card adapter 14a or the floppy disk 13b inserted into the floppy disk adapter 14b.
- the PC card adapter 14a for example, a PC card reader or a PC card slot is used.
- the communication means (input) 32 receives an image signal representing a captured image and a print command signal from another computer in the facility where the image recording apparatus 1 is installed, or from a distant computer via the Internet or the like.
- the image writing unit 15 includes a floppy (registered trademark) disk adapter 15a, an MO adapter 15b, and an optical disk adapter 15c as the image transport unit 31.
- in accordance with a write signal input from the control unit 7, the image writing unit 15 writes the image signal generated by the image processing method of the present invention to the floppy (registered trademark) disk 16a inserted into the floppy (registered trademark) disk adapter 15a, the MO 16b inserted into the MO adapter 15b, or the optical disk 16c inserted into the optical disk adapter 15c.
- the data storage unit 71 stores image information and the corresponding order information (information on how many prints are to be created from which frame image, print size information, etc.) and sequentially accumulates them.
- the template storage means 72 stores, as sample image data corresponding to the sample identification information D1, D2, and D3, data of at least one template for setting a composite area with a background image, an illustration image, and the like.
- when a predetermined template is selected from the plurality of templates stored in the template storage means 72 by an operator's operation, the frame image information is combined with the selected template, the sample image data selected based on the designated sample identification information D1, D2, and D3 is combined with image data and/or character data based on the order, and a print based on the designated sample is created. This synthesis is performed by the well-known chroma key method.
- the sample identification information D1, D2, and D3 designating a print sample can be input from the operation unit 11. Since this information is recorded in a file, it can also be read by reading means such as an OCR, or input by an operator's keyboard operation.
- sample image data is recorded in correspondence with the sample identification information D1 designating a print sample; when the sample identification information D1 is input, sample image data is selected based on the input sample identification information D1, and the selected sample image data is combined with image data and/or character data based on the order to create a print based on the designated sample. This allows a user to order a print while actually viewing a sample, and makes it possible to meet the diverse requirements of a wide range of users.
- further, the first sample identification information D2 designating a first sample and the image data of the first sample are stored, and the second sample identification information D3 designating a second sample and the image data of the second sample are stored; the sample image data selected based on the designated first and second sample identification information D2 and D3 is combined with the image data and/or character data based on the order, and a print based on the designated samples is created. Therefore, a wider variety of images can be combined, and prints that meet the needs of an even wider range of users can be created.
- the operation unit 11 has information input means 12.
- the information input means 12 is composed of, for example, a touch panel or the like, and outputs a press signal of the information input means 12 to the control section 7 as an input signal.
- the operation unit 11 may be configured to include a keyboard, a mouse, and the like.
- the CRT 8 displays image information and the like according to the display control signal input from the control unit 7.
- the communication means (output) 33 transmits the image signal representing the captured image subjected to the image processing of the present invention and the accompanying order information to another computer in the facility where the image recording apparatus 1 is installed, or to a remote computer via the Internet or the like.
- as described above, the image recording apparatus 1 includes image input means for capturing image information obtained by dividing images of various digital media and image originals into sections and photometrically measuring them, image processing means, image output means for displaying the processed image, printing it out, and writing it to an image recording medium, and means for transmitting the image data and accompanying information to a separate facility through a communication line.
- FIG. 3 shows the internal configuration of the image processing unit 70.
- the image processing unit 70 includes an image adjustment processing unit 701, a film scan data processing unit 702, a reflection original scan data processing unit 703, an image data format decoding processing unit 704, a template processing unit 705, and CRT specific processing.
- the processing unit 706 includes a printer-specific processing unit A707, a printer-specific processing unit B708, and an image data format creation processing unit 709.
- the film scan data processing unit 702 performs processing such as calibration specific to the film scanner unit 9, negative/positive inversion (in the case of a negative original), dust and scratch removal, contrast adjustment, grain noise removal, and sharpening enhancement on the image data input from the film scanner unit 9, and outputs the processed image data to the image adjustment processing unit 701.
- it also outputs the film size, the negative/positive type, information on the main subject recorded optically or magnetically on the film, information on the shooting conditions (for example, the information content described in APS), and the like to the image adjustment processing unit 701.
- the reflection original scan data processing unit 703 performs processing such as calibration specific to the reflection original input device 10, negative/positive inversion (in the case of a negative original), dust and scratch removal, contrast adjustment, noise removal, and sharpening enhancement on the image data input from the reflection original input device 10, and outputs the processed image data to the image adjustment processing unit 701.
- the image data format decoding processing unit 704 performs, as necessary, processing such as decompression of the compression code and conversion of the color data representation method on the image data input from the image transfer means 30 and/or the communication means (input) 32 according to its data format, converts the data into a data format suitable for the operations in the image processing unit 70, and outputs it to the image adjustment processing unit 701. When the size of the output image is designated, the image data format decoding processing unit 704 detects the designated information and outputs it to the image adjustment processing unit 701.
- Information on the size of the output image is embedded in the header information and tag information of the image data acquired by the image transfer means 30.
- the image adjustment processing unit 701 applies the image processing described below (see FIGS. 5, 6, and 12) to the image data received from the film scanner unit 9, the reflection original input device 10, the image transfer means 30, the communication means (input) 32, and the template processing unit 705, based on commands from the operation unit 11 or the control unit 7, to generate digital image data optimized for viewing on the output medium.
- when the output destination is a CRT, processing is performed so as to obtain optimal color reproduction within the color gamut of the sRGB standard; when it is silver halide photographic paper, processing is performed so that optimal color reproduction is obtained within the color gamut of silver halide photographic paper. In addition to color gamut compression, this includes gradation compression from 16 bits to 8 bits, reduction of the number of output pixels, processing corresponding to the output characteristics (LUT) of the output device, and the like. Furthermore, processing such as noise suppression, sharpening, gray balance adjustment, saturation adjustment, and dodging and burning is performed.
- the image adjustment processing unit 701 includes a scene determination unit 710 and a gradation conversion unit 711.
- FIG. 4 shows the internal configuration of the scene determination unit 710.
- the scene determining unit 710 includes a ratio calculating unit 712, an index calculating unit 713, and an image processing condition determining unit 714, as shown in FIG.
- the ratio calculation unit 712 includes a color system conversion unit 715, a histogram creation unit 716, and an occupancy calculation unit 717.
- the color system conversion unit 715 converts the RGB (Red, Green, Blue) values of the captured image data into the HSV color system.
- the HSV color system expresses image data with three elements: hue (Hue), saturation (Saturation), and lightness (Value or Brightness), and was devised based on the color system proposed by Munsell.
- in the present embodiment, unless otherwise specified, "lightness" means the generally used "brightness".
- in the following description, V (0 to 255) of the HSV color system is used as "lightness", but a unit system representing the brightness of any other color system may be used. In that case, it goes without saying that the values of the various coefficients and the like described in the present embodiment are recalculated.
- the captured image data in the present embodiment is image data in which a person is a main subject.
- the histogram creating section 716 creates a two-dimensional histogram by dividing the captured image data into regions each consisting of a predetermined combination of hue and lightness and calculating the cumulative number of pixels for each divided region. Further, the histogram creating unit 716 creates a two-dimensional histogram by dividing the captured image data into predetermined regions each consisting of a combination of the distance from the outer edge of the screen and the lightness of the captured image data and calculating the cumulative number of pixels for each divided region. Alternatively, a three-dimensional histogram may be created by dividing the captured image data into regions each consisting of a combination of the distance from the outer edge of the screen, lightness, and hue and calculating the cumulative number of pixels for each divided region. In the following, it is assumed that the two-dimensional histogram method is adopted.
- the occupancy calculation unit 717 calculates, for each region divided by the combination of lightness and hue, a first occupancy indicating the ratio of the cumulative number of pixels calculated by the histogram creation unit 716 to the total number of pixels (the entire captured image data) (see Table 1). In addition, the occupancy calculation unit 717 calculates, for each region divided by the combination of the distance from the outer edge of the screen and the lightness of the captured image data, a second occupancy indicating the ratio of the cumulative number of pixels calculated by the histogram creation unit 716 to the total number of pixels (see Table 4).
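The histogram and occupancy steps above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the lightness band edges and region names are assumptions chosen to match the bands mentioned later in the text (shadow 26-84, intermediate 85-169, high 170-224), and the patent's actual partition follows Tables 1 and 4.

```python
from collections import Counter

def hue_region(h):
    # Hue areas from the text: skin color (0-39 and 330-359), green
    # (40-160), blue (161-250); the remaining red area (H5) is ignored
    # in the later index calculations.
    if h <= 39 or h >= 330:
        return "skin"
    if 40 <= h <= 160:
        return "green"
    if 161 <= h <= 250:
        return "blue"
    return "red"

def lightness_band(v, edges=(26, 85, 170)):
    # Hypothetical band edges matching the lightness bands the text
    # mentions (shadow 26-84, intermediate 85-169, high 170-224).
    band = 0
    for e in edges:
        if v >= e:
            band += 1
    return band

def first_occupancy(pixels):
    """pixels: iterable of (hue 0-359, lightness 0-255) pairs.
    Returns a dict mapping (lightness band, hue region) to its occupancy
    R_ij, i.e. cumulative pixel count divided by total pixel count."""
    counts = Counter((lightness_band(v), hue_region(h)) for h, v in pixels)
    total = sum(counts.values())
    return {key: c / total for key, c in counts.items()}
```

The second occupancy (distance-from-edge vs. lightness) follows the same pattern with a region index in place of the hue region.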
- the index calculation unit 713 calculates index 1 for specifying the shooting scene by multiplying the first occupancy calculated for each area by the occupancy calculation unit 717 by a first coefficient (see Table 2) preset according to the shooting conditions and taking the sum.
- the shooting scene indicates the light source condition at the time of shooting the subject, such as direct light, backlight, and strobe.
- index 1 compositely indicates characteristics of strobe photography, such as indoor photography, close-up photography, and high brightness of the skin color, and serves to separate only images that should be identified as "strobe" from other shooting scenes (light source conditions).
- when calculating index 1, the index calculating unit 713 uses coefficients of different signs in a predetermined high-lightness skin color hue area and in hue areas other than the high-lightness skin color hue area.
- the predetermined high-lightness skin color hue area includes, for example, an area with lightness values of 170 to 224 in the HSV color system.
- the hue areas other than the predetermined high-lightness skin color hue area include at least one of the blue hue area (hue values 161 to 250) and the green hue area (hue values 40 to 160).
- further, the index calculating unit 713 calculates index 2 for specifying the shooting scene by multiplying the first occupancy calculated for each area by the occupancy calculating unit 717 by a second coefficient (see Table 3) preset according to the shooting conditions and taking the sum.
- index 2 compositely indicates characteristics of backlight shooting, such as outdoor shooting, high brightness of sky blue, and low brightness of the face color, and serves to separate only images that should be determined as "backlight" from other shooting scenes (light source conditions).
- when calculating index 2, the index calculating unit 713 uses coefficients of different signs in the intermediate lightness area of the skin color hue area (hue values 0 to 39 and 330 to 359) and in lightness areas other than the intermediate lightness area.
- the intermediate lightness area of the skin color hue area includes an area having a lightness value of 85 to 169.
- the lightness area other than the intermediate lightness area includes, for example, a shadow area (lightness values 26 to 84).
- further, the index calculating unit 713 calculates index 3 for specifying the shooting scene by multiplying the second occupancy calculated for each area by the occupancy calculating unit 717 by a third coefficient (see Table 5) preset according to the shooting conditions and taking the sum.
- index 3 indicates, in terms of the lightness relationship between the center and the periphery of the screen of the captured image data, the difference between backlight and strobe, and quantitatively indicates only images that should be determined as backlight or strobe.
- the index calculation unit 713 uses coefficients of different values according to the distance of the captured image data from the outer edge of the screen.
- the index calculating unit 713 calculates the index 4 by multiplying the index 1 and the index 3 by a coefficient preset according to the imaging condition and combining them. Further, the index calculating unit 713 calculates the index 5 by multiplying each of the index 1, the index 2 and the index 3 by a coefficient preset according to the photographing condition, and combining them. A specific method of calculating the indexes 1 to 5 in the index calculation unit 713 will be described in detail in the operation description of the present embodiment described later.
- the image processing condition determining unit 714 determines the shooting scene (light source condition) based on the values of index 4 and index 5 calculated by the index calculating unit 713, and determines the image processing conditions (gradation conversion processing conditions) for the captured image data based on the determination result, the indices 4 and 5 calculated by the index calculating unit 713, and other various parameters (such as the average luminance of the captured image data).
- the gradation conversion unit 711 performs gradation conversion of the captured image data according to the image processing conditions (gradation conversion processing conditions) determined by the image processing condition determination unit 714.
- the template processing unit 705 reads out predetermined image data (a template) from the template storage means 72 based on a command from the image adjustment processing unit 701, performs template processing for combining the image data to be image-processed with the template, and outputs the image data after the template processing to the image adjustment processing unit 701.
- the CRT-specific processing unit 706 performs processing such as changing the number of pixels and color matching as necessary on the image data input from the image adjustment processing unit 701, combines it with information that needs to be displayed, such as control information, and outputs the resulting display image data to the CRT 8.
- the printer-specific processing unit A707 performs printer-specific calibration processing, color matching, pixel number change, etc., as necessary, and outputs the processed image data to the exposure processing unit 4.
- a printer-specific processing unit B708 is provided for each connected printer.
- the printer-specific processing unit B708 performs printer-specific calibration processing, color matching, and pixel number change processing, and outputs processed image data to the external printer 51.
- the image data format creation processing unit 709 converts the image data input from the image adjustment processing unit 701 into various general-purpose image formats, such as JPEG, TIFF, and Exif, as necessary. And outputs the processed image data to the image transport unit 31 and the communication means (output) 33.
- the above division of the image processing unit 70 is provided to help understand its functions and does not necessarily have to be implemented as physically independent devices; for example, it may be implemented as a division of types of software processing performed by a single CPU.
- first, the captured image data is divided into predetermined image regions, and an occupancy ratio calculation process is performed to calculate the occupancy indicating the ratio of each divided region to the entire captured image data (step S1). The details of the occupancy calculation process will be described later with reference to the drawings.
- next, indices (indices 1 to 5) that quantitatively represent the shooting scene (light source condition) are calculated based on the occupancy calculated by the ratio calculation unit 712 and coefficients preset according to the shooting conditions (step S2). The index calculation process in step S2 will be described later in detail.
- next, the shooting scene is determined based on the indices calculated in step S2, and the image processing conditions (gradation conversion processing conditions) for the captured image data are determined according to the determination result (step S3).
- the RGB values of the captured image data are converted to the HSV color system (step S10).
- FIG. 7 shows an example of a conversion program (HSV conversion program) that obtains hue, saturation, and lightness values by converting RGB to the HSV color system, written in program code (C language). In the HSV conversion program shown in FIG. 7, the values of the digital image data that is the input image data are defined as InR, InG, and InB, the calculated hue value is defined as OutH with a scale of 0 to 360, and the saturation value and lightness value are defined as OutS and OutV, respectively, each with a unit of 0 to 255.
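The Fig. 7 C source is not reproduced in the text, but the stated input/output conventions (InR/InG/InB in, OutH on a 0-360 scale, OutS and OutV on a 0-255 scale) can be illustrated with the Python standard library's colorsys module. This is a hedged equivalent under those stated conventions, not the patent's code.

```python
import colorsys

def rgb_to_hsv_bytes(in_r, in_g, in_b):
    """Convert 8-bit RGB values (InR, InG, InB) to hue OutH on a 0-360
    scale and saturation/lightness OutS, OutV on a 0-255 scale, matching
    the value ranges the text attributes to the Fig. 7 program."""
    h, s, v = colorsys.rgb_to_hsv(in_r / 255.0, in_g / 255.0, in_b / 255.0)
    return h * 360.0, s * 255.0, v * 255.0
```

For example, pure red (255, 0, 0) maps to hue 0 with full saturation and lightness.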
- next, the captured image data is divided into regions each consisting of a predetermined combination of lightness and hue, and a two-dimensional histogram is created by calculating the cumulative number of pixels for each divided region (step S11).
- the hue (H) is divided into four areas: the skin color hue area (H1 and H2) with hue values of 0 to 39 and 330 to 359, the green hue area (H3) with hue values of 40 to 160, the blue hue area (H4) with hue values of 161 to 250, and the red hue area (H5).
- the red hue area (H5) is not used in the following calculations, based on the finding that it contributes little to the discrimination of the shooting scene.
- the skin color hue area is further divided into a skin color area (H1) and another area (H2). Here, an area whose hue'(H) satisfies the following equation (1) is defined as the skin color area (H1), and an area not satisfying equation (1) is defined as (H2):
- Hue'(H) = Hue(H) + 60 (when 0 ≤ Hue(H) < 300),
- Hue'(H) = Hue(H) − 300 (when 300 ≤ Hue(H) < 360),
- Brightness Y = InR × 0.30 + InG × 0.59 + InB × 0.11
- next, a first occupancy indicating the ratio of the cumulative number of pixels calculated for each divided region to the total number of pixels (the entire captured image) is calculated (step S12), and the occupancy calculation processing ends. Assuming that the first occupancy calculated in a divided area consisting of the combination of the lightness area vi and the hue area Hj is Rij, the first occupancy in each divided area is expressed as shown in Table 1.
- Table 2 shows, for each divided area, the first coefficient obtained by discriminant analysis, which is necessary for calculating index 1 that quantitatively indicates the accuracy as strobe shooting, that is, the brightness state of the face area at the time of strobe shooting.
- the coefficients of each divided region shown in Table 2 are weighting factors for multiplying the first occupancy Rij of each divided region shown in Table 1.
- the coefficient of each divided region can be obtained, for example, by the following procedure.
- first, a plurality of image data sets are prepared for each shooting condition, and for each image data a two-dimensional histogram is created by calculating the cumulative number of pixels for each divided region consisting of a predetermined combination of lightness and hue. Then, the occupancy indicating the ratio of the cumulative number of pixels to the total number of pixels is calculated for each divided area, and the calculated occupancy is used as the discriminant factor of the discriminant function in the discriminant analysis method.
- next, a discriminant function consisting of the discriminant factors and discrimination coefficients, and expected values by which the image data can be grouped by shooting condition, are determined in advance. The discrimination coefficients are then adjusted so that each image data achieves its expected value, and the discrimination coefficients thus obtained are used as the weighting factors by which the occupancy of each divided area is multiplied.
- whether the adjusted discrimination coefficients are appropriate can be confirmed by newly calculating the above-mentioned occupancy for an unknown sample image and applying the calculated occupancy (discriminant factor) and the adjusted discrimination coefficients to the discriminant function.
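As a toy illustration of this coefficient-adjustment step (not the patent's actual discriminant analysis, whose training images, expected values, and adjustment procedure are not given in the text), one can fit a linear discriminant so that each sample's score approaches its expected value, here by least-squares gradient descent:

```python
def fit_discriminant(samples, targets, lr=0.1, epochs=2000):
    """Adjust the discrimination coefficients of a linear discriminant
    function so each training sample's score approaches its expected
    value (e.g. +1 for 'strobe', -1 for other scenes).

    samples: list of occupancy vectors (the discriminant factors)
    targets: list of expected values, one per sample
    Returns (weights, bias)."""
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(samples, targets):
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            err = score - t
            # gradient step on the squared error (score - target)^2
            for i in range(n):
                w[i] -= lr * err * x[i]
            b -= lr * err
    return w, b
```

Applying the fitted weights to a new image's occupancy vector mirrors the confirmation step described above for unknown sample images.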
- FIG. 8 shows the lightness (V)-hue (H) plane. In calculating index 1, a positive (+) coefficient is used for the first occupancy calculated from the region (r1) distributed in the high-lightness skin color hue region in FIG. 8, and a negative (−) coefficient is used for the first occupancy calculated from the other hue area (r2).
- FIG. 10 shows the first coefficient in the skin color area (H1) and the first coefficient in the other area (the green hue area (H3)) as curves (coefficient curves) that change continuously over the entire lightness range. According to Table 2 and FIG. 10, the sign of the first coefficient in the green hue area (H3) is negative (−), indicating that the signs of the two coefficients differ.
- sum of H2 area = R12 × 0.0 + R22 × 8.6 + … (omitted) (2-2)
- sum of H3 area = R13 × 0.0 + R23 × (−6.3) + … (omitted) (2-3)
- sum of H4 area = R14 × 0.0 + R24 × (−1.8) + … (omitted) (2-4)
- index 1 is defined, as shown in equation (3), using the sums of the H1 to H4 areas shown in equations (2-1) to (2-4):
- Index 1 = sum of H1 area + sum of H2 area + sum of H3 area + sum of H4 area + 4.424 (3)
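Equations (3), (5), and (7) all share the form "weighted sum of occupancies plus a constant". A minimal sketch of that shared form follows; the area keys and the two coefficient values used below (R22 × 8.6, R23 × −6.3, constant +4.424) are only the fragments quoted in the text, and the full coefficient tables are not reproduced here.

```python
def calc_index(occupancy, coefficients, constant):
    """Index = sum over divided areas of (occupancy R_ij * weighting
    factor C_ij), plus an additive constant.

    occupancy:    dict mapping area key -> occupancy value (e.g. R22)
    coefficients: dict mapping area key -> weighting factor (e.g. 8.6)
    constant:     the additive constant (e.g. +4.424 for index 1)."""
    return sum(r * coefficients.get(k, 0.0)
               for k, r in occupancy.items()) + constant

# Illustrative fragment of index 1 using only the two coefficients quoted
# in the text; all other terms of equation (3) are omitted here.
index1_fragment = calc_index({"R22": 0.10, "R23": 0.05},
                             {"R22": 8.6, "R23": -6.3}, 4.424)
```

Indices 2 and 3 follow the same computation with the Table 3 and Table 5 coefficients and their own constants.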
- Table 3 shows, for each divided area, the second coefficient obtained by discriminant analysis, which is required to calculate index 2 that quantitatively indicates the accuracy as backlight shooting, that is, the brightness state of the face area during backlight shooting.
- the coefficients of each divided region shown in Table 3 are weighting factors for multiplying the first occupancy Rij of each divided region shown in Table 1.
- FIG. 9 shows the lightness (V)-hue (H) plane. In calculating index 2, a negative (−) coefficient is used for the occupancy calculated from the area (r4) distributed in the intermediate lightness of the skin color hue area in FIG. 9, and a positive (+) coefficient is used for the occupancy calculated from the low-lightness (shadow) area (r3) of the skin color hue area.
- FIG. 11 shows the second coefficient in the skin color area (H1) as a curve (coefficient curve) that changes continuously over the entire lightness range. According to Table 3 and FIG. 11, the sign of the second coefficient in the intermediate lightness region (v4; lightness values 85 to 169) of the skin color hue area is negative (−), while the sign of the second coefficient in the low-lightness (shadow) region (v2, v3; lightness values 26 to 84) is positive (+), indicating that the signs of the coefficients differ between the two regions.
- sum of H4 area = R14 × 0.0 + R24 × (−5.1) + … (omitted)
- Index 2 = sum of H1 area + sum of H2 area + sum of H3 area + sum of H4 area + 1.554
- first, the RGB values of the captured image data are converted to the HSV color system (step S20). Next, the captured image data is divided into regions each consisting of a combination of the distance from the outer edge of the captured image screen and lightness, and a two-dimensional histogram is created by calculating the cumulative number of pixels for each divided region (step S21).
- the area division of the captured image data will be described in detail.
- FIGS. 13(a) to 13(d) show four regions n1 to n4 divided according to the distance of the captured image data from the outer edge of the screen. The region n1 shown in FIG. 13(a) is the outermost frame, the region n2 shown in FIG. 13(b) is the area inside the outer frame, and the region n4 shown in FIG. 13(d) is the area at the center of the captured image screen.
- next, a second occupancy indicating the ratio of the cumulative number of pixels calculated for each divided region to the total number of pixels (the entire captured image) is calculated (step S22), and the occupancy calculation processing ends. Assuming that the second occupancy calculated in a divided area consisting of the combination of the lightness area vi and the screen area nj is Qij, the second occupancy in each divided area is expressed as shown in Table 4.
- Table 5 shows a third coefficient required for calculating the index 3 for each divided region.
- the coefficients of each divided region shown in Table 5 are weighting factors for multiplying the second occupancy Qij of each divided region shown in Table 4, and are obtained by discriminant analysis.
- FIG. 14 shows the third coefficient in the screen regions n1 to n4 as curves (coefficient curves) that change continuously over the entire lightness range.
- sum of n3 area = Q13 × 24.6 + Q23 × 12.1 + … (omitted) (6-3)
- sum of n4 area = Q14 × 1.5 + Q24 × (−32.9) + … (omitted) (6-4)
- index 3 is defined, as shown in equation (7), using the sums of the n1 to n4 areas shown in equations (6-1) to (6-4):
- Index 3 = sum of n1 area + sum of n2 area + sum of n3 area + sum of n4 area − 12.6201 (7)
- index 4 is defined using indices 1 and 3 as in equation (8), and index 5 is defined using indices 1 to 3 as in equation (9).
- the weighting factors to be multiplied by each index in Expressions (8) and (9) are set in advance according to the imaging conditions.
- FIG. 15 plots the values of indices 4 and 5 calculated for a total of 180 digital image data, obtained by shooting 60 images under each of the direct light, backlight, and strobe light source conditions. According to FIG. 15, when the value of index 4 is greater than 0.5, strobe scenes are frequent; when the value of index 4 is 0.5 or less and the value of index 5 is greater than 0.5, backlight scenes are frequent. Table 6 shows the discrimination of shooting scenes (light source conditions) based on the values of indices 4 and 5.
- as shown in Table 6, the shooting scene (light source condition) can be quantitatively determined based on the values of indices 4 and 5.
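The decision rule read off FIG. 15 and Table 6 can be sketched as follows. The 0.5 thresholds are those stated in the text; the weighted combinations (8) and (9) that produce indices 4 and 5 are computed upstream and their weighting factors are not reproduced here.

```python
def discriminate_scene(index4, index5, threshold=0.5):
    """Shooting-scene (light source condition) discrimination in the
    manner of Table 6: index 4 above the threshold suggests strobe;
    otherwise index 5 above the threshold suggests backlight; otherwise
    direct light."""
    if index4 > threshold:
        return "strobe"
    if index5 > threshold:
        return "backlight"
    return "direct light"
```

The chosen label then selects the corresponding gradation conversion processing conditions described below.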
- Reproduction target correction value = luminance reproduction target value (30360) − P4
- a CDF (cumulative density function) is created, and the maximum and minimum values are determined from the obtained CDF for each of R, G, and B. Let the obtained maximum and minimum values for R, G, and B be Rmax, Rmin, Gmax, Gmin, Bmax, and Bmin, respectively.
- next, normalized image data is calculated for an arbitrary pixel (Rx, Gx, Bx) of the captured image data. Let R be the normalized data of Rx in the R plane, G the normalized data of Gx in the G plane, and B the normalized data of Bx in the B plane. The normalized data R, G, and B are expressed as in equations (10) to (12), respectively, and the luminance N is given by equation (13):
- N = (B + G + R) / 3 (13)
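Equations (10)-(12) are not reproduced in the text, but the later statement that the data is normalized to a maximum of 65535 and a minimum of 0 implies a standard min-max normalization per plane; the following is a sketch under that assumption, not the patent's exact formulas.

```python
def normalize_plane(x, lo, hi, scale=65535):
    """Min-max normalization of one colour-plane value to the 0-65535
    range stated in the text (assumed form of equations (10)-(12))."""
    return (x - lo) * scale / (hi - lo)

def pixel_luminance(rx, gx, bx, rng):
    """Equation (13): N = (B + G + R) / 3 over the normalized planes.
    rng = (Rmin, Rmax, Gmin, Gmax, Bmin, Bmax) from the CDF step."""
    rmin, rmax, gmin, gmax, bmin, bmax = rng
    r = normalize_plane(rx, rmin, rmax)
    g = normalize_plane(gx, gmin, gmax)
    b = normalize_plane(bx, bmin, bmax)
    return (b + g + r) / 3
```

A pixel at each plane's maximum thus maps to a luminance of 65535, and one at each plane's minimum to 0, matching the stated range.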
- FIG. 16(a) shows the frequency distribution (histogram) of the luminance of the RGB pixels before normalization; the horizontal axis represents luminance and the vertical axis represents pixel frequency. This histogram is created for each of R, G, and B.
- normalization is performed on the captured image data for each plane using Expressions (10) to (12).
- FIG. 16(b) shows the luminance histogram calculated by equation (13). Since the captured image data is normalized to 65535, each pixel takes a value between the maximum of 65535 and the minimum of 0. When the luminance histogram is divided into blocks, a frequency distribution as shown in FIG. 16(c) is obtained, in which the horizontal axis is the block number (luminance) and the vertical axis is the frequency.
- FIG. 17(c) shows a region whose frequency is higher than a predetermined threshold. If a portion with extremely high frequency exists, its data strongly influences the average luminance of the entire captured image, making erroneous correction likely. Therefore, as shown in FIG. 17(c), the number of pixels at or above the threshold is limited in the luminance histogram. FIG. 17(d) shows the luminance histogram after this pixel-number limiting process is performed.
- the parameter P2 is obtained by calculating the average luminance based on the block numbers and frequencies of the luminance histogram (FIG. 17(d)) obtained by deleting the high-luminance and low-luminance regions from the normalized luminance histogram and further limiting the cumulative number of pixels.
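The parameter-P2 computation above (drop the high- and low-luminance blocks, cap each remaining block's frequency, then take the frequency-weighted mean of the block numbers) can be sketched as follows; low_cut, high_cut, and ceiling are hypothetical thresholds, since the text gives no numeric values for them.

```python
def trimmed_mean_luminance(hist, low_cut, high_cut, ceiling):
    """Sketch of the parameter-P2 step: keep only blocks whose number lies
    in [low_cut, high_cut], clip each kept block's frequency at `ceiling`
    (the pixel-number limiting of FIG. 17(c)-(d)), then return the
    frequency-weighted mean block number.

    hist: list of (block_number, frequency) pairs."""
    kept = [(n, min(f, ceiling)) for n, f in hist if low_cut <= n <= high_cut]
    total = sum(f for _, f in kept)
    return sum(n * f for n, f in kept) / total
```

Clipping dominant peaks this way keeps a few very frequent luminance blocks from pulling the average, which is the stated motivation for the limiting step.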
- the parameter P1 is the average luminance of the entire captured image data, and the parameter P3 is the average luminance of the skin color region (H1) in the captured image data.
- the key correction value (parameter P7) and the luminance correction value 2 (parameter P8) are defined as in equations (14) and (15), respectively.
- offset correction for matching the parameter P1 with P5 is performed by the following equation (16):
- RGB value of output image = RGB value of input image + P6 (16)
- next, the gradation conversion curve corresponding to the parameter P7 (key correction value) given by equation (14) is selected from the preset gradation conversion curves (correction curves) L1 to L5 shown in the figure.
- the correspondence between the value of parameter P7 and the selected gradation conversion curve is shown below.
- offset correction (a parallel shift of the 8-bit value) is performed by equation (17):
- RGB value of output image = RGB value of input image + P9 (17)
- note that the above-described image processing conditions are converted from 16-bit values to 8-bit values.
- as described above, the shooting scene (light source conditions (direct light, backlight, strobe, etc.)) is quantitatively determined from captured image data in which a person is the main subject, by calculating the indices described above, and the image processing conditions for the captured image data are determined based on the calculated indices, so that the brightness of the face area of the subject can be appropriately corrected.
- in particular, by using index 1, which quantitatively indicates the accuracy as strobe shooting, that is, the brightness state of the face area at the time of strobe shooting, the high-brightness area of the face can be appropriately corrected; by using index 2, which quantitatively indicates the accuracy as backlight shooting, that is, the brightness state of the face area at the time of backlight shooting, the low-brightness area of the face can be appropriately corrected.
- further, by using index 3, which is derived from the compositional elements of the captured image data, in addition to index 1 and index 2, the accuracy of determining the shooting scene can be improved.
- note that a face image may be detected from the captured image data, and the shooting scene and the image processing conditions may be determined based on the detected face image. In determining the shooting scene, Exif (Exchangeable Image File Format) information may also be used; the use of Exif information makes it possible to further improve the accuracy of determining the shooting scene.
- in the present embodiment, the shooting scene is determined based on indices 4 and 5, but an additional index may be added so that the shooting scene is determined in a three-dimensional space.
- for example, in a strobe scene, tone adjustment that darkens the entire image is performed according to index 4; therefore, if an under-exposed scene is incorrectly determined to be a strobe scene, the image may become even darker. To avoid this, it is preferable to set the average luminance P3 of the skin color area as index 6 so as to discriminate between a strobe scene and an under-exposed scene.
- As described above, an index that quantitatively indicates the shooting scene (light-source condition: front light, backlight, strobe, etc.) is calculated, and the image processing conditions for the captured image data are determined based on the calculated index.
- By calculating the second index, which quantitatively indicates the likelihood of backlit shooting, that is, the brightness state of the face region in a backlit scene, the low-brightness region of the face region can be corrected appropriately, and hence the brightness of the face area can be corrected appropriately.
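The patent does not give the exact formulas for indices 1 and 2. As a hedged illustration only, one way to realize such indices is to compute occupancy ratios of hue-lightness regions and combine them; the region boundaries, weights, and the `scene_indices` helper below are illustrative assumptions, not the patent's values:

```python
# Hypothetical sketch: derive flash/backlight likelihood indices from
# occupancy ratios of hue/lightness regions. All boundaries and weights
# here are illustrative assumptions, not taken from the patent.

def occupancy(pixels, hue_range, light_range):
    """Fraction of pixels whose (hue, lightness) falls in the given ranges."""
    if not pixels:
        return 0.0
    hits = sum(1 for h, l in pixels
               if hue_range[0] <= h < hue_range[1]
               and light_range[0] <= l < light_range[1])
    return hits / len(pixels)

def scene_indices(pixels):
    """pixels: list of (hue 0-360, lightness 0-100) tuples."""
    skin_bright = occupancy(pixels, (0, 50), (60, 101))    # bright skin tones
    skin_dark   = occupancy(pixels, (0, 50), (0, 40))      # dark skin tones
    bg_bright   = occupancy(pixels, (50, 360), (60, 101))  # bright background
    # index 1: flash likelihood (bright face against dark surroundings)
    index1 = skin_bright - bg_bright
    # index 2: backlight likelihood (dark face against bright surroundings)
    index2 = skin_dark + bg_bright - skin_bright
    return index1, index2

# Demo data: a bright-face/dark-background frame vs. a backlit frame.
flashy  = [(30, 80)] * 60 + [(200, 10)] * 40
backlit = [(30, 20)] * 60 + [(200, 90)] * 40
```

With these assumed regions, the bright-face frame yields a high index 1, and the backlit frame a high index 2, matching the qualitative behavior the bullets describe.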
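The decision logic in the bullets above — indices 4 and 5 spanning a decision plane, with index 6 (the average luminance P3 of the skin-tone area) separating flash scenes from merely underexposed ones — can be sketched as follows. The thresholds and the shape of the decision regions are illustrative assumptions:

```python
# Hypothetical sketch of the scene decision described above. The
# thresholds below are illustrative assumptions, not the patent's values.

FLASH_P3_THRESHOLD = 128  # assumed split point on a 0-255 luminance scale

def classify_scene(index4, index5, skin_mean_luminance):
    """Return one of 'flash', 'under', 'backlight', 'normal'."""
    if index4 > 0.5 and index5 > 0.5:
        # Candidate flash scene: confirm with index 6 (skin-tone average
        # luminance) so an underexposed face is not darkened further.
        if skin_mean_luminance >= FLASH_P3_THRESHOLD:
            return "flash"
        return "under"
    if index5 > 0.5:
        return "backlight"
    return "normal"
```

The point of the extra test is exactly the failure mode the text warns about: without index 6, an underexposed scene falling in the flash region of the (index 4, index 5) plane would receive a darkening adjustment.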
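The gradation adjustment for a flash scene — darkening the entire image by an amount that grows with index 4 — could be realized, for example, as a gamma curve whose exponent depends on index 4. The mapping below is an illustrative assumption, not the patent's tone curve:

```python
# Hypothetical sketch of the gradation (tone) adjustment for a flash
# scene: the whole image is darkened more as index 4 grows. The gamma
# mapping is an illustrative assumption.

def flash_tone_curve(value, index4):
    """Map an 8-bit pixel value (0-255) through a darkening gamma curve."""
    gamma = 1.0 + 0.5 * index4          # larger index 4 -> darker image
    return round(255 * (value / 255) ** gamma)
```

A gamma-style curve keeps black (0) and white (255) fixed while pulling midtones down, which matches the qualitative requirement of darkening the over-lit face region without crushing the whole range.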
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Facsimile Image Signal Circuits (AREA)
- Color Image Communication Systems (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
Description
Claims
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006513530A JPWO2005112428A1 (ja) | 2004-05-18 | 2005-05-09 | 画像処理方法、画像処理装置、画像記録装置及び画像処理プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-147797 | 2004-05-18 | ||
JP2004147797 | 2004-05-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005112428A1 true WO2005112428A1 (ja) | 2005-11-24 |
Family
ID=35374857
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/008412 WO2005112428A1 (ja) | 2004-05-18 | 2005-05-09 | 画像処理方法、画像処理装置、画像記録装置及び画像処理プログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20050259282A1 (ja) |
JP (1) | JPWO2005112428A1 (ja) |
WO (1) | WO2005112428A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9311888B2 (en) | 2012-12-17 | 2016-04-12 | Samsung Display Co., Ltd. | Image processing device, image processing method and program |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070146494A1 (en) * | 2005-12-22 | 2007-06-28 | Goffin Glen P | Video telephony system and a method for use in the video telephony system for improving image quality |
US8014602B2 (en) * | 2006-03-29 | 2011-09-06 | Seiko Epson Corporation | Backlight image determining apparatus, backlight image determining method, backlight image correction apparatus, and backlight image correction method |
US7916943B2 (en) * | 2006-06-02 | 2011-03-29 | Seiko Epson Corporation | Image determining apparatus, image determining method, image enhancement apparatus, and image enhancement method |
US7916942B1 (en) | 2006-06-02 | 2011-03-29 | Seiko Epson Corporation | Image determining apparatus, image enhancement apparatus, backlight image enhancement apparatus, and backlight image enhancement method |
TW201106676A (en) * | 2009-08-04 | 2011-02-16 | Pacific Image Electronics Co Ltd | Double-light sources optical scanning device and method of using the same |
EP3100105B1 (en) | 2014-01-30 | 2020-05-27 | Hewlett-Packard Development Company, L.P. | Method and system for providing a self-adaptive image |
CN105872351A (zh) * | 2015-12-08 | 2016-08-17 | 乐视移动智能信息技术(北京)有限公司 | 逆光场景的照片拍摄方法和装置 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11196324A (ja) * | 1997-12-26 | 1999-07-21 | Fuji Photo Film Co Ltd | 画像出力方法および装置 |
JP2000148980A (ja) * | 1998-11-12 | 2000-05-30 | Fuji Photo Film Co Ltd | 画像処理方法、画像処理装置及び記録媒体 |
JP2001222710A (ja) * | 2000-02-09 | 2001-08-17 | Fuji Photo Film Co Ltd | 画像処理装置および画像処理方法 |
JP2002232728A (ja) * | 2001-01-30 | 2002-08-16 | Minolta Co Ltd | 画像処理プログラム、画像処理プログラムを記録したコンピュータ読み取り可能な記録媒体、画像処理装置および画像処理方法 |
JP2002247361A (ja) * | 2001-02-14 | 2002-08-30 | Ricoh Co Ltd | 画像処理装置、画像処理方法およびその方法を実施するためのプログラムを記録した記録媒体 |
JP2003110932A (ja) * | 2001-09-28 | 2003-04-11 | Mitsubishi Electric Corp | 明度調整方法および撮像装置 |
JP2004088408A (ja) * | 2002-08-27 | 2004-03-18 | Minolta Co Ltd | デジタルカメラ |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3408094B2 (ja) * | 1997-02-05 | 2003-05-19 | キヤノン株式会社 | 画像処理装置及びその方法 |
2005
- 2005-05-09 JP JP2006513530A patent/JPWO2005112428A1/ja active Pending
- 2005-05-09 WO PCT/JP2005/008412 patent/WO2005112428A1/ja active Application Filing
- 2005-05-10 US US11/125,638 patent/US20050259282A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
JPWO2005112428A1 (ja) | 2008-03-27 |
US20050259282A1 (en) | 2005-11-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP2006319714A (ja) | 画像処理方法、画像処理装置及び画像処理プログラム | |
US20050141002A1 (en) | Image-processing method, image-processing apparatus and image-recording apparatus | |
WO2006123492A1 (ja) | 画像処理方法、画像処理装置、撮像装置及び画像処理プログラム | |
WO2005112428A1 (ja) | 画像処理方法、画像処理装置、画像記録装置及び画像処理プログラム | |
JP2003283731A (ja) | 画像入力装置及び画像出力装置並びにこれらから構成される画像記録装置 | |
JP2006318255A (ja) | 画像処理方法、画像処理装置及び画像処理プログラム | |
WO2006033235A1 (ja) | 画像処理方法、画像処理装置、撮像装置及び画像処理プログラム | |
JP2005192162A (ja) | 画像処理方法、画像処理装置及び画像記録装置 | |
WO2006077702A1 (ja) | 撮像装置、画像処理装置及び画像処理方法 | |
US20050128539A1 (en) | Image processing method, image processing apparatus and image recording apparatus | |
WO2006033236A1 (ja) | 画像処理方法、画像処理装置、撮像装置及び画像処理プログラム | |
JP2005192158A (ja) | 画像処理方法、画像処理装置及び画像記録装置 | |
JP2006039666A (ja) | 画像処理方法、画像処理装置及び画像処理プログラム | |
JP2005203865A (ja) | 画像処理システム | |
US20030112483A1 (en) | Image forming method | |
WO2006033234A1 (ja) | 画像処理方法、画像処理装置、撮像装置及び画像処理プログラム | |
JP4449619B2 (ja) | 画像処理方法、画像処理装置及び画像処理プログラム | |
JP2006094000A (ja) | 画像処理方法、画像処理装置及び画像処理プログラム | |
JP2007312125A (ja) | 画像処理装置、画像処理方法及び画像処理プログラム | |
JP2005332054A (ja) | 画像処理方法、画像処理装置、画像記録装置及び画像処理プログラム | |
JP2004096508A (ja) | 画像処理方法、画像処理装置、画像記録装置、プログラム及び記録媒体 | |
JP2006203571A (ja) | 撮像装置、画像処理装置及び画像記録装置 | |
WO2006077703A1 (ja) | 撮像装置、画像処理装置及び画像記録装置 | |
JP2006092168A (ja) | 画像処理方法、画像処理装置及び画像処理プログラム | |
JP2006293898A (ja) | 画像処理方法、画像処理装置及び画像処理プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006513530 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |