JP5369751B2 - Image processing apparatus, imaging apparatus, and image processing program - Google Patents


Info

Publication number
JP5369751B2
Authority
JP
Japan
Prior art keywords
subject
conversion
image processing
image
appearance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2009038076A
Other languages
Japanese (ja)
Other versions
JP2010193375A (en)
Inventor
麻理 杉原
Original Assignee
Nikon Corporation (株式会社ニコン)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corporation
Priority to JP2009038076A
Publication of JP2010193375A
Application granted
Publication of JP5369751B2
Legal status: Active (current)
Anticipated expiration

Description

  The present invention relates to an appearance conversion process for matching the appearance of an image between two different viewing environments.

  Conventionally, various techniques have been proposed for performing appearance conversion processing on an image so that its appearance matches between two different viewing environments. In particular, appearance conversion processing using a color appearance model, typified by CIECAM97s and CIECAM02, is disclosed in Patent Documents 1 to 3 and elsewhere. The conversion formulas of such appearance conversion processing are based on experimental data on human perception of color charts (see Patent Documents 1 to 5).

  However, it has been found that when conventional appearance conversion processing is performed on an image actually captured by a camera, the appearance expected by the user may not be achieved.

  Therefore, an object of the present invention is to provide an image processing apparatus, an imaging apparatus, and an image processing program capable of performing appropriate appearance conversion processing even if the conversion target is an image captured by a camera.

The image processing apparatus of the present invention comprises: conversion means for performing appearance conversion processing on a captured image observed under a first viewing environment so that the appearance of the captured image under a second viewing environment, different from the first viewing environment, matches the appearance of the captured image under the first viewing environment; detection means for detecting, from the captured image, a subject area where a subject exists and a subject attribute of the subject present in the subject area; and setting means for setting parameters of the appearance conversion processing for each subject area and each subject attribute in the captured image.

  The appearance conversion processing may include chromatic adaptation conversion processing, and the parameters to be set may include an adaptation degree parameter used in the chromatic adaptation conversion processing.

When the subject attribute is a non-artifact, the setting means may set the adaptation degree higher than when the subject attribute is an artifact.

The setting means may set the adaptation degree to the maximum value when the subject attribute is food.

  The appearance conversion processing may include saturation conversion processing, and the parameters set by the setting means may include a saturation gain parameter used in the saturation conversion processing.

When the subject attribute is a person, the setting means may set the saturation gain lower than when the subject attribute is other than a person.

The setting means may set the saturation gain higher when the subject attribute is an outdoor object than when the subject attribute is an indoor object.

  The appearance conversion processing may include contrast conversion processing, and the parameters set by the setting means may include a contrast function parameter used in the contrast conversion processing.

The setting means may set the contrast function higher when the subject attribute is an indoor object than when the subject attribute is an outdoor object.

  The image processing apparatus of the present invention may also comprise: conversion means for performing appearance conversion processing on a captured image observed under a first viewing environment so that the appearance of the captured image under a second viewing environment, different from the first viewing environment, matches the appearance of the captured image under the first viewing environment; detection means for detecting, from the captured image, a subject area where a subject exists and the subject attribute of the subject present in the subject area; and setting means for setting, when the subject attribute is food, the adaptation degree parameter of the appearance conversion processing to the maximum value regardless of the first viewing environment and the second viewing environment.

  The imaging apparatus of the present invention includes an imaging unit that captures an image of a subject and generates an image, and any one of the image processing apparatuses of the present invention.

The image processing program of the present invention causes a computer to execute: a conversion procedure for performing appearance conversion processing on a captured image observed under a first viewing environment so that the appearance of the captured image under a second viewing environment, different from the first viewing environment, matches the appearance of the captured image under the first viewing environment; a detection procedure for detecting, from the captured image, a subject area where a subject exists and the subject attribute of the subject present in the subject area; and a setting procedure for setting parameters of the appearance conversion processing for each subject area and each subject attribute in the captured image.

  The image processing program of the present invention may also cause a computer to execute: a conversion procedure for performing appearance conversion processing on a captured image observed under a first viewing environment so that the appearance of the captured image under a second viewing environment, different from the first viewing environment, matches the appearance of the captured image under the first viewing environment; a detection procedure for detecting, from the captured image, a subject area where a subject exists and the subject attribute of the subject present in the subject area; and a setting procedure for setting, when the subject attribute is food, the adaptation degree parameter of the appearance conversion processing to the maximum value regardless of the first viewing environment and the second viewing environment.

  According to the present invention, an image processing apparatus, an imaging apparatus, and an image processing program capable of performing appropriate appearance conversion processing even when the conversion target is an image captured by a camera are realized.

FIG. 1 Block diagram showing the schematic configuration of an electronic camera
FIG. 2 Flowchart explaining the appearance conversion processing by the CPU 18
FIG. 3 Diagram explaining area division

[Embodiment]
Hereinafter, an embodiment of the present invention will be described. FIG. 1 is a block diagram showing the schematic configuration of an electronic camera. As shown in FIG. 1, the electronic camera 11 includes an imaging optical system 12, a lens driving unit 13, a diaphragm 14, a diaphragm driving unit 15, a color imaging device 16, an AFE 17, a CPU 18, a first memory 19, a second memory 20, a media I/F 21, a communication I/F 22, a monitor 23, a release button 24, a photometric sensor 30, and an environment sensor 31. Here, the lens driving unit 13, the diaphragm driving unit 15, the AFE 17, the first memory 19, the second memory 20, the media I/F 21, the communication I/F 22, the monitor 23, the release button 24, the photometric sensor 30, and the environment sensor 31 are each connected to the CPU 18.

  The imaging optical system 12 includes a plurality of lenses including a focusing lens. The focal position of the imaging optical system 12 is adjusted in the optical axis direction by the lens driving unit 13. For simplicity, the imaging optical system 12 is illustrated as a single lens in FIG. The diaphragm 14 adjusts the amount of light per unit time incident on the image sensor 16. The aperture of the aperture 14 is adjusted by the aperture drive unit 15 in accordance with an instruction from the CPU 18.

  The imaging element 16 captures a subject image formed by the imaging optical system 12 during shooting and generates an image signal of the captured image. The image signal output from the image sensor 16 is input to the AFE 17.

  The AFE 17 is an analog front end circuit that performs analog signal processing on the image signal output from the image sensor 16. This analog signal processing includes correlated double sampling, image signal gain adjustment, A / D conversion of the image signal, and the like. The digital image signal output from the AFE 17 is input to the CPU 18.

  The CPU 18 is a processor that comprehensively controls the operation of the electronic camera 11. The CPU 18 also functions as an image processing unit 25 that performs various image processing (color interpolation processing, gradation conversion processing, contour enhancement processing, white balance adjustment, color conversion processing, and the like) on the A/D-converted image signal at the time of shooting. Note that the CPU 18 can generate both captured image data that has undergone all image processing by the image processing unit 25 (normal-format captured image data) and captured image data that has undergone only color interpolation processing by the image processing unit 25 (RAW-format captured image data).

  The CPU 18 functions as a recognition processing unit 26 that performs recognition processing such as pattern matching on the captured image.

  Further, the CPU 18 can perform appearance conversion processing on the RAW-format captured image data written in the storage medium 28 using the image processing unit 25 and the recognition processing unit 26. Details of the appearance conversion process will be described later.

  The first memory 19 is composed of a volatile storage medium (SDRAM or the like), and temporarily stores a captured image in a pre-process or post-process of image processing by the CPU 18. The second memory 20 is configured by a non-volatile storage medium such as a flash memory. The second memory 20 stores a program executed by the CPU 18.

  The media I/F 21 can removably connect a nonvolatile storage medium 28. The media I/F 21 writes captured image data after image processing (here, normal-format captured image data and RAW-format captured image data) to the storage medium 28, and reads captured image data (likewise, normal-format and RAW-format captured image data) from the storage medium 28.

  The storage medium 28 is configured by a hard disk, a memory card incorporating a semiconductor memory, or the like. Note that a memory card as an example of the storage medium 28 is shown in FIG.

  The communication I / F 22 controls data transmission / reception with an external device connected via a known wired or wireless communication line in accordance with a predetermined communication standard.

  The monitor 23 reproduces and displays the captured image written in the storage medium 28 under the control of the CPU 18. Each of the display brightness and display color balance of the monitor 23 can be arbitrarily set by the user, and the setting contents of the monitor 23 by the user are appropriately detected by the CPU 18.

  The photometric sensor 30 detects the brightness of the object scene (the brightness of the shooting scene) at the time of shooting and sends a signal indicating that brightness (the field photometric value) to the CPU 18. This field photometric value is written to the storage medium 28 as additional information of the captured image data, together with the captured image data acquired by capture (here, normal-format captured image data and RAW-format captured image data).

  The environment sensor 31 detects brightness around the electronic camera (brightness around the user) at the time of shooting, and sends a signal (environment photometric value) indicating the brightness to the CPU 18. This environmental photometric value is written into the storage medium 28 as additional information of the captured image data together with captured image data (here, captured image data in the normal format and captured image data in the RAW format) acquired by imaging.

  The operation button 24 receives from the user, at the time of shooting, an instruction input for starting the AF operation and an instruction input for starting the imaging operation. The operation button 24 also receives from the user an instruction to start the appearance conversion processing for a captured image, and during the appearance conversion processing it receives input of the information necessary for that processing. The information necessary for the appearance conversion processing includes the shooting-side viewing environment conditions (input-side viewing environment conditions) and the viewing-side viewing environment conditions (output-side viewing environment conditions).

  Here, the appearance conversion processing of the present embodiment is assumed to be appearance conversion processing that, when the user reproduces on the monitor 23, under one viewing environment, an image captured in the past under another viewing environment, makes the reproduced image look the same as the object scene appeared to the user at the time of shooting. In this case, the electronic camera can automatically acquire both the input-side viewing environment condition (the viewing environment condition at the time of shooting) and the output-side viewing environment condition (the viewing environment condition at the time of reproduction).

  Therefore, in this embodiment, the electronic camera automatically acquires both the input-side visual environment condition and the output-side visual environment condition.

  FIG. 2 is a flowchart for explaining the appearance conversion process by the CPU 18. The appearance conversion process is executed by the CPU 18 according to a program when an instruction to start the appearance conversion process is input during reproduction display of the captured image. In this example, CIECAM02 is used as an appearance model for appearance conversion processing (color appearance model independent of the viewing environment).

Step S11: The CPU 18 reads from the storage medium 28 the RAW-format captured image data of the captured image being reproduced and displayed, together with its additional information (the field photometric value and the environmental photometric value), and from these calculates the input-side viewing environment condition, that is, the input-side adaptation white point (X_wi, Y_wi, Z_wi), the input-side adaptation luminance La_i, and the input-side ambient condition F_i.

Further, the CPU 18 refers to the current settings of the monitor 23 (display luminance and display color balance) and the environmental photometric value currently detected by the environment sensor 31, and based on them calculates the output-side viewing environment condition, that is, the output-side adaptation white point (X_wo, Y_wo, Z_wo), the output-side adaptation luminance La_o, and the output-side ambient condition F_o.

Here, the adaptation white point (X_wi, Y_wi, Z_wi), adaptation luminance La_i, ambient condition F_i, adaptation white point (X_wo, Y_wo, Z_wo), adaptation luminance La_o, and ambient condition F_o are defined as follows.

Adaptation white point (X_wi, Y_wi, Z_wi): the light source color of the object scene at the time of shooting. The light source color is estimated from the captured image being reproduced and displayed (the RAW-format captured image data); any known light source determination method employed in camera auto white balance control can be used for this estimation.

Adaptation luminance La_i: the brightness of the object scene at the time of shooting. Here, it is set to 1/5 of the field photometric value, among the field photometric values added to the captured image, of the portion corresponding to the adaptation white point (X_wi, Y_wi, Z_wi) described above.

Ambient condition F_i: the brightness around the user at the time of shooting. Here, it is set to the environmental photometric value added to the captured image.

Adaptation white point (X_wo, Y_wo, Z_wo): the color that the monitor 23 displays as white during reproduction. Here, the display color balance currently set on the monitor 23 is used.

Adaptation luminance La_o: the brightness of the monitor 23 during reproduction. Here, it is set to 1/5 of the display luminance currently set on the monitor 23.

Ambient condition F_o: the brightness around the user during reproduction. Here, it is set to the environmental photometric value currently detected by the environment sensor 31.
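The six parameters above can be collected into a small structure. The sketch below is illustrative only: the class name, field names, and sample values are assumptions rather than part of the embodiment; what it does encode faithfully is the rule above that the adaptation luminance is 1/5 of the measured luminance.

```python
from dataclasses import dataclass

@dataclass
class ViewingCondition:
    """Hypothetical container for one side's viewing-environment condition."""
    white_point: tuple   # adaptation white point (Xw, Yw, Zw)
    luminance: float     # measured field photometric value or display luminance
    surround: float      # ambient (environmental) photometric value F

    @property
    def adaptation_luminance(self) -> float:
        # The embodiment sets the adaptation luminance La to 1/5 of the
        # measured luminance on each side.
        return self.luminance / 5.0

# Illustrative values: input side from scene metering, output side from
# the monitor's current settings and the environment sensor.
inp = ViewingCondition(white_point=(0.95, 1.0, 1.09), luminance=500.0, surround=1.0)
out = ViewingCondition(white_point=(0.95, 1.0, 1.09), luminance=80.0, surround=0.8)
print(inp.adaptation_luminance, out.adaptation_luminance)  # 100.0 16.0
```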

Step S12: The CPU 18 calculates the degree of adaptation D_i of a human to the input-side viewing environment by applying the adaptation luminance La_i and the ambient condition F_i calculated in step S11 to equation (1). The adaptation degree D_i calculated here is a provisional value and may be corrected later.

Note that the adaptation degree D_i calculated by equation (1) is the same as the adaptation degree used in the input-side chromatic adaptation conversion processing of conventional appearance conversion processing.

Further, the CPU 18 calculates the degree of adaptation D_o of a human to the output-side viewing environment by applying the adaptation luminance La_o and the ambient condition F_o calculated in step S11 to equation (2). The adaptation degree D_o calculated here is also a provisional value and may be corrected later.

Note that the adaptation degree D_o calculated by equation (2) is the same as the adaptation degree used in the output-side chromatic adaptation conversion processing (chromatic adaptation inverse conversion processing) of conventional appearance conversion processing.
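Equations (1) and (2) are not reproduced in this text, but since they are stated to coincide with the adaptation degree of conventional (CIECAM02-based) chromatic adaptation conversion, a plausible sketch is the standard CIECAM02 formula D = F·(1 − (1/3.6)·e^(−(La+42)/92)). The formula choice and the sample La, F values below are assumptions.

```python
import math

def degree_of_adaptation(La: float, F: float) -> float:
    """Assumed form of equations (1)/(2): the CIECAM02 degree of adaptation."""
    D = F * (1.0 - (1.0 / 3.6) * math.exp(-(La + 42.0) / 92.0))
    return min(max(D, 0.0), 1.0)  # D is meaningful only in [0, 1]

# Provisional values before the per-subject corrections of steps S15-S21.
D_i = degree_of_adaptation(La=100.0, F=1.0)  # input side (illustrative values)
D_o = degree_of_adaptation(La=16.0, F=0.8)   # output side (illustrative values)
```

Larger adaptation luminances and brighter surrounds push D toward 1 (complete adaptation), which is why the corrections in steps S17 and S18 clamp at 1.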

Further, the CPU 18 calculates the saturation gain G to be used in the saturation conversion processing by applying the adaptation luminances La_i and La_o calculated in step S11 to equation (3). The saturation gain G calculated here is a provisional value and may be corrected later.

Note that the saturation gain G calculated by equation (3) is the same as the saturation gain used in the saturation conversion processing of conventional appearance conversion processing.

Further, the CPU 18 calculates the contrast function T of equation (4) by applying the adaptation luminance La_i and the ambient condition F_i calculated in step S11 to a predetermined formula. The contrast function T calculated here is a provisional function and may be corrected later.

Note that the contrast function T of equation (4) is the same as the contrast function used in the input-side contrast conversion processing of conventional appearance conversion processing.

Further, the CPU 18 calculates the contrast function T^-1 of equation (5) by applying the adaptation luminance La_o and the ambient condition F_o calculated in step S11 to a predetermined formula. The contrast function T^-1 calculated here is also a provisional function and may be corrected later.

Note that the contrast function T^-1 of equation (5) is the same as the contrast function used in the output-side contrast conversion processing (contrast inverse conversion processing) of conventional appearance conversion processing.

  Step S13: The CPU 18 performs subject extraction processing on the captured image being reproduced and displayed (the RAW-format captured image data), divides the captured image into one or more subject regions and a background region, recognizes the N regions after the division, and individually assigns region labels with values 1 to N to the N regions.

  It should be noted that at least one of the area dividing methods described in Patent Documents 4 and 5 and the area dividing method based on the graph cut theory can be adopted for the area division in this step.

  For example, suppose the captured image contains, as shown in FIG. 3A, a first subject 41 consisting of a person, a second subject 42 consisting of a radish (a kind of food), and a third subject consisting of a truck (a kind of artifact). In that case, as shown in FIG. 3B, the captured image is divided into the area A1 where the first subject (person) exists, the area A2 where the second subject (radish) exists, the area A3 where the third subject (truck) exists, and the background area A4, so the number of regions N is 4. Further, for example, the value of the region label given to area A1 is 1, that given to area A2 is 2, that given to area A3 is 3, and that given to area A4 is 4.
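The embodiment relies on the segmentation methods of Patent Documents 4 and 5 or on graph-cut segmentation for this step; as a stand-in, the toy sketch below merely labels the 4-connected foreground segments of a binary mask with region labels 1..N, in the spirit of step S13. The mask and the labeling function are illustrative assumptions, not the patented method.

```python
def label_regions(mask):
    """Assign labels 1..N to 4-connected foreground segments of a 2-D mask."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and labels[sy][sx] == 0:
                next_label += 1
                stack = [(sy, sx)]
                while stack:  # iterative 4-connected flood fill
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y][x] and labels[y][x] == 0:
                        labels[y][x] = next_label
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, next_label

# Toy mask with three disjoint foreground segments (three subject regions).
mask = [[1, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 1, 0, 1]]
labels, n = label_regions(mask)
```

A real implementation would segment pixel colors/edges rather than a given mask, but the label bookkeeping (values 1 to N, background separate) is the same.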

  Step S14: The CPU 18 sets the value of the region label L to be processed (hereinafter referred to as “processing target label L”) to the initial value (1).

  Step S15: The CPU 18 performs pattern recognition processing on the region of the captured image being reproduced and displayed (the RAW-format captured image data) that corresponds to the processing target label L (for example, region A1 if the value of the processing target label L is 1), and determines whether the object existing in that region is food, a person, or something else. The CPU 18 then proceeds to step S17 if the object is food, to step S18 if the object is a person, and to step S16 if the object is neither food nor a person.

  Step S16: The CPU 18 performs light source determination processing on the region of the captured image being reproduced and displayed (the RAW-format captured image data) that corresponds to the processing target label L, thereby determining the type of light source that illuminated the object in that region at the time of shooting, that is, whether the light source is natural light or artificial light. Any known light source determination method employed for camera auto white balance control can be applied here.

  When the light source is natural light (that is, when the object is an outdoor object), the CPU 18 proceeds to step S20; when the light source is artificial light (that is, when the object is an indoor object), the CPU 18 proceeds to step S21. If the type of light source cannot be determined (for example, when the light source color is a mixture of natural light and artificial light), the CPU 18 proceeds to step S22.

Step S17: The CPU 18 replaces each of the adaptation degrees D_i and D_o with the highest value (i.e., 1) regardless of the calculation result of step S12, and proceeds to step S22.

Step S18: The CPU 18 corrects the adaptation degrees D_i and D_o by the following equation (6).

However, the correction coefficient Cd in this step is set to a coefficient that increases the adaptation degrees D_i and D_o, for example 1.1. If either of the adaptation degrees D_i and D_o exceeds 1 as a result of the correction, it is replaced with 1 regardless of the correction result. Thereafter, the CPU 18 proceeds to step S19.

  Step S19: The CPU 18 determines whether the saturation gain G exceeds 1, and corrects the saturation gain G by the following equation (7) only when it exceeds 1.

  However, the correction coefficient Md in this step is set to a coefficient for suppressing the saturation gain G, for example, 0.9. Thereafter, the CPU 18 proceeds to step S22.

  Step S20: The CPU 18 corrects the saturation gain G by the above equation (7). However, the correction coefficient Md in this step is set to a coefficient for increasing the saturation gain G, for example, 1.1. Thereafter, the CPU 18 proceeds to step S22.

Step S21: The CPU 18 corrects the contrast functions T and T^-1 by the following equation (8).

However, the correction coefficient Td in this step is set to a coefficient that increases the contrast functions T and T^-1, for example 1.1. Thereafter, the CPU 18 proceeds to step S22.
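Steps S15 through S21 amount to a small decision table over the subject attribute and light source type. The sketch below condenses it; the function name, the string-valued attributes, and the scalar T_scale (a stand-in for scaling the contrast functions T and T^-1) are illustrative assumptions, while the correction coefficients Cd = 1.1, Md = 0.9 or 1.1, and Td = 1.1 follow the example values given in the text.

```python
def adjust_parameters(attribute, light, D_i, D_o, G, T_scale):
    """Sketch of steps S15-S21: per-region correction of provisional parameters.

    attribute: 'food' | 'person' | 'other'
    light: 'natural' | 'artificial' | None (undeterminable)
    """
    if attribute == 'food':                                 # step S17
        D_i, D_o = 1.0, 1.0                                 # maximum adaptation degree
    elif attribute == 'person':                             # steps S18-S19
        D_i = min(D_i * 1.1, 1.0)                           # Cd = 1.1, clamped at 1
        D_o = min(D_o * 1.1, 1.0)
        if G > 1.0:
            G *= 0.9                                        # Md = 0.9 suppresses saturation
    elif light == 'natural':                                # step S20: outdoor object
        G *= 1.1                                            # Md = 1.1 boosts saturation
    elif light == 'artificial':                             # step S21: indoor object
        T_scale *= 1.1                                      # Td = 1.1 boosts contrast
    return D_i, D_o, G, T_scale                             # then step S22 converts
```

When the light source type cannot be determined, no branch fires and the provisional parameters pass through unchanged, matching the fall-through to step S22.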

Step S22: The CPU 18 performs appearance conversion processing using the adaptation degrees D_i and D_o, the saturation gain G, and the contrast functions T and T^-1 on the region of the captured image being reproduced and displayed (the RAW-format captured image data) that corresponds to the processing target label L. The procedure of the appearance conversion processing is as in (a) to (g) below.

(a) The CPU 18 represents the color of each pixel in the region corresponding to the processing target label L by coordinates (X_i, Y_i, Z_i) in the XYZ color space.

(b) By substituting the adaptation white point (X_wi, Y_wi, Z_wi) calculated in step S11 and the current adaptation degree D_i into equation (9), the CPU 18 obtains the conversion formula of the chromatic adaptation conversion from the coordinates (X_i, Y_i, Z_i) to the coordinates (X_i', Y_i', Z_i').

Here, (X_wr, Y_wr, Z_wr) in equation (9) is the white point of the CIECAM02 color space (the reference white point), and the matrix M_cat02 in equation (9) is the known matrix defined by CIECAM02.

(c) Using the conversion formula acquired in (b), the CPU 18 converts the coordinates (X_i, Y_i, Z_i) of each pixel into the post-chromatic-adaptation coordinates (X_i', Y_i', Z_i').
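Steps (b) and (c) can be sketched with the published CAT02 matrix. The function below is a hedged reading of equation (9): a von Kries scaling in CAT02 space, with degree of adaptation D, that maps colors from the source adaptation white toward the reference white. The exact form of equation (9) in the patent may differ; the matrix values are the standard CIECAM02 M_cat02 and its inverse.

```python
# Standard CAT02 forward matrix and its (rounded) inverse, as published in CIECAM02.
M_CAT02 = [[0.7328, 0.4296, -0.1624],
           [-0.7036, 1.6975, 0.0061],
           [0.0030, 0.0136, 0.9834]]
M_CAT02_INV = [[1.096124, -0.278869, 0.182745],
               [0.454369, 0.473533, 0.072098],
               [-0.009628, -0.005698, 1.015326]]

def _mul(M, v):
    return [sum(M[r][c] * v[c] for c in range(3)) for r in range(3)]

def cat02_adapt(xyz, white_src, white_ref, D):
    """Von Kries adaptation in CAT02 space with incomplete adaptation degree D."""
    rgb = _mul(M_CAT02, xyz)
    rgb_ws = _mul(M_CAT02, white_src)   # source adaptation white in CAT02 space
    rgb_wr = _mul(M_CAT02, white_ref)   # reference white in CAT02 space
    # D = 1: full shift to the reference white; D = 0: no adaptation at all.
    rgb_c = [(D * rgb_wr[k] / rgb_ws[k] + 1.0 - D) * rgb[k] for k in range(3)]
    return _mul(M_CAT02_INV, rgb_c)
```

With D = 1 the source white maps exactly onto the reference white, and with D = 0 colors pass through unchanged, which is the behavior the per-subject adaptation degree of steps S17 and S18 modulates.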

(d) The CPU 18 performs the input-side contrast conversion processing by the contrast function T (specifically, conversion processing corresponding to the nonlinearity of human vision) on the post-chromatic-adaptation coordinates (X_i', Y_i', Z_i'), thereby acquiring the post-contrast-conversion coordinates (J, C_i, h). Here, J is the lightness component, C_i is the saturation component, and h is the hue component.

(e) The CPU 18 acquires the post-saturation-conversion coordinates (J, C_o, h) by multiplying the saturation component C_i of the coordinates (J, C_i, h) by the saturation gain G. Expressed as an equation, this saturation adjustment is C_o = C_i × G.

(f) The CPU 18 performs the output-side contrast conversion processing (contrast inverse conversion processing) by the contrast function T^-1 on the post-saturation-conversion coordinates (J, C_o, h), thereby acquiring the post-inverse-conversion coordinates (X_o, Y_o, Z_o).

(g) The CPU 18 performs the output-side chromatic adaptation conversion processing (chromatic adaptation inverse conversion processing) on the coordinates (X_o, Y_o, Z_o), thereby acquiring the coordinates (X_o', Y_o', Z_o'). These coordinates (X_o', Y_o', Z_o') are the coordinates after appearance conversion.

The conversion formula used in the chromatic adaptation inverse conversion processing corresponds to the inverse of the conversion formula of equation (9), with the adaptation degree D_o substituted for the adaptation degree D_i and the adaptation white point (X_wo, Y_wo, Z_wo) substituted for the adaptation white point (X_wi, Y_wi, Z_wi) (step S22).

  Step S23: The CPU 18 determines whether or not the value of the processing target label L has reached the number of areas N. If not, the process proceeds to step S24, and if it has reached, the process proceeds to step S25.

  Step S24: The CPU 18 increments the value of the processing target label L by 1, and then returns to step S15.

  Step S25: The CPU 18 performs image processing for the output device on the post-appearance-conversion coordinates (X_o', Y_o', Z_o') (here, image processing for the monitor 23, including color gamut mapping according to its output characteristics). Note that this image processing may also include image processing that matches the user's preference. The CPU 18 then displays the processed captured image on the monitor 23 in place of the captured image that was being reproduced and displayed. Thereafter, when a save instruction is input by the user, the CPU 18 writes the processed captured image to the storage medium 28 together with the RAW-format captured image data, and ends the flow.

  As described above, in the appearance conversion process of the present embodiment, the CPU 18 sets the parameters of the appearance conversion process according to the type of subject existing in the captured image (steps S15 to S19).

  Specifically, the CPU 18 of the present embodiment sets the adaptation degree D higher when the subject is a non-artifact than when the subject is an artifact (steps S17 and S18).

  A human may perceive different colors for subjects of the same color depending on whether or not the subject is a non-artifact (a person or food). For example, even when a human observes a banana and a cup of the same color under the same light source (for example, under an incandescent lamp), the actual color of the banana, being a non-artifact, is well known, so the coloring imparted by the light source (orange under an incandescent lamp) is completely discounted in the observer's mind; by contrast, the actual color of the cup, being an artifact, is often not known, so the coloring imparted by the light source (orange under an incandescent lamp) is thought not to be discounted.

  Therefore, by setting the adaptation degree D higher when the subject is a non-artifact (a person or food) than when the subject is an artifact, as in the appearance conversion processing of the present embodiment, good image reproduction as expected by the user can be achieved regardless of whether or not the subject is a non-artifact.

  Incidentally, because conventional appearance conversion processing is based only on human perception of color charts (that is, artifacts), it could achieve good, user-expected image reproduction for the cup, an artifact, but not for the banana, a non-artifact.

  Further, the CPU 18 of the present embodiment sets the adaptation degree D when the subject is food to the highest value (that is, 1) (step S17).

  Since humans are considered to memorize food colors particularly well, it is considered that this setting can bring food image reproduction closer to the user's expectations.

  Further, the CPU 18 of the present embodiment performs control so that the saturation gain G is not significantly increased when the subject is a person (step S19).

  Humans usually have the characteristic that perceived saturation increases when things are moved from a dark place to a bright place, so saturation conversion by the saturation gain G is effective. For human skin alone, however, the perceived saturation is thought not to increase very much, presumably because of the influence of past memories.

  Thus, if control is performed so that the saturation gain G does not increase remarkably when the subject is a person, the possibility that the reproduction of the person deviates from the user's expectation is reduced.

  Further, the CPU 18 of this embodiment sets the saturation gain G higher when the subject is an outdoor object than when the subject is an indoor object (step S20).

  Humans usually have the characteristic that perceived saturation increases when things are moved from a dark place to a bright place, so saturation conversion by the saturation gain G is effective; when the subject is an outdoor object, this increase in saturation is considered particularly remarkable.

  Therefore, setting the saturation gain G higher when the subject is an outdoor object, as in this embodiment, is considered to bring the reproduction of outdoor objects closer to the user's expectation.
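A corresponding sketch for the saturation gain G of steps S19 and S20. The names and numbers are illustrative assumptions; the embodiment only specifies the ordering (persons capped, outdoor objects counted higher than indoor ones):

```python
def saturation_gain(subject_attribute, base_gain=1.2):
    """Choose the saturation gain G per subject attribute
    (sketch of steps S19-S20)."""
    if subject_attribute == "person":
        return min(base_gain, 1.05)  # step S19: no significant increase for skin
    if subject_attribute == "outdoor":
        return base_gain * 1.25      # step S20: outdoor objects counted higher
    return base_gain                 # indoor objects and others
```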

  Further, the CPU 18 of the present embodiment sets the contrast function T higher when the subject is an indoor object than when the subject is an outdoor object (step S21).

  Since humans usually have the characteristic that the sense of contrast increases when things are viewed in a dark place, contrast conversion by the contrast function T is effective; when the subject is an indoor object, this increase in the sense of contrast is considered particularly remarkable.

  Therefore, setting the contrast function T higher when the subject is an indoor object, as in the present embodiment, is considered to bring the reproduction of indoor objects closer to the user's expectation.
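The contrast rule of step S21 can be sketched the same way; `contrast_strength` and the 1.3 factor are hypothetical, and `apply_contrast` is just one simple linear form that a contrast function T could take (the embodiment does not define T concretely):

```python
def contrast_strength(subject_attribute, base_t=1.0):
    """Weight for the contrast function T: indoor objects counted
    higher than outdoor objects (sketch of step S21)."""
    return base_t * 1.3 if subject_attribute == "indoor" else base_t

def apply_contrast(lightness, t):
    """Linear contrast stretch about mid-grey, clamped to [0, 1]."""
    return max(0.0, min(1.0, 0.5 + (lightness - 0.5) * t))
```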

  Further, the CPU 18 of the present embodiment sets the appearance conversion parameters for each subject (that is, for each region) present in the image, not for each image as a whole. Therefore, in this embodiment, the reproduction of every subject present in the image can be brought close to the user's expectation.
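A sketch of this per-region setting, using an illustrative lookup table; every concrete number here is an assumption (the embodiment gives only the relative ordering of the values, plus D = 1 for food):

```python
# Illustrative per-attribute parameter table (assumed values).
PARAMS = {
    "food":    {"D": 1.0, "G": 1.2, "T": 1.0},
    "person":  {"D": 0.9, "G": 1.0, "T": 1.0},
    "outdoor": {"D": 0.8, "G": 1.4, "T": 1.0},
    "indoor":  {"D": 0.8, "G": 1.2, "T": 1.3},
}
DEFAULT = {"D": 0.8, "G": 1.2, "T": 1.0}

def parameters_per_region(regions):
    """Map each detected subject region (region id -> subject attribute)
    to its own parameter set, rather than one set for the whole image."""
    return {rid: PARAMS.get(attr, DEFAULT) for rid, attr in regions.items()}
```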

[Modification]
The CPU 18 of the present embodiment uses CIECAM02 as the color appearance model, but other appearance models such as CIECAM97s may be used.

  In addition, the CPU 18 of the present embodiment sets the appearance conversion parameters according to the type of subject present in the captured image, but they may instead be set according to the type of shooting scene of the electronic camera. For example, the contrast function T and the saturation gain G when the shooting scene is a summer landscape may be set larger than when the shooting scene is another landscape (in other words, the correction coefficients Td and Md may be set to 1 or more). Likewise, the adaptation degree D when the shooting scene is a press conference may be set larger than when the shooting scene is another scene (that is, the correction coefficient Cd may be set close to 1).
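This scene-based variant could be sketched as follows, with hypothetical baselines; the text only constrains Td and Md to 1 or more for a summer landscape and Cd to a value close to 1 for a press conference:

```python
def scene_corrections(scene):
    """Correction coefficients for scene-based parameter setting:
    Td and Md scale the contrast function T and saturation gain G,
    Cd scales the adaptation degree D. Baselines are assumed values."""
    td = md = 1.0
    cd = 0.85                  # assumed default, below the maximum
    if scene == "summer_landscape":
        td, md = 1.2, 1.2      # Td, Md set to 1 or more
    elif scene == "press_conference":
        cd = 1.0               # Cd set close to 1
    return {"Td": td, "Md": md, "Cd": cd}
```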

  In the electronic camera of the present embodiment, the appearance conversion process is performed on the captured image when it is played back, but it may be performed at another timing, for example, when the image is captured.

  However, when the appearance conversion process is performed at a timing other than playback, the viewing environment condition on the output side cannot be acquired automatically by the electronic camera, so the user needs to input it. In this case, the user inputs into the electronic camera the viewing environment conditions under which the captured image is to be reproduced.

  In the appearance conversion process of the present embodiment, the input-side viewing environment condition is the viewing environment at the time of shooting and the output-side viewing environment condition is the viewing environment at the time of reproduction. However, the input-side condition may be the viewing environment during one reproduction and the output-side condition the viewing environment during another reproduction. In that case, the appearance conversion process becomes a process for making a captured image that was reproduced and displayed on the monitor 23 in one viewing environment look the same when it is reproduced and displayed on the monitor 23 in another viewing environment.

  Further, in the appearance conversion process of the present embodiment, both the input-side and output-side viewing environment conditions are conditions when the electronic camera is used, but at least one of them may be a condition when the electronic camera is not used. For example, the output-side viewing environment condition may be the condition when the captured image is observed on a computer monitor, or the condition when the captured image is observed as a print. In any case, however, the user needs to input any information that cannot be acquired automatically by the electronic camera.

  In this embodiment, the electronic camera performs all of the appearance conversion processing, but part or all of it may instead be executed by an image processing device other than the electronic camera, for example by image processing software installed in an image storage device, a printer, or a computer.

  DESCRIPTION OF SYMBOLS 11 ... Electronic camera, 12 ... Imaging optical system, 13 ... Lens drive part, 14 ... Diaphragm, 15 ... Diaphragm drive part, 16 ... Imaging element, 18 ... CPU, 25 ... Image processing part, 26 ... Recognition process part

JP 2006-135877 A; JP 2006-211093 A; JP 2000-148979 A; JP 4-271383 A; JP 2001-43376 A

Claims (13)

  1. An image processing apparatus comprising:
    conversion means for performing an appearance conversion process on a captured image observed under a first viewing environment so that, under a second viewing environment different from the first viewing environment, the captured image looks the same as it does under the first viewing environment;
    detecting means for detecting, from the captured image, a subject area where a subject exists, and detecting a subject attribute of the subject present in the subject area; and
    setting means for setting parameters of the appearance conversion process for each subject area and each subject attribute in the captured image.
  2. The image processing apparatus according to claim 1.
    wherein the appearance conversion process includes a chromatic adaptation conversion process, and
    the parameters to be set include a parameter of the adaptation degree used in the chromatic adaptation conversion process.
  3. The image processing apparatus according to claim 2,
    The setting means includes
    An image processing apparatus, wherein when the subject attribute is a non-artifact, the adaptation degree is set higher than when the subject attribute is an artifact.
  4. The image processing apparatus according to claim 2,
    The setting means includes
    An image processing apparatus, wherein when the subject attribute is food, the adaptation degree is set to the maximum value.
  5. The image processing apparatus according to claim 1.
    wherein the appearance conversion process includes a saturation conversion process, and
    the parameters set by the setting means include a parameter of the saturation gain used in the saturation conversion process.
  6. The image processing apparatus according to claim 5.
    The setting means includes
    An image processing apparatus, wherein when the subject attribute is a person, the saturation gain is set lower than when the subject attribute is other than a person.
  7. The image processing apparatus according to claim 5.
    The setting means includes
    The image processing apparatus, wherein when the subject attribute is an outdoor object, the saturation gain is set higher than when the subject attribute is an indoor object.
  8. The image processing apparatus according to claim 1.
    wherein the appearance conversion process includes a contrast conversion process, and
    the parameters set by the setting means include a parameter of the contrast function used in the contrast conversion process.
  9. The image processing apparatus according to claim 8.
    The setting means includes
    The image processing apparatus, wherein when the subject attribute is an indoor object, the contrast function is set higher than when the subject attribute is an outdoor object.
  10. An image processing apparatus comprising:
      conversion means for performing an appearance conversion process on a captured image observed under a first viewing environment so that, under a second viewing environment different from the first viewing environment, the captured image looks the same as it does under the first viewing environment;
      detecting means for detecting, from the captured image, a subject area where a subject exists, and detecting a subject attribute of the subject present in the subject area; and
      setting means for setting, when the subject attribute is food, the adaptation degree parameter of the appearance conversion process to the maximum value regardless of the first viewing environment and the second viewing environment.
  11. An imaging apparatus comprising:
    imaging means for capturing an image of a subject and generating a captured image; and
    the image processing apparatus according to claim 1.
  12. An image processing program causing a computer to execute:
    a conversion procedure for performing an appearance conversion process on a captured image observed under a first viewing environment so that, under a second viewing environment different from the first viewing environment, the captured image looks the same as it does under the first viewing environment;
    a detection procedure for detecting, from the captured image, a subject area where a subject exists, and detecting a subject attribute of the subject present in the subject area; and
    a setting procedure for setting parameters of the appearance conversion process for each subject area and each subject attribute in the captured image.
  13. An image processing program causing a computer to execute:
      a conversion procedure for performing an appearance conversion process on a captured image observed under a first viewing environment so that, under a second viewing environment different from the first viewing environment, the captured image looks the same as it does under the first viewing environment;
      a detection procedure for detecting, from the captured image, a subject area where a subject exists, and detecting a subject attribute of the subject present in the subject area; and
      a setting procedure for setting, when the subject attribute is food, the adaptation degree parameter of the appearance conversion process to the highest value regardless of the first viewing environment and the second viewing environment.
JP2009038076A 2009-02-20 2009-02-20 Image processing apparatus, imaging apparatus, and image processing program Active JP5369751B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009038076A JP5369751B2 (en) 2009-02-20 2009-02-20 Image processing apparatus, imaging apparatus, and image processing program


Publications (2)

Publication Number Publication Date
JP2010193375A JP2010193375A (en) 2010-09-02
JP5369751B2 true JP5369751B2 (en) 2013-12-18





Legal Events

Date Code Title Description
2012-02-16 A621 Written request for application examination
2012-10-24 A521 Written amendment
2013-04-18 A977 Report on retrieval
2013-05-14 A131 Notification of reasons for refusal
2013-07-12 A521 Written amendment
TRDD Decision of grant or rejection written
2013-08-20 A01 Written decision to grant a patent or to grant a registration (utility model)
2013-09-02 A61 First payment of annual fees (during grant procedure)
R150 Certificate of patent or registration of utility model (ref document number: 5369751; country of ref document: JP)
R250 Receipt of annual fees