WO2021014714A1 - Image processing device, image display device, image processing method, and program - Google Patents

Image processing device, image display device, image processing method, and program Download PDF

Info

Publication number
WO2021014714A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
color
pixels included
image processing
unit
Prior art date
Application number
PCT/JP2020/018560
Other languages
French (fr)
Japanese (ja)
Inventor
熊倉 威
Original Assignee
シャープ株式会社 (Sharp Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by シャープ株式会社 (Sharp Corporation)
Publication of WO2021014714A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/64 Circuits for processing colour signals

Definitions

  • The present disclosure relates to an image processing device, an image display device, an image processing method, and a program.
  • The present application claims priority based on Japanese Patent Application No. 2019-133404 filed in Japan on July 19, 2019, the contents of which are incorporated herein by reference.
  • Patent Document 1 discloses an image processing method capable of improving image quality by restoring the brightness and color of an input face image even when the brightness and color have deteriorated, for example because the image was shot under low illuminance.
  • One aspect of the present disclosure aims, for example, to provide an image processing apparatus or the like that performs more versatile color component adjustment processing.
  • The image processing apparatus includes, for example: a reference image generation unit that generates, based on an input image, a reference image in which the color components of at least some of the pixels included in the input image are changed; a color difference information calculation unit that calculates color difference information based on the color differences between the plurality of pixels included in the input image and the plurality of pixels included in the reference image; and a color component adjustment unit that generates, using the color difference information, an output image in which the color components of the plurality of pixels included in the input image are adjusted.
  • The control method of the image processing device includes, for example, generating, based on an input image, a reference image in which the color components of at least some of the plurality of pixels included in the input image are changed.
  • The program causes a computer to realize, for example: a reference image generation function of generating, based on an input image, a reference image in which the color components of at least some of the plurality of pixels included in the input image are changed; and a color component adjustment function of generating, using the color difference information, an output image in which the color components of the plurality of pixels included in the input image are adjusted.
  • FIG. 1 is a block diagram showing an example of the main configuration of the image processing apparatus according to Embodiment 1.
  • FIG. 2 is a block diagram showing an example of the main configuration of the reference image generation unit provided in the image processing apparatus.
  • FIG. 3 is a block diagram showing an example of the main configuration of the color difference information calculation unit provided in the image processing apparatus.
  • FIG. 4 is a block diagram showing an example of the main configuration of the color component adjustment unit provided in the image processing apparatus.
  • FIG. 5 is a flowchart showing an example of the processing executed by the image processing apparatus.
  • FIG. 9 is a block diagram showing an example of the main configuration of the reference image generation unit provided in the image processing apparatus according to Embodiment 3.
  • The image processing apparatus may generate, for example, a reference image in which the color components of at least some of the plurality of pixels included in the input image are changed, based on the input image.
  • The image processing device may calculate, for example, color difference information based on the color differences between the plurality of pixels included in the input image and the plurality of pixels included in the reference image.
  • The image processing apparatus may generate, for example, an output image in which the color components of the plurality of pixels included in the input image are adjusted using the color difference information.
  • With this configuration, the image processing device can realize more versatile color component adjustment than conventional image processing devices.
  • FIG. 1 is a block diagram showing an example of a main configuration of the image processing apparatus 100.
  • The image processing device 100 is a device that processes an input image 1a to generate an output image, and is, for example, a television, a personal computer, a smartphone, a tablet terminal, or a recording device (for example, a hard disk recorder).
  • The image processing device 100 includes, for example, a control unit 110.
  • The control unit 110 comprehensively controls the various functions of the image processing device 100 and may include, for example, an image conversion unit 101, an image storage unit 102, a reference image generation unit 103, a color difference information calculation unit 104, and a color component adjustment unit 105.
  • The image conversion unit 101 may convert the input image 1a so that the input image 1a can be expressed in the second color gamut, and may output the converted input image 1b to the image storage unit 102.
  • The input image 1a and the input image 1b may each be a still image or a moving image. Details of the processing executed by the image conversion unit 101 will be described later.
  • The image storage unit 102 may store, for example, the input image 1b input from the image conversion unit 101.
  • The image storage unit 102 can output the input image 1b to the reference image generation unit 103, the color difference information calculation unit 104, and the color component adjustment unit 105 at the timing required by each process.
  • This allows the image processing device 100 to process the input image 1b at appropriate timings.
  • The reference image generation unit 103 may generate, based on the input image 1b input from the image storage unit 102, a reference image 1c in which the color components of at least some of the pixels included in the input image 1b are changed, and may output the generated reference image 1c to the color difference information calculation unit 104.
  • The reference image generation unit 103 generates a color-reduced image 2a (see FIG. 2) in which the color components of the plurality of pixels included in the input image 1b are reduced, and may then generate, as the reference image 1c, a colored image in which a color component is added to each of the plurality of pixels included in the color-reduced image 2a. Details of the processing executed by the reference image generation unit 103 will be described later.
  • In other words, the image processing apparatus 100 may generate the color-reduced image 2a by reducing the color components of the input image 1b, and then output, as the reference image 1c, a colored image in which color components are added to the color-reduced image 2a. As a result, the image processing apparatus 100 can generate a reference image 1c in which the color components of at least some of the plurality of pixels included in the input image 1b are changed.
  • The color components of the plurality of pixels included in the input image 1b may be, for example, the components other than the component corresponding to luminance in each pixel value.
  • For example, in the YCbCr color space the Cb and Cr components are the "color components", in the HSV color space the H and S components are the "color components", and in the CIELAB color space the a* and b* components are the "color components".
  • However, the definition of the color components is not limited to these examples.
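As a concrete illustration of these definitions, the split of a pixel value into a luminance component and color components can be sketched in a few lines. This is a minimal sketch, not part of the patent: the BT.601 luma weights and the Cb/Cr scale factors used here are one common convention among several.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Split an RGB pixel array into luma (Y) and color components (Cb, Cr).

    `rgb` is a float array in [0, 1] with shape (..., 3). The BT.601
    weights below are purely illustrative; the patent does not fix a
    particular conversion.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance component
    cb = 0.564 * (b - y)                    # color component 1
    cr = 0.713 * (r - y)                    # color component 2
    return y, cb, cr

# A neutral gray pixel has zero color components.
y, cb, cr = rgb_to_ycbcr(np.array([0.5, 0.5, 0.5]))
```

For a gray pixel Y equals the input level and both color components vanish, which is exactly why zeroing Cb and Cr (as the color reduction unit 221 does below) yields a monochrome image.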
  • The color difference information calculation unit 104 may calculate, for example, color difference information 1d based on the color differences between the plurality of pixels included in the input image 1b input from the image storage unit 102 and the plurality of pixels included in the reference image 1c input from the reference image generation unit 103.
  • The color difference information calculation unit 104 may output the calculated color difference information 1d to the color component adjustment unit 105. Details of the processing executed by the color difference information calculation unit 104 will be described later.
  • The color component adjustment unit 105 generates, using the color difference information 1d input from the color difference information calculation unit 104, an output image 1e in which the color components of the plurality of pixels included in the input image 1b are adjusted, and may output the output image 1e.
  • For example, the color component adjustment unit 105 may increase the saturation of each of the plurality of pixels included in the input image 1b as the distance (color difference value) corresponding to that pixel increases. Details of the processing executed by the color component adjustment unit 105 will be described later.
  • FIG. 2 is a block diagram showing an example of the configuration of a main part of the reference image generation unit 103 included in the image processing device 100.
  • The reference image generation unit 103 may include a color reduction unit 221 and a coloring unit 222.
  • The coloring unit 222 may include, for example, a first deep learning device 223 and a storage unit 224.
  • The color reduction unit (reference image generation unit) 221 may generate, for example, a color-reduced image 2a in which the color components of the plurality of pixels included in the input image 1b input from the image storage unit 102 are reduced.
  • In Embodiment 1, the case where the color reduction unit 221 generates the color-reduced image 2a by setting the color components of the plurality of pixels included in the input image 1b to zero will be described. That is, in Embodiment 1, the color-reduced image 2a is a monochrome image.
  • The color reduction unit 221 may also generate the color-reduced image 2a by, for example, converting the luminance values of the plurality of pixels included in the input image 1b linearly or non-linearly by any method. In this case, the color reduction unit 221 computes only luminance values and discards the color components, so the process is equivalent to setting the color components to zero.
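The zeroing operation of the color reduction unit 221 can be sketched as follows. The (Y, Cb, Cr) channel layout is an assumption of this sketch, not something the patent prescribes.

```python
import numpy as np

def reduce_color(ycbcr):
    """Set the Cb/Cr color components of every pixel to zero, leaving
    only luma -- i.e. produce the monochrome color-reduced image 2a.

    `ycbcr` has shape (H, W, 3) with channels (Y, Cb, Cr).
    """
    out = ycbcr.copy()
    out[..., 1:] = 0.0  # zero both color components, keep luminance
    return out
```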
  • The coloring unit (reference image generation unit) 222 may generate, for example, as the reference image 1c, a colored image in which color components are added to the plurality of pixels included in the color-reduced image 2a.
  • The coloring unit 222 adds color components to the color-reduced image 2a by using the first deep learning device (first trained model) 223 defined by the first model parameter 225.
  • In Embodiment 1, the coloring unit 222 adds color components to the monochrome image using the first deep learning device 223.
  • The first deep learning device 223 may be, for example, a neural network (for example, a multilayer perceptron, including deep models). The first model parameter 225 may be, for example, data including the connection weights of the neural network constituting the first deep learning device 223.
  • The first model parameter 225 is optimized, for example, as follows. First, the image processing device 100 collects color source images capturing various subjects, by an appropriate method (for example, randomly), from an arbitrary information source (for example, the Internet). Next, the image processing device 100 generates a color-reduced image 2a from each color source image by the process executed by the color reduction unit 221. As a result, combinations in which the color-reduced image 2a is the input data and the color source image is the correct data are obtained as teacher data. Then, the image processing device 100 optimizes the first model parameter 225 by training the first deep learning device 223 using the teacher data.
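The teacher-data construction described above (color-reduced image as input data, color source image as correct data) can be sketched as follows. The helper name and the synthetic random images are illustrative only; real training would draw color source images from an external corpus.

```python
import numpy as np

def make_teacher_pair(color_image):
    """Build one (input, target) training pair: the color-reduced image
    is the network input, the original color image is the correct answer.

    `color_image` is an (H, W, 3) float array with channels (Y, Cb, Cr);
    zeroing the color components follows the color reduction unit 221.
    """
    reduced = color_image.copy()
    reduced[..., 1:] = 0.0
    return reduced, color_image  # (input data, correct data)

# Collect pairs from a set of source images to form the teacher data.
dataset = [make_teacher_pair(img)
           for img in (np.random.rand(8, 8, 3) for _ in range(4))]
```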
  • The teacher data used for training the first deep learning device 223 can be created from the narrow-color-gamut images that already exist in abundance throughout the world, so the image processing device 100 can easily prepare teacher data. Since being able to prepare a large amount of teacher data means that the learning accuracy of the deep learning device can be improved, the image processing apparatus 100 can perform the color component adjustment processing with higher accuracy.
  • When necessary, the image processing device 100 may, for example, convert the color gamut of the color source images and generate the color-reduced images 2a from the converted images.
  • It is also important that the teacher data be diverse. If there is enough teacher data randomly sampled from the population (for example, all the images in the world), the set of teacher data can be assumed to reflect the nature of the population.
  • The accuracy with which humans remember the color of an object depends on how often they personally encounter that object. If the diversity described above is properly secured, the colors of universally distributed objects are reproduced with high accuracy in the reference image 1c. In other words, for such objects the image processing device 100 hardly emphasizes the colors reflected in the input image 1b.
  • By contrast, the colors of objects that are not universally distributed are not necessarily reproduced with high accuracy in the reference image 1c.
  • Likewise, the colors of objects with large color variation (for example, objects whose color changes with the season and the weather, such as mountains and the sky) are not necessarily reproduced with high accuracy in the reference image 1c.
  • For such objects, the image processing device 100 may, for example, greatly emphasize the colors reflected in the input image 1b and generate the adjusted image as the output image 1e.
  • The image processing device 100 exploits this human property to distinguish universally distributed objects from non-universally distributed objects and objects with large color variation, and adjusts the input image 1a so as to emphasize only the colors of the latter.
  • As a result, the image processing device 100 can realize more versatile color component adjustment than conventional image processing devices, which can adjust colors only for objects assumed in advance.
  • The first model parameter 225 is stored in the storage unit 224. Further, the first model parameter 225 may be generated by training the first deep learning device 223 in advance outside the image processing device 100 (on a device other than the image processing device 100), before the first model parameter 225 is incorporated into the image processing device 100. The storage unit 224 only needs to be a storage device capable of reading and writing arbitrary data; its configuration and form are not limited. The first model parameter 225 may be loaded from the storage unit 224 into the first deep learning device 223, for example, when the first deep learning device 223 operates.
  • FIG. 3 is a block diagram showing a main configuration of the color difference information calculation unit 104 included in the image processing device 100.
  • The color difference information calculation unit 104 may include a first color space conversion unit 321, a second color space conversion unit 322, a color difference calculation unit 323, and a color difference normalization unit 324.
  • The color difference information calculation unit 104 calculates, in a predetermined color space, a plurality of color difference values, each being the distance between the pixel value of a pixel of the input image 1b and the pixel value of the pixel of the reference image 1c at the corresponding position, and outputs the plurality of color difference values as part of the color difference information 1d.
  • The first color space conversion unit 321 may, for example, convert the pixel value of each pixel of the reference image 1c into a pixel value in a predetermined color space.
  • The second color space conversion unit 322 may, for example, convert the pixel value of each pixel of the input image 1b into a pixel value in the predetermined color space.
  • The predetermined color space used by the first color space conversion unit 321 and the predetermined color space used by the second color space conversion unit 322 may be the same color space or different color spaces.
  • Each conversion may be a linear transformation or a non-linear transformation, set appropriately according to the purpose.
  • The predetermined color space may be, for example, a uniform color space.
  • In Embodiment 1, the predetermined color space is the CIELAB color space, which is one of the uniform color spaces.
  • The predetermined color space may instead be the CIELUV color space, or a color space (color difference metric) with higher uniformity such as CIEDE2000.
  • The predetermined color space may also be a color space other than a uniform color space (for example, the YCbCr color space).
  • The color difference calculation unit (color difference information calculation unit) 323 may, for example, calculate, in the predetermined color space, the distances between the pixel values of the plurality of pixels included in the input image 1b and the pixel values of the corresponding plurality of pixels included in the reference image 1c, and include the distances as part of the color difference information 1d.
  • The color difference calculation unit 323 may use each such distance as the color difference value of the corresponding pixel.
  • The distance may be any distance, such as the Euclidean distance, the Manhattan distance, or the squared distance.
  • The color difference calculation unit 323 can calculate a plurality of color difference values; for example, it may calculate color difference values for all pixels of the input image 1b and the reference image 1c. Details of the processing executed by the color difference calculation unit 323 will be described later.
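Assuming both images have already been converted to the predetermined color space (for example CIELAB), the per-pixel Euclidean distance of the color difference calculation unit 323 reduces to a one-line computation:

```python
import numpy as np

def color_difference(lab_input, lab_reference):
    """Per-pixel Euclidean distance between the input image and the
    reference image in a (here assumed CIELAB-like) color space.

    Both arguments have shape (H, W, 3); the result has shape (H, W),
    one color difference value per pixel.
    """
    return np.sqrt(np.sum((lab_input - lab_reference) ** 2, axis=-1))
```

Swapping in the Manhattan or squared distance mentioned above only changes the reduction over the last axis.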
  • The color difference normalization unit 324 may, for example, normalize the plurality of color difference values obtained by the color difference calculation unit 323 and output the normalized color difference values in the color difference information 1d.
  • For example, the color difference normalization unit 324 sets to zero any color difference value at or below a predetermined value, subtracts the predetermined value from any color difference value larger than the predetermined value, and outputs the results in the color difference information 1d.
  • This normalization lets the image processing device 100 distinguish, by the magnitude of the color difference value, the portions of the input image 1b showing universally distributed objects from the portions showing non-universally distributed objects and objects with large color variation.
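The normalization rule just described (zero at or below a predetermined value, subtract the predetermined value otherwise) is a soft threshold and can be sketched as:

```python
import numpy as np

def normalize_color_difference(diff, threshold):
    """Soft-threshold normalization of color difference values, as
    described for the color difference normalization unit 324: values
    at or below `threshold` become zero, larger values have `threshold`
    subtracted.
    """
    return np.maximum(diff - threshold, 0.0)
```

Small differences (universally distributed objects) are thus suppressed entirely, while large ones survive, shifted down by the threshold.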
  • FIG. 4 is a block diagram showing the configuration of the color component adjustment unit 105.
  • The color component adjustment unit 105 may include a third color space conversion unit 421, a saturation adjustment unit 422, and a color space inverse conversion unit 423.
  • The third color space conversion unit (color component adjustment unit) 421 may generate a second image by, for example, converting the input image 1b into an image in a color space having saturation as an axis component.
  • The color space may be, for example, the HSL color space or the HSV color space.
  • The saturation adjustment unit (color component adjustment unit) 422 may generate the output image 1e based on the converted image. Specifically, the saturation adjustment unit 422 may generate, for example, a third image in which the saturation component of each pixel of the second image input from the third color space conversion unit 421 is adjusted according to the color difference information 1d input from the color difference information calculation unit 104.
  • For example, the saturation adjustment unit 422 may adjust the saturation of each pixel of the second image according to the magnitude of the color difference value included in the color difference information 1d. Specifically, the saturation adjustment unit 422 may adjust the saturation so that the adjusted saturation S2 falls in the range 0.0 to 1.0. The saturation adjustment unit 422 may also emphasize the saturation more strongly for pixels with larger color difference values. The saturation adjustment unit 422 may then output, as the third image, an image in which the HSL pixel value of each pixel is (H, S2, L).
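One way to realize the emphasis rule above (stronger emphasis for larger color difference, result clamped to 0.0 to 1.0) is a simple gain curve. The function and the `gain` parameter are illustrative assumptions; the patent does not prescribe a particular formula.

```python
import numpy as np

def adjust_saturation(s, diff, gain=1.0):
    """Emphasize saturation in proportion to the normalized color
    difference and clamp the adjusted saturation S2 to [0.0, 1.0].
    `gain` is a hypothetical tuning knob, not named in the text.
    """
    s2 = s * (1.0 + gain * diff)
    return np.clip(s2, 0.0, 1.0)
```

A pixel with zero color difference keeps its saturation unchanged, matching the behavior described for universally distributed objects.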
  • The color space inverse conversion unit (color component adjustment unit) 423 generates the output image 1e by, for example, converting the third image input from the saturation adjustment unit 422 into an image in the color space of the input image 1b, and may output the output image 1e.
  • For example, the color space inverse conversion unit 423 may generate the output image 1e by converting the third image, in which the HSL pixel value of each pixel is (H, S2, L), into an image in the RGB color space, which is the color space of the input image 1b. That is, while the third color space conversion unit 421 converts the image from the RGB color space to the HSL color space, the color space inverse conversion unit 423 executes the reverse conversion.
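The conversion, adjustment, and inverse-conversion round trip for a single pixel can be sketched with Python's standard `colorsys` module. Note that `colorsys` uses HLS ordering rather than HSL; the `gain` parameter is a hypothetical knob for illustration.

```python
import colorsys

def adjust_pixel(rgb, diff, gain=1.0):
    """Round-trip one pixel: RGB -> (H, L, S), emphasize the saturation
    component according to the color difference, convert back to RGB.
    Mirrors units 421 (conversion), 422 (adjustment), 423 (inverse).
    """
    h, l, s = colorsys.rgb_to_hls(*rgb)            # unit 421
    s2 = min(1.0, max(0.0, s * (1.0 + gain * diff)))  # unit 422
    return colorsys.hls_to_rgb(h, l, s2)           # unit 423
```

With a color difference of zero the round trip is the identity, so pixels of universally distributed objects pass through unchanged.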
  • Since the third color space conversion unit 421 converts the input image 1b into a color space having saturation as an axis component before the color components are adjusted, the color space inverse conversion unit 423 can easily convert the result back. The image processing device 100 can therefore keep the adjustment and inverse conversion processes simple.
  • In summary, the color component adjustment unit 105 may convert the input image 1b into a second image in a color space having saturation as an axis component, generate a third image in which the saturation component of each pixel of the second image is adjusted according to the color difference information 1d, and inversely convert the third image into the color space of the input image 1b to generate the output image 1e. That is, the color component adjustment unit 105 may perform saturation adjustment of each pixel of the input image 1b as the color component adjustment processing of each pixel of the input image 1b.
  • It is preferable that the color gamut targeted by the color component adjustment processing be sufficiently wide, because the wider the color gamut, the greater the degree of freedom in the color component adjustment processing. For this reason, the image processing apparatus 100 may convert the input image 1b into an image having a wide color gamut at the initial stage of the series of processes. This prevents the pixel value of each pixel from falling outside the color gamut in the processing that follows the conversion.
  • For example, the color gamut targeted by each process in the image processing apparatus 100 may be the color gamut specified in ITU-R Recommendation BT.2020 (hereinafter, Rec. 2020).
  • For example, when the input image 1a is an image in an RGB color space conforming to the color gamut defined in ITU-R Recommendation BT.709 (hereinafter, Rec. 709) and each process in the image processing apparatus 100 targets the Rec. 2020 color gamut, the image conversion unit 101 converts the input image 1a into an image in the XYZ color space and then into an image in an RGB color space compliant with Rec. 2020.
  • When the input image 1a is in some other color gamut, the image conversion unit 101 can convert the image by executing similar processing.
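The Rec. 709 to XYZ to Rec. 2020 conversion can be sketched with the published RGB-to-XYZ matrices. This sketch assumes linear-light (gamma-removed) RGB values; the text does not discuss transfer functions, and the function name is ours.

```python
import numpy as np

# Linear-light RGB -> XYZ matrices (D65 white), rounded values from the
# Rec.709 and Rec.2020 specifications.
M_709_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                         [0.2126, 0.7152, 0.0722],
                         [0.0193, 0.1192, 0.9505]])
M_2020_TO_XYZ = np.array([[0.6370, 0.1446, 0.1689],
                          [0.2627, 0.6780, 0.0593],
                          [0.0000, 0.0281, 1.0610]])

def rec709_to_rec2020(rgb):
    """Convert linear Rec.709 RGB pixels (shape (..., 3)) to linear
    Rec.2020 RGB via the XYZ color space, as the image conversion
    unit 101 does."""
    xyz = rgb @ M_709_TO_XYZ.T
    return xyz @ np.linalg.inv(M_2020_TO_XYZ).T
```

Because both gamuts share the D65 white point, reference white maps (up to rounding) to reference white, while saturated Rec. 709 primaries land strictly inside the wider Rec. 2020 gamut.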
  • When the color gamut of the input image 1a is the same as the target color gamut, the image conversion unit 101 does not need to execute the color gamut conversion processing and may send the input image 1a to the image storage unit 102 without conversion. Further, when the conversion processing is unnecessary, for example because the color gamut of the input image 1a is fixed in advance to one type, the image processing device 100 need not include the image conversion unit 101.
  • When the color gamut of the input image 1a is wider than the color gamut targeted by each process in the image processing device 100, processing to fit the wide-gamut image into the narrow color gamut is necessary. Various well-known prior arts can be applied to this processing.
  • In the following description, the image converted by the image conversion unit 101 is an image in an RGB color space compliant with Rec. 2020.
  • As described above, the color difference value generated by the color difference information calculation unit 104 is relatively small in portions showing universally distributed objects. The color component adjustment unit 105 therefore hardly changes the saturation of the pixels corresponding to objects whose color change would feel unnatural to humans.
  • On the other hand, the color difference value generated by the color difference information calculation unit 104 is relatively large in portions showing non-universally distributed objects and objects with large color variation. The color component adjustment unit 105 therefore greatly changes the saturation of the pixels corresponding to objects whose color change causes little discomfort. That is, the color component adjustment unit 105 increases the saturation of each pixel of the input image 1b as the color difference value for that pixel increases.
  • In this way, the image processing device 100 can execute the color component adjustment processing on the input image 1a without giving humans a sense of discomfort for any object in the image.
  • FIG. 5 is a flowchart showing an example of processing executed by the image processing apparatus 100.
  • If necessary, the image conversion unit 101 converts the input image 1a into an image in the color gamut used in the processing of the image processing device 100 (S501).
  • The reference image generation unit 103 generates, based on the input image 1b, a reference image 1c in which the color components of at least some of the pixels included in the input image 1b are changed (S502).
  • The color difference information calculation unit 104 calculates the color difference information 1d based on the color differences between the plurality of pixels included in the input image 1b and the plurality of pixels included in the reference image 1c (S503).
  • The color component adjustment unit 105 generates an output image 1e in which the color components of the plurality of pixels included in the input image 1b are adjusted using the color difference information 1d (S504), and outputs the output image 1e (S505).
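The flow S501 to S505 can be summarized as a small driver function. The four callables passed in stand for units 101, 103, 104, and 105; their names and signatures are illustrative assumptions.

```python
import numpy as np

def process(input_1a, convert, generate_reference, color_difference,
            adjust):
    """Sketch of the flow S501-S505: convert the gamut, generate the
    reference image, compute color difference information, adjust the
    color components, and return the output image 1e."""
    input_1b = convert(input_1a)                        # S501
    reference_1c = generate_reference(input_1b)         # S502
    diff_1d = color_difference(input_1b, reference_1c)  # S503
    return adjust(input_1b, diff_1d)                    # S504/S505
```

A degenerate instantiation (identity conversion, identity reference, subtraction as the difference, addition as the adjustment) returns the input unchanged, which is a convenient sanity check of the wiring.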
  • FIG. 6 is a flowchart showing an example of processing executed by the reference image generation unit 103 included in the image processing device 100.
  • FIG. 6 is a flowchart illustrating an example of step S502 included in the flowchart illustrated in FIG. 5 in more detail.
  • First, the color reduction unit 221 generates a color-reduced image 2a in which the color components of the plurality of pixels included in the input image 1b are reduced (S601). Then, the coloring unit 222 generates, as the reference image 1c, a colored image in which a color component is added to each of the plurality of pixels included in the color-reduced image 2a (S602).
  • FIG. 7 is a flowchart showing an example of the processing executed by the color difference information calculation unit 104 provided in the image processing apparatus 100.
  • FIG. 7 is a flowchart illustrating an example of step S503 included in the flowchart illustrated in FIG. 5 in more detail.
  • First, the first color space conversion unit 321 converts the pixel value of each pixel of the reference image 1c into a pixel value in the predetermined color space (S701).
  • Next, the second color space conversion unit 322 converts the pixel value of each pixel of the input image 1b into a pixel value in the predetermined color space (S702).
  • Next, the color difference calculation unit 323 calculates, for each pixel position shared by the input image 1b and the reference image 1c, the distance between the two pixel values in the predetermined color space, yielding a plurality of color difference values (S703). That is, the color difference calculation unit 323 calculates, in the predetermined color space, the distances between the pixel values of the plurality of pixels included in the input image 1b and the pixel values of the corresponding plurality of pixels included in the reference image 1c.
  • Then, the color difference normalization unit 324 normalizes the plurality of color difference values obtained by the color difference calculation unit 323, includes the normalized color difference values in the color difference information 1d, and outputs it (S704).
  • The processes of S701 and S702 may be performed in a different order or in parallel.
  • FIG. 8 is a flowchart showing an example of processing executed by the color component adjusting unit 105 included in the image processing apparatus 100.
  • FIG. 8 is a flowchart illustrating an example of step S504 included in the flowchart illustrated in FIG. 5 in more detail.
  • First, the third color space conversion unit 421 converts the input image 1b into an image (second image) in a color space having saturation as an axis component (S801).
  • Next, the saturation adjustment unit 422 generates an image (third image) in which the saturation components of the plurality of pixels included in the second image are adjusted according to the color difference information 1d (S802).
  • Then, the color space inverse conversion unit 423 generates the output image 1e by inversely converting the third image into the color space of the input image 1b (S803).
  • In this way, the image processing device 100 can realize more versatile color component adjustment than conventional image processing devices, which can adjust colors only for objects assumed in advance in the image.
  • the control unit 110 included in the image processing device 100 may include, for example, the reference image generation unit 903 instead of the reference image generation unit 103.
  • FIG. 9 is a block diagram showing an example of the configuration of the main part of the reference image generation unit 903.
  • the reference image generation unit 903 includes a color reduction unit 921 in place of the color reduction unit 221.
  • the storage unit 224 stores the first model parameter 925 in place of the first model parameter 225.
  • the color reduction unit (reference image generation unit) 921 may generate, for example, a color reduction image 2a in which the color components of a plurality of pixels included in the input image 1b input from the image storage unit 102 are reduced.
  • in the first embodiment, the case where the color reduction unit 221 generates a color reduction image 2a (monochrome image) by setting the color components of a plurality of pixels included in the input image 1b to zero has been described.
  • in the present embodiment, the case where the color reduction unit 921 generates the color reduction image 2a by keeping the color components of the plurality of pixels included in the input image 1b larger than zero will be described. That is, the color reduction image 2a generated by the color reduction unit 921 is an image in which the color is reduced so that some color component remains. In other words, the color reduction unit 921 leaves, in the color reduction image 2a, a clue of the color component for the coloring unit 222 to use when coloring.
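One way to realize such a clue-preserving color reduction is to blend each pixel toward its grey value while keeping a small residual of the original color. The residual factor and the BT.601 luma weights below are illustrative assumptions, not values from the patent:

```python
def reduce_color(pixel_rgb, residual=0.1):
    # Grey value using ITU-R BT.601 luma weights.
    r, g, b = pixel_rgb
    grey = 0.299 * r + 0.587 * g + 0.114 * b
    # residual = 0 gives a monochrome pixel (as in Embodiment 1);
    # residual > 0 keeps a clue of the color component (Embodiment 2).
    return tuple(grey + residual * (c - grey) for c in pixel_rgb)
```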
  • the first model parameter 925 is optimized in the same way as the first model parameter 225. However, the first model parameter 925 is optimized using the color reduction image 2a generated by the color reduction unit 921 as input data. That is, the first model parameter 925 is optimized using, as teacher data, combinations of the color-reduced image 2a in which the clue for coloring is left and the color source image.
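The teacher-data pairing described above could be sketched as follows; the luma-blend reduction inside the helper is a stand-in for the color reduction unit 921, and the function name is hypothetical:

```python
def make_teacher_data(color_source_images, residual=0.1):
    # Each training pair: (clue-preserving color-reduced image as
    # input data, original color source image as correct answer).
    def reduce_px(p):
        grey = 0.299 * p[0] + 0.587 * p[1] + 0.114 * p[2]
        return tuple(grey + residual * (c - grey) for c in p)
    return [([reduce_px(px) for px in img], img)
            for img in color_source_images]
```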
  • the image processing device 100 can perform coloring correctly with the first deep learner 223 defined by the first model parameter 925. That is, the image processing device 100 can reduce the probability of executing a color component adjusting process that makes humans feel uncomfortable, and can increase the versatility of the color component adjusting process.
  • A third embodiment (Embodiment 3) will be described with reference to FIGS. 1, 2, and 10.
  • the same or equivalent elements are designated by the same reference numerals, and duplicate description will be omitted.
  • the control unit 110 included in the image processing device 100 may include, for example, the reference image generation unit 1003 instead of the reference image generation unit 103.
  • FIG. 10 is a block diagram showing an example of the configuration of the main part of the reference image generation unit 1003. As illustrated in FIG. 10, the reference image generation unit 1003 includes a color reduction unit 1021 in place of the color reduction unit 221.
  • the color reduction unit (reference image generation unit) 1021 may generate, for example, a color reduction image 2d in which the color components of a plurality of pixels included in the input image 1b input from the image storage unit 102 are reduced.
  • the color reduction unit 1021 includes a second deep learner (second trained model) 1026 defined by the second model parameter 1028 and a storage unit 1027.
  • the storage unit 1027 stores the second model parameter 1028.
  • the storage unit 224 stores the first model parameter 1025.
  • the second deep learner 1026 may be, for example, a neural network (for example, a multi-layer perceptron (including a deep model)). Further, the second model parameter 1028 may be, for example, data including the connection weight of the neural network constituting the second deep learner 1026.
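As a toy illustration of "model parameter = connection weights of a neural network", the snippet below stores the weights of a small multi-layer perceptron as nested lists. The sizes, initialization range, and activation are arbitrary choices for illustration, not taken from the patent:

```python
import math
import random

def init_mlp_params(sizes, seed=0):
    # One weight matrix per layer; this nested list plays the role of
    # the "model parameter" (connection weights) described above.
    rng = random.Random(seed)
    return [[[rng.uniform(-0.1, 0.1) for _ in range(sizes[i])]
             for _ in range(sizes[i + 1])]
            for i in range(len(sizes) - 1)]

def mlp_forward(params, x):
    # Forward pass of the perceptron defined by the model parameter.
    for layer in params:
        x = [math.tanh(sum(w * xi for w, xi in zip(neuron, x)))
             for neuron in layer]
    return x
```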
  • the second model parameter 1028 is optimized as follows, for example.
  • the image processing device 100 collects color source images (input image 1a) in which various subjects are captured by an appropriate method (for example, randomly) from an arbitrary information source (for example, the Internet).
  • the image processing device 100 converts the color gamut of the color source image by a process executed by the image conversion unit 101.
  • the image processing device 100 generates a color-reduced image 2a in which the color source image is reduced by a process executed by the color-reducing unit 221.
  • the reduced color image 2a may be a monochrome image as in the first embodiment.
  • the image processing device 100 optimizes the second model parameter 1028 by training the second deep learner 1026 using teacher data (for example, combinations of the color source image as input data and the color-reduced image 2a as correct answer data).
  • the first model parameter 1025 is optimized as follows, for example. First, the image processing device 100 converts the color gamut of the color source image by a process executed by the image conversion unit 101. Next, the image processing device 100 generates a color-reduced image 2d obtained by reducing the color of the converted color source image by a process executed by the second deep learning device 1026 defined by the second model parameter 1028.
  • the image processing device 100 optimizes the first model parameter 1025 by training the first deep learner 223 using teacher data (for example, combinations of the color-reduced image 2d as input data and the color source image as correct answer data).
  • the image processing device 100 may, for example, continue the optimization until the second deep learner 1026 can completely infer the monochrome image, or may intentionally stop the optimization partway, before such inference becomes possible. In the following, the explanation will be continued assuming the latter case.
  • when the learning (optimization of the second model parameter 1028) is stopped partway, the second deep learner 1026 generates an image in which the color is reduced so that some color components of the input image 1b remain.
  • familiar (universally distributed) objects undergo color reduction even when the learning is shallow, whereas unfamiliar objects and objects with large color variation retain their original color components.
  • an object whose color the second deep learner 1026 could not completely reduce (an object whose original color component remains) tends to be an object to which it is difficult for the first deep learner 223 to add a color component. This is because the first model parameter 1025 and the second model parameter 1028 are parameters optimized using teacher data generated from the same color source images.
  • the second deep learner 1026 leaves, in the color-reduced image 2d, clues of the color components for coloring according to the distribution of objects. Therefore, the image processing device 100 can perform coloring correctly with the first deep learner 223 defined by the first model parameter 1025. That is, the image processing device 100 can reduce the probability of executing a color component adjusting process that makes humans feel uncomfortable, and can further enhance the versatility of the color component adjusting process.
  • A fourth embodiment (Embodiment 4) will be described with reference to FIGS. 11 and 12.
  • the same or equivalent elements are designated by the same reference numerals, and duplicate description will be omitted.
  • FIG. 11 is a block diagram showing an example of the main configuration of the image processing apparatus 1100.
  • the image processing device 1100 is a device that processes the input input image 1a to generate the output image 1e, and includes, for example, a control unit 1110.
  • the control unit 1110 has a function of comprehensively controlling various functions of the image processing device 1100, and includes a color component adjusting unit 1105 in place of the color component adjusting unit 105 (see FIG. 1) included in the control unit 110 (see FIG. 1). Further, the control unit 1110 may further include, for example, a user adjustment amount input unit 1106.
  • the user adjustment amount input unit 1106 may acquire, for example, an input related to the user adjustment amount 11a.
  • the user adjustment amount 11a may be, for example, an amount for adjusting the color component adjustment amount, input via an input device (for example, a keyboard, a remote controller displayed on the screen, a volume controller, etc.) connectable to the image processing device 1100.
  • the user adjustment amount 11a may be, for example, a real number within the range of 0 to 2, or a discrete value among strong (representing the real number 2), medium (representing 1), and weak (representing 0).
  • FIG. 12 is a block diagram showing an example of the configuration of the main part of the color component adjusting unit 1105.
  • the color component adjusting unit 1105 may further include, for example, a color difference information adjusting unit 1224 in addition to each unit provided by the color component adjusting unit 105.
  • the color difference information adjustment unit (color component adjustment unit) 1224 further uses the user adjustment amount 11a to adjust the color components of a plurality of pixels included in the input image 1b to generate the output image 1e.
  • the color difference information adjustment unit 1224 may, for example, output to the saturation adjustment unit 422, as the adjusted color difference value, a value obtained by multiplying the color difference information 1d (color difference value) input from the color difference information calculation unit 104 by the user adjustment amount 11a.
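The user adjustment path can be sketched in a few lines; the preset names and the clamping follow the 0-to-2 example above, while the function name and dictionary are hypothetical:

```python
PRESETS = {"strong": 2.0, "medium": 1.0, "weak": 0.0}

def adjusted_color_difference(diff, user_amount):
    # Accept either a discrete preset or a real number in [0, 2].
    if isinstance(user_amount, str):
        user_amount = PRESETS[user_amount]
    user_amount = max(0.0, min(2.0, user_amount))
    # The color difference value is simply scaled by the user amount.
    return diff * user_amount
```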
  • the image processing device 1100 allows the user to fine-tune the color component. Therefore, the image processing apparatus 1100 can further enhance the versatility of the color component adjusting process.
  • A fifth embodiment (Embodiment 5) will be described with reference to FIGS. 1 and 13.
  • the same or equivalent elements are designated by the same reference numerals, and duplicate description will be omitted.
  • the control unit 110 included in the image processing device 100 may include, for example, the reference image generation unit 1303 instead of the reference image generation unit 103.
  • FIG. 13 is a block diagram showing an example of the configuration of the main part of the reference image generation unit 1303.
  • the reference image generation unit 1303 further includes a model parameter replacement unit 1329.
  • the model parameter replacement unit 1329 may replace, for example, the first model parameter 225 with a model parameter (for example, model parameter 1330, model parameter 1331) different from the first model parameter 225.
  • the model parameter replacement unit 1329 may, for example, obtain the different model parameters from a model parameter storage server 1309 communicably connected to the image processing device 100 via a network 1308, and replace the first model parameter 225 with the different model parameters.
  • the different model parameters are parameters optimized with teacher data different from the teacher data used for the optimization of the first model parameter 225.
  • the method of accumulating the different model parameters is not limited to cloud storage such as the model parameter storage server 1309, and may be, for example, a local disk.
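A minimal local stand-in for the model parameter replacement unit; the class and method names are hypothetical, and a real deployment would fetch the parameter sets from the storage server 1309 or a local disk:

```python
class ModelParameterStore:
    # Keeps named parameter sets and swaps the active one, mimicking
    # the replacement of the first model parameter 225.
    def __init__(self, initial):
        self.active = initial
        self.store = {}

    def register(self, name, params):
        self.store[name] = params

    def replace(self, name):
        self.active = self.store[name]
        return self.active
```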
  • according to the image processing device 100 including the reference image generation unit 1303, by replacing the first model parameter 225 with model parameters optimized with various kinds of teacher data (for example, teacher data that includes as correct answer data only images meeting a predetermined criterion), the user can change the tendency of the color component adjustment to suit the user's taste.
  • the model parameter replacement unit 1329 may replace, for example, the second model parameter 1028 with a model parameter different from the second model parameter 1028.
  • the image processing device 100 can realize a color component adjusting process that suits the user's taste. Therefore, the image processing device 100 can further enhance the versatility of the color component adjusting process.
  • A sixth embodiment (Embodiment 6) will be described with reference to FIGS. 1, 14, and 15.
  • the same or equivalent elements are designated by the same reference numerals, and duplicate description will be omitted.
  • the control unit 110 provided in the image processing device 100 may include, for example, the color difference information calculation unit 1404 instead of the color difference information calculation unit 104, and the color component adjustment unit 1505 instead of the color component adjustment unit 105.
  • FIG. 14 is a block diagram showing an example of the configuration of the main part of the color difference information calculation unit 1404.
  • the color difference information calculation unit 1404 may include a saturation difference calculation unit 1425 in place of the color difference calculation unit 323 and a saturation difference normalization unit 1426 in place of the color difference normalization unit 324.
  • the saturation difference calculation unit (color difference information calculation unit) 1425 may calculate, for example, a plurality of saturation difference values each indicating the difference between a first saturation value calculated for each of the plurality of pixels included in the input image 1b and a second saturation value calculated for each of the plurality of pixels included in the reference image 1c.
  • the plurality of saturation difference values may take either a positive value or a negative value.
  • the saturation difference normalization unit (color difference information calculation unit) 1426 may, for example, normalize the plurality of saturation difference values, include the normalized plurality of saturation difference values in the color difference information 1d, and output them.
  • the saturation difference normalization unit 1426 may normalize the saturation difference values based on values obtained by multiplying each saturation difference value by an arbitrary constant.
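A sketch of the signed saturation difference for one pixel, again using the standard `colorsys` HLS saturation; the sign convention (reference minus input) and the clipping to [-1, 1] are assumptions consistent with the increase/decrease rule described below:

```python
import colorsys

def saturation_difference(input_px, reference_px, k=1.0):
    # First saturation value (input image) and second saturation
    # value (reference image); the difference may be positive or
    # negative.
    s_in = colorsys.rgb_to_hls(*input_px)[2]
    s_ref = colorsys.rgb_to_hls(*reference_px)[2]
    # Normalization by an arbitrary constant k, clipped to [-1, 1].
    return max(-1.0, min(1.0, k * (s_ref - s_in)))
```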
  • FIG. 15 is a block diagram showing an example of the configuration of the main part of the color component adjusting unit 1505. As illustrated in FIG. 15, the color component adjusting unit 1505 may include a saturation adjusting unit 1522 instead of the saturation adjusting unit 422.
  • the saturation adjustment unit (color component adjustment unit) 1522 may generate an output image 1e based on, for example, the converted image. Specifically, the saturation adjustment unit 1522 may, for example, generate a third image in which the saturation component of each pixel of the second image (for example, the input image 1b converted into the HSL color space) input from the third color space conversion unit 421 is adjusted according to the color difference information 1d input from the color difference information calculation unit 1404.
  • the saturation adjustment unit 1522 may adjust the saturation of each pixel of the second image output by the third color space conversion unit 421 according to the saturation difference value included in the color difference information 1d. Specifically, the saturation adjustment unit 1522 may change the saturation of the plurality of pixels included in the second image according to the sign of the saturation difference value corresponding to each pixel (for example, increasing the saturation if the saturation difference value is positive, and decreasing it if the saturation difference value is negative).
  • the color component adjusting unit 1505 adjusts the saturation by increasing or decreasing the saturation of the plurality of pixels included in the input image 1b, respectively, and generates the third image.
  • the image processing apparatus 100 can realize a wide range of color component adjustment processing, so that the versatility of the color component adjustment processing can be further enhanced.
  • A seventh embodiment (Embodiment 7) will be described with reference to FIGS. 1, 2, 4, 16, and 17.
  • the same or equivalent elements are designated by the same reference numerals, and duplicate description will be omitted.
  • the control unit 110 provided in the image processing device 100 may include, for example, the color difference information calculation unit 1604 instead of the color difference information calculation unit 104, and the color component adjustment unit 1705 instead of the color component adjustment unit 105.
  • FIG. 16 is a block diagram showing an example of the configuration of the main part of the color difference information calculation unit 1604.
  • the color difference information calculation unit 1604 may include a fourth color space conversion unit 1627 in place of the first color space conversion unit 321, a fifth color space conversion unit 1628 in place of the second color space conversion unit 322, a hue difference calculation unit 1629 in place of the color difference calculation unit 323, and a hue difference normalization unit 1630 in place of the color difference normalization unit 324.
  • the fourth color space conversion unit 1627 and the fifth color space conversion unit 1628 may, for example, convert the reference image 1c and the input image 1b into images in a color space having hue as an axis component, respectively.
  • the color space may be, for example, an HSL color space, an HSV color space, or the like.
  • the transformation may be a linear transformation or a non-linear transformation appropriately set according to the purpose.
  • the hue difference calculation unit 1629 has a first hue value calculated for each of a plurality of pixels included in the input image 1b and a second hue calculated for each of the plurality of pixels included in the reference image 1c. A plurality of hue difference values indicating the difference from the values may be calculated. In other words, the hue difference calculation unit 1629 may calculate the hue difference between the reference image 1c and the input image 1b.
  • the hue difference normalization unit 1630 may, for example, normalize the plurality of hue difference values calculated by the hue difference calculation unit 1629, and include the normalized plurality of hue difference values in the color difference information 1d for output. For example, when a hue difference value is outside a predetermined range (for example, when it is larger than 6°), the hue difference normalization unit 1630 may replace that hue difference value with zero and output it.
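The hue comparison has to respect the circular nature of hue. The sketch below wraps the raw difference into [-180°, 180°) and zeroes out values beyond a threshold, following the 6° example above; the wrapping formula is a standard technique, not taken verbatim from the patent:

```python
def hue_difference(h_in, h_ref, limit=6.0):
    # Wrap the raw difference into the interval [-180, 180) degrees.
    d = (h_ref - h_in + 180.0) % 360.0 - 180.0
    # Hue differences outside the predetermined range are replaced
    # with zero, so pixels with distant hues are left unadjusted.
    return d if abs(d) <= limit else 0.0
```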
  • the image processing apparatus 100 can further enhance the versatility of the color component adjusting process.
  • FIG. 17 is a block diagram showing an example of the configuration of the main part of the color component adjusting unit 1705.
  • the color component adjusting unit 1705 may include, for example, a sixth color space conversion unit 1721, a hue adjusting unit 1724, and a color space inverse conversion unit 1723.
  • the sixth color space conversion unit (color component adjustment unit) 1721 may generate a second image by, for example, converting the input image 1b into an image in a color space having a hue as an axis component.
  • the color space may be, for example, an HSL color space, an HSV color space, or the like.
  • the hue adjustment unit (color component adjustment unit) 1724 may, for example, change the hues of the plurality of pixels included in the input image 1b according to the plurality of hue difference values input from the color difference information calculation unit 1604. Specifically, when the sixth color space conversion unit 1721 converts the input image 1b, the hue adjustment unit 1724 may generate the output image 1e based on the converted image. At this time, the hue adjusting unit 1724 may, for example, generate an image in which the hue components of the plurality of pixels included in the converted image are adjusted according to the color difference information 1d.
  • the color space inverse conversion unit 1723 may, for example, inversely convert the image generated by the hue adjustment unit 1724 into the color space of the input image 1b.
  • the calculated hue difference for the pixels corresponding to the universally distributed object is smaller than the hue difference of other objects (objects that are not universally distributed, objects with large color variation, etc.).
  • the color component adjusting unit 1705 therefore makes the color component adjustment amount small for the former and large for the latter.
  • the image processing device 100 can realize more versatile color component adjustment than the conventional image processing device that can adjust the color only for the object assumed in advance in the image.
  • FIG. 18 is a schematic view schematically showing an image display device 1801 provided with an image processing device according to each embodiment.
  • the image display device 1801 may be any device as long as it can display an arbitrary image, and may be, for example, a liquid crystal display, an organic EL display, or the like.
  • the image display device 1801 displays the output image 1e output by the image processing device on the display surface 1802, and presents the output image 1e to the user.
  • the control blocks provided in each image processing device may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a controller (processor) such as a CPU (Central Processing Unit). In the latter case, each image processing device includes a CPU that executes the instructions of a control program, which is software realizing each function, a ROM (Read Only Memory) or storage device (these are referred to as a "recording medium") in which the control program and various data are recorded so as to be readable by a computer (or CPU), a RAM (Random Access Memory) into which the control program is loaded, and the like.
  • when the computer (or CPU) reads the control program from the recording medium and executes it, an object according to each aspect of the present disclosure is achieved.
  • as the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • the control program may be supplied to the computer via an arbitrary transmission medium (communication network, broadcast wave, etc.) capable of transmitting the control program.
  • Each aspect of the present disclosure can also be realized in the form of a data signal embedded in a carrier wave, in which the control program is embodied by electronic transmission.
  • control program can be implemented in any programming language.
  • control program can be implemented using a script language such as ActionScript or JavaScript (registered trademark), an object-oriented programming language such as Objective-C or Java (registered trademark), or a markup language such as HTML5.
  • an information processing terminal (for example, a smartphone or a personal computer) including units that realize some of the functions realized by the control program, and a server including units that realize the remaining functions, also fall within the scope of this disclosure.

Abstract

Provided are an image processing device and the like for performing color component adjustment processing with greater versatility. The image processing device is provided with: a reference image generation unit that uses an input image as a basis to generate a reference image in which the color components of at least some of the pixels of the input image are changed; a color difference information calculation unit that calculates color difference information for each pixel between the input image and the reference image; and a color component adjustment unit that uses the color difference information as a basis to generate an output image in which the color components of each of the pixels of the input image are adjusted.

Description

Image processing device, image display device, image processing method, and program
 The present disclosure relates to an image processing device, an image display device, an image processing method, and a program. The present application claims priority based on Japanese Patent Application No. 2019-133404 filed in Japan on July 19, 2019, the contents of which are incorporated herein by reference.
 Patent Document 1 below discloses an image processing method capable of improving image quality by restoring the brightness and color even for an input face image whose brightness and color have deteriorated due to shooting in low illuminance or the like.
Japanese Unexamined Patent Publication No. 2005-316743 (published on November 10, 2005)
 In the process described in Patent Document 1, color adjustment can be performed only for an object assumed in advance, such as a face area in an image, and the versatility is low. Therefore, one aspect of the present disclosure aims, for example, to provide an image processing device or the like that performs more versatile color component adjustment processing.
 The image processing device according to one aspect of the present disclosure includes, for example: a reference image generation unit that generates, based on an input image, a reference image in which the color components of at least some of the plurality of pixels included in the input image are changed; a color difference information calculation unit that calculates color difference information based on the color differences between the plurality of pixels included in the input image and the plurality of pixels included in the reference image; and a color component adjustment unit that generates, using the color difference information, an output image in which the color components of the plurality of pixels included in the input image are adjusted.
 The control method of an image processing device according to one aspect of the present disclosure includes, for example: a reference image generation step of generating, based on an input image, a reference image in which the color components of at least some of the plurality of pixels included in the input image are changed; a color difference information calculation step of calculating color difference information based on the color differences between the plurality of pixels included in the input image and the plurality of pixels included in the reference image; and a color component adjustment step of generating, using the color difference information, an output image in which the color components of the plurality of pixels included in the input image are adjusted.
 The control program according to one aspect of the present disclosure causes an image processing device to realize, for example: a reference image generation function of generating, based on an input image, a reference image in which the color components of at least some of the plurality of pixels included in the input image are changed; a color difference information calculation function of calculating color difference information based on the color differences between the plurality of pixels included in the input image and the plurality of pixels included in the reference image; and a color component adjustment function of generating, using the color difference information, an output image in which the color components of the plurality of pixels included in the input image are adjusted.
FIG. 1 is a block diagram showing an example of the configuration of the main part of the image processing device according to Embodiment 1.
FIG. 2 is a block diagram showing an example of the configuration of the main part of the reference image generation unit provided in the image processing device.
FIG. 3 is a block diagram showing an example of the configuration of the main part of the color difference information calculation unit provided in the image processing device.
FIG. 4 is a block diagram showing an example of the configuration of the main part of the color component adjusting unit provided in the image processing device.
FIG. 5 is a flowchart showing an example of processing executed by the image processing device.
FIG. 6 is a flowchart showing an example of processing executed by the reference image generation unit provided in the image processing device.
FIG. 7 is a flowchart showing an example of processing executed by the color difference information calculation unit provided in the image processing device.
FIG. 8 is a flowchart showing an example of processing executed by the color component adjusting unit provided in the image processing device.
FIG. 9 is a block diagram showing an example of the configuration of the main part of the reference image generation unit provided in the image processing device according to Embodiment 2.
FIG. 10 is a block diagram showing an example of the configuration of the main part of the reference image generation unit provided in the image processing device according to Embodiment 3.
FIG. 11 is a block diagram showing an example of the configuration of the main part of the image processing device according to Embodiment 4.
FIG. 12 is a block diagram showing an example of the configuration of the main part of the color component adjusting unit provided in the image processing device.
FIG. 13 is a block diagram showing an example of the configuration of the main part of the reference image generation unit provided in the image processing device according to Embodiment 5.
FIG. 14 is a block diagram showing an example of the configuration of the main part of the color difference information calculation unit provided in the image processing device according to Embodiment 6.
FIG. 15 is a block diagram showing an example of the configuration of the main part of the color component adjusting unit provided in the image processing device.
FIG. 16 is a block diagram showing an example of the configuration of the main part of the color difference information calculation unit provided in the image processing device according to Embodiment 7.
FIG. 17 is a block diagram showing an example of the configuration of the main part of the color component adjusting unit provided in the image processing device.
FIG. 18 is a schematic diagram schematically showing an image display device provided with the image processing device according to each embodiment.
<Overview of the image processing device>
In order to display an image attractively on an image display device (for example, a liquid crystal television or a smartphone), it is preferable to adjust the colors of the image. However, if the saturation of every pixel included in the image is uniformly enhanced, an image that may give the viewer an unnatural impression can result. This is because how the impression changes with saturation enhancement depends on how frequently the viewer encounters the objects shown in the image.
Therefore, an image processing device according to one aspect of the present disclosure may, for example, generate, based on an input image, a reference image in which the color components of at least some of the plurality of pixels included in the input image are changed. Next, the image processing device may, for example, calculate color difference information based on the color differences between the plurality of pixels included in the input image and the plurality of pixels included in the reference image. Then, the image processing device may, for example, use the color difference information to generate an output image in which the color components of the plurality of pixels included in the input image are adjusted based on the color difference information.
Accordingly, the above image processing device can realize color component adjustment that is more versatile than that of a conventional image processing device.
<Embodiment 1>
The first embodiment (Embodiment 1) will be described with reference to FIGS. 1 to 8. In each drawing, identical or equivalent elements are denoted by the same reference numerals, and duplicate descriptions are omitted.
(Overall configuration of the image processing device 100)
FIG. 1 is a block diagram showing an example of the main configuration of the image processing device 100. As illustrated in FIG. 1, the image processing device 100 is a device that processes an input image 1a to generate an output image, and may be, for example, a television, a personal computer, a smartphone, a tablet terminal, or a recording device (for example, a hard disk recorder). The image processing device 100 includes, for example, a control unit 110.
The control unit 110 has a function of comprehensively controlling the various functions of the image processing device 100, and may include, for example, an image conversion unit 101, an image storage unit 102, a reference image generation unit 103, a color difference information calculation unit 104, and a color component adjustment unit 105.
For example, when a first color gamut in which the input image 1a is expressed differs from a second color gamut used in the processing of the image processing device 100, the image conversion unit 101 may convert the input image 1a so that it can be expressed in the second color gamut, and output the converted input image 1b to the image storage unit 102.
The input image 1a and the input image 1b may each be a still image or a moving image. Details of the processing executed by the image conversion unit 101 will be described later.
The image storage unit 102 may, for example, store the input image 1b input from the image conversion unit 101. The image storage unit 102 can output the input image 1b to the reference image generation unit 103, the color difference information calculation unit 104, and the color component adjustment unit 105 in accordance with the timing of their respective processes. As a result, the image processing device 100 can process the input image 1b at appropriate timings.
The reference image generation unit 103 may, for example, generate, based on the input image 1b input from the image storage unit 102, a reference image 1c in which the color components of at least some of the plurality of pixels included in the input image 1b are changed, and output the generated reference image 1c to the color difference information calculation unit 104.
As will be described later, the reference image generation unit 103 may generate a color-reduced image 2a (see FIG. 2) in which the color components of the plurality of pixels included in the input image 1b are reduced, and then generate, as the reference image 1c, a colored image in which color components are added to the plurality of pixels included in the color-reduced image 2a. Details of this processing executed by the reference image generation unit 103 will be described later.
As described above, the image processing device 100 may, for example, generate the color-reduced image 2a by reducing the color components of the input image 1b, and then output, as the reference image 1c, a colored image in which color components are added to the color-reduced image 2a. As a result, the image processing device 100 can generate the reference image 1c in which the color components of at least some of the plurality of pixels included in the input image 1b are changed.
In the following, the color components of the plurality of pixels included in the input image 1b may be, for example, the components of each pixel value other than the component corresponding to luminance. For example, in the YCbCr color space, the Cb and Cr components are the "color components"; in the HSL color space, the H and S components are the "color components"; and in the CIELAB color space, the a* and b* components are the "color components". However, the definition of the color components is not limited to these examples.
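As a concrete illustration of the YCbCr case above, the split of a pixel value into a luminance component and color components can be sketched as follows. This is a minimal example using the well-known BT.601 analog-form coefficients; it is an illustration only, not part of the disclosed device:

```python
def rgb_to_ycbcr(r, g, b):
    """Convert one RGB pixel (components in 0.0-1.0) to (Y, Cb, Cr).

    Y is the luminance component; Cb and Cr are the "color components"
    in the sense used above (BT.601 analog form).
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.564 * (b - y)  # blue-difference chroma
    cr = 0.713 * (r - y)  # red-difference chroma
    return y, cb, cr

# For an achromatic (gray) pixel the color components vanish:
y, cb, cr = rgb_to_ycbcr(0.5, 0.5, 0.5)
```

For a gray input, Cb and Cr are both zero, so only the luminance component carries information; a saturated color yields nonzero Cb/Cr.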
The color difference information calculation unit 104 may, for example, calculate color difference information 1d based on the color differences between the plurality of pixels included in the input image 1b input from the image storage unit 102 and the plurality of pixels included in the reference image 1c input from the reference image generation unit 103. The color difference information calculation unit 104 may output the calculated color difference information 1d to the color component adjustment unit 105. Details of the processing executed by the color difference information calculation unit 104 will be described later.
The color component adjustment unit 105 may, for example, use the color difference information 1d input from the color difference information calculation unit 104 to generate an output image 1e in which the color components of the plurality of pixels included in the input image 1b are adjusted, and output the output image 1e. For example, the color component adjustment unit 105 may increase the saturation of each of the plurality of pixels included in the input image 1b as the corresponding distance (color difference value) increases. Details of the processing executed by the color component adjustment unit 105 will be described later.
(Configuration of the reference image generation unit 103)
FIG. 2 is a block diagram showing an example of the main configuration of the reference image generation unit 103 included in the image processing device 100. As illustrated in FIG. 2, the reference image generation unit 103 may include a color reduction unit 221 and a coloring unit 222. Further, the coloring unit 222 may include, for example, a first deep learning device 223 and a storage unit 224.
The color reduction unit (reference image generation unit) 221 may, for example, generate a color-reduced image 2a in which the color components of the plurality of pixels included in the input image 1b input from the image storage unit 102 are reduced. In the first embodiment, an example will be described in which the color reduction unit 221 generates the color-reduced image 2a by setting the color components of the plurality of pixels included in the input image 1b to zero. That is, in the first embodiment, the color-reduced image 2a is a monochrome image.
The color reduction unit 221 may, for example, generate the color-reduced image 2a by applying an arbitrary linear or nonlinear transformation to the luminance values of the plurality of pixels included in the input image 1b. In this case, since the color reduction unit 221 performs the operation so that only the luminance values are produced and the color components are discarded, the processing is equivalent to setting the color components to zero.
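The zero-the-color-components operation described for the color reduction unit 221 can be sketched as follows. This is a minimal illustration: the BT.601 luma weights are one possible choice of luminance definition, not one mandated by the disclosure:

```python
def decolorize(rgb_pixels):
    """Set the color components of every pixel to zero by keeping only
    luminance (a sketch of the color reduction unit 221).

    rgb_pixels: list of (r, g, b) tuples with components in 0.0-1.0.
    Returns a monochrome image: each pixel becomes (y, y, y).
    """
    mono = []
    for r, g, b in rgb_pixels:
        # BT.601 luma weights; any luminance definition could be used here.
        y = 0.299 * r + 0.587 * g + 0.114 * b
        mono.append((y, y, y))
    return mono

# A saturated red pixel and a gray pixel both become achromatic (r == g == b):
out = decolorize([(1.0, 0.0, 0.0), (0.5, 0.5, 0.5)])
# out[0] == (0.299, 0.299, 0.299)
```

Feeding the result back to a colorizer then yields the reference image 1c described above.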
The coloring unit (reference image generation unit) 222 may, for example, generate, as the reference image 1c, a colored image in which color components are added to the plurality of pixels included in the color-reduced image 2a. In the first embodiment, an example will be described in which the coloring unit 222 adds color components to the color-reduced image 2a using the first deep learning device (first trained model) 223 defined by a first model parameter 225. That is, the coloring unit 222 adds color components to the monochrome image by means of the first deep learning device 223.
The first deep learning device 223 may be, for example, a neural network (for example, a multilayer perceptron, including deep models). Further, the first model parameter 225 may be, for example, data including the connection weights of the neural network constituting the first deep learning device 223.
The first model parameter 225 is optimized, for example, as follows. First, the image processing device 100 collects, from an arbitrary information source (for example, the Internet) and by an appropriate method (for example, randomly), original color images showing various subjects. Next, the image processing device 100 generates a color-reduced image 2a from each original color image by the processing executed by the color reduction unit 221. As a result, pairs in which the color-reduced image 2a is the input data and the original color image is the correct answer are obtained as teacher data. Then, the image processing device 100 optimizes the first model parameter 225 by training the first deep learning device 223 with this teacher data.
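The construction of the teacher data described above — decolorized image as network input, the original color image as the correct answer — can be sketched as follows. Here `decolorize` is a stand-in for the processing of the color reduction unit 221 and the single-pixel "images" are purely illustrative:

```python
def build_teacher_data(color_images, decolorize):
    """Build (input, correct-answer) pairs for training the colorizer:
    each original color image is decolorized to form the network input,
    and the original itself is the target.
    """
    return [(decolorize(img), img) for img in color_images]

# Toy example with a one-pixel "image" and a dummy decolorizer:
pairs = build_teacher_data(
    [[(1.0, 0.0, 0.0)]],
    lambda img: [(0.299, 0.299, 0.299) for _ in img],
)
# pairs[0][0] is the monochrome input, pairs[0][1] the color target.
```

The trained weights resulting from such pairs would constitute the first model parameter 225.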
Here, when the above information source is the Internet, the teacher data used to train the first deep learning device 223 can be created from the narrow-color-gamut images that already exist in the world in large numbers. For this reason, the image processing device 100 can easily prepare the teacher data. Since being able to prepare a large amount of teacher data means that the learning accuracy of the deep learning device can be raised, the image processing device 100 can perform color component adjustment processing with higher accuracy.
In the process of generating the color-reduced images 2a for the teacher data, the image processing device 100 may, for example, convert the color gamut of the original color images and generate the color-reduced images 2a from the converted images. It is also preferable that the teacher data be diverse: if there is a sufficient amount of teacher data randomly sampled from a population (for example, all the images that exist in the world), it can be assumed that the set of teacher data reflects the properties of that population.
The accuracy of the colors that a human remembers for an object depends on the person's individual frequency of contact with that object. Accordingly, if the above diversity is adequately ensured, the colors of universally distributed objects are reproduced with high accuracy in the reference image 1c. In other words, the image processing device 100 may, for example, hardly enhance the colors of such objects appearing in the input image 1b.
On the other hand, the colors of objects that are not universally distributed (for example, objects that are unevenly distributed depending on the individual environment) need not be reproduced with high accuracy in the reference image 1c. Likewise, the colors of objects with large color variation (for example, objects whose colors change with the season and weather, such as mountains and the sky) need not be reproduced with high accuracy in the reference image 1c. In other words, the image processing device 100 may, for example, strongly enhance the colors of such objects appearing in the input image 1b, and generate the adjusted image as the output image 1e.
Humans feel a sense of strangeness when the colors of familiar (universally distributed) objects are enhanced, but are unlikely to feel such strangeness even when the colors of unfamiliar objects or objects with large color variation are enhanced. Exploiting this human property, the image processing device 100 distinguishes universally distributed objects from objects that are not universally distributed or that have large color variation, and adjusts the input image 1a so that only the colors of the latter are enhanced.
As a result, the image processing device 100 can realize color component adjustment that is more versatile than that of a conventional image processing device, which can adjust colors only for objects assumed in advance.
As illustrated in FIG. 2, the first model parameter 225 is stored in the storage unit 224. Further, the first model parameter 225 may be generated, for example, by training the first deep learning device 223 in advance outside the image processing device 100 (in a device other than the image processing device 100) before being incorporated into the image processing device 100. The storage unit 224 need only be a storage device capable of reading and writing arbitrary data; its configuration and form are not limited. The first model parameter 225 may be loaded from the storage unit 224 into the first deep learning device 223, for example, when the first deep learning device 223 operates.
(Configuration of the color difference information calculation unit 104)
FIG. 3 is a block diagram showing the main configuration of the color difference information calculation unit 104 included in the image processing device 100. As illustrated in FIG. 3, the color difference information calculation unit 104 may include a first color space conversion unit 321, a second color space conversion unit 322, a color difference calculation unit 323, and a color difference normalization unit 324.
In the first embodiment, an example will be described in which the color difference information calculation unit 104 calculates, in a predetermined color space, a plurality of color difference values, each of which is the distance between the pixel value of a pixel of the input image 1b and the pixel value of the pixel of the reference image 1c at the corresponding position, and outputs the plurality of color difference values as part of the color difference information 1d.
The first color space conversion unit 321 may, for example, convert the pixel value of each pixel of the reference image 1c into a pixel value in a predetermined color space. Similarly, the second color space conversion unit 322 may, for example, convert the pixel value of each pixel of the input image 1b into a pixel value in a predetermined color space. Here, the predetermined color space in the first color space conversion unit 321 and the predetermined color space in the second color space conversion unit 322 may be the same color space or different color spaces. Further, each conversion may be a linear or nonlinear transformation set appropriately according to the purpose.
The predetermined color space may be, for example, a uniform color space. In the first embodiment, the predetermined color space is assumed to be the CIELAB color space, which is one of the uniform color spaces. However, the predetermined color space may be the CIELUV color space, or a color space with further improved uniformity such as CIE2000. The predetermined color space may also be a color space other than a uniform color space (for example, the YCbCr color space).
The color difference calculation unit (color difference information calculation unit) 323 may, for example, calculate, in the predetermined color space, the distance between the pixel value of each of the plurality of pixels included in the input image 1b and the pixel value of the corresponding pixel included in the reference image 1c, and include these distances as part of the color difference information 1d. The color difference calculation unit 323 may, for example, use each such distance as the color difference value of the pixel.
Here, the above distance may be any distance, for example, the Euclidean distance, the Manhattan distance, or the squared distance. Further, since the color difference calculation unit 323 can calculate a plurality of color difference values, it may, for example, calculate color difference values for all the pixels of the input image 1b and the reference image 1c. Details of the processing executed by the color difference calculation unit 323 will be described later.
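The per-pixel distance computation described above can be sketched as follows. The Euclidean distance is chosen here for illustration; as the text notes, the Manhattan or squared distance would fit equally well:

```python
import math

def color_difference(lab_input, lab_reference):
    """Per-pixel color difference between two images that have already
    been converted into a (roughly) uniform color space such as CIELAB.

    Each argument is a list of (L, a, b) tuples in corresponding pixel
    order; the Euclidean distance in that space is returned per pixel.
    """
    return [
        math.dist(p, q)  # Euclidean distance in the color space
        for p, q in zip(lab_input, lab_reference)
    ]

# Identical pixels give zero difference; a chroma shift of (3, 4) in
# the (a, b) plane gives a distance of 5:
diffs = color_difference([(50, 0, 0), (50, 10, 0)],
                         [(50, 0, 0), (50, 13, 4)])
# diffs == [0.0, 5.0]
```

These distances are exactly the color difference values that the later stages use to decide how strongly each pixel's saturation is adjusted.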
The color difference normalization unit 324 may, for example, normalize the plurality of color difference values obtained by the color difference calculation unit 323, and output the normalized color difference values as part of the color difference information 1d. For example, the color difference normalization unit 324 may perform an operation that sets to zero any color difference value that is less than or equal to a predetermined value and subtracts the predetermined value from any color difference value that is greater than the predetermined value, and output the result as part of the color difference information 1d.
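The normalization operation just described amounts to soft-thresholding and can be sketched as follows; the threshold value itself is an assumed tuning parameter, not one fixed by the disclosure:

```python
def normalize_color_differences(diffs, threshold):
    """Normalization step of unit 324: differences at or below the
    threshold become zero, larger ones are shifted down by the
    threshold (soft-thresholding).
    """
    return [0.0 if d <= threshold else d - threshold for d in diffs]

norm = normalize_color_differences([0.5, 2.0, 7.5], threshold=2.0)
# norm == [0.0, 0.0, 5.5]
```

Small differences, which tend to correspond to accurately reproduced (familiar) objects, are thereby ignored, while large differences survive in reduced form.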
(Effect of the color difference values of the color difference information calculation unit 104)
As described above, in the reference image 1c generated by the reference image generation unit 103, the colors of universally distributed objects are reproduced with high accuracy. Therefore, the color difference values that the color difference information calculation unit 104 generates for the parts showing universally distributed objects are relatively small. Further, in the reference image 1c generated by the reference image generation unit 103, the colors of objects that are not universally distributed and of objects with large color variation need not be reproduced with high accuracy. Therefore, the color difference values that the color difference information calculation unit 104 generates for the parts showing such objects are relatively large.
Therefore, the image processing device 100 has the effect of being able to distinguish, by the magnitude of the color difference value, the parts of the input image 1b showing universally distributed objects from the parts showing objects that are not universally distributed or that have large color variation.
(Configuration of the color component adjustment unit 105)
FIG. 4 is a block diagram showing the configuration of the color component adjustment unit 105. As illustrated in FIG. 4, the color component adjustment unit 105 may include a third color space conversion unit 421, a saturation adjustment unit 422, and a color space inverse conversion unit 423.
The third color space conversion unit (color component adjustment unit) 421 may, for example, generate a second image by converting the input image 1b into an image in a color space that has saturation as an axis component. Here, the color space may be, for example, the HSL color space or the HSV color space.
The saturation adjustment unit (color component adjustment unit) 422 may, for example, generate the output image 1e based on the converted image. Specifically, the saturation adjustment unit 422 may, for example, generate a third image in which, for each pixel of the second image input from the third color space conversion unit 421, the saturation component is adjusted according to the color difference information 1d input from the color difference information calculation unit 104.
The saturation adjustment unit 422 may, for example, adjust the saturation of each pixel of the second image according to the magnitude of the color difference value included in the color difference information 1d. Specifically, the saturation adjustment unit 422 may adjust the saturation so that the adjusted saturation falls within the range 0.0 to 1.0. Alternatively, the saturation adjustment unit 422 may, for example, enhance the saturation more strongly for pixels with larger color difference values. Then, the saturation adjustment unit 422 may, for example, output an image in which the HSL pixel value of each pixel is (H, S2, L) as the above third image.
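One way the saturation adjustment of unit 422 could look is sketched below. The multiplicative boost formula and the `gain` parameter are assumptions for illustration; the disclosure only requires that larger color difference values produce stronger enhancement and that the result stays within 0.0 to 1.0:

```python
def adjust_saturation(hsl_pixels, diffs, gain=0.05):
    """Boost each pixel's saturation S in proportion to its
    (normalized) color difference value, clamped to 0.0-1.0.

    hsl_pixels: list of (H, S, L) tuples; diffs: per-pixel color
    difference values in corresponding order.
    """
    out = []
    for (h, s, l), d in zip(hsl_pixels, diffs):
        s2 = min(1.0, max(0.0, s * (1.0 + gain * d)))
        out.append((h, s2, l))  # (H, S2, L) as in the text above
    return out

pixels = adjust_saturation([(0.0, 0.5, 0.5), (0.3, 0.5, 0.5)],
                           diffs=[0.0, 10.0])
# A zero color difference leaves saturation unchanged (0.5); the
# larger difference boosts it (here to 0.75), never beyond 1.0.
```

Pixels of familiar objects (small differences) are thus left essentially untouched, while unfamiliar or high-variation objects are enhanced.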
The color space inverse conversion unit (color component adjustment unit) 423 may, for example, generate the output image 1e by converting the third image input from the saturation adjustment unit 422 into an image in the color space of the input image 1b, and output the output image 1e.
The color space inverse conversion unit 423 may, for example, generate the output image 1e by converting the third image, in which the HSL pixel value of each pixel is (H, S2, L), into an image in the RGB color space, which is the color space of the input image 1b. That is, whereas the third color space conversion unit 421 converts the image from the RGB color space to the HSL color space, the color space inverse conversion unit 423 may execute the inverse conversion.
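The forward and inverse conversions of units 421 and 423 can be demonstrated with the Python standard-library `colorsys` module, which implements RGB↔HSL conversion (note that `colorsys` orders the tuple as H, L, S). This is only a sketch of the conversion pair, not the device's implementation:

```python
import colorsys

# Forward conversion (unit 421): RGB -> H, L, S.
h, l, s = colorsys.rgb_to_hls(0.6, 0.3, 0.3)

# ... unit 422 would adjust `s` here according to the color difference ...

# Inverse conversion (unit 423): H, L, S -> RGB. With the saturation
# left untouched, the original pixel is recovered (up to rounding):
r, g, b = colorsys.hls_to_rgb(h, l, s)
# (r, g, b) ≈ (0.6, 0.3, 0.3)
```

Because HSL exposes saturation as an explicit axis, the inverse conversion is trivial, which is exactly the simplification the next paragraph points out.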
Here, when the third color space conversion unit 421 converts the input image 1b into a relative color space having saturation as an axis component before the color components are adjusted, the color space inverse conversion unit 423 can easily perform the inverse conversion. Therefore, the image processing device 100 can simplify the adjustment and inverse conversion processing described above.
As described above, the color component adjustment unit 105 may, for example, convert the input image 1b into the second image in a color space having saturation as an axis component, generate the third image in which the saturation component of each pixel of the second image is adjusted according to the color difference information 1d, and inversely convert the third image into the color space of the input image 1b to generate the output image 1e. That is, as the color component adjustment processing for each pixel of the input image 1b, the color component adjustment unit 105 may adjust the saturation of each pixel of the input image 1b.
(Color gamut of images and its conversion)
The color gamut targeted by the color component adjustment processing is preferably sufficiently wide, because the wider the color gamut, the greater the degree of freedom in the color component adjustment processing. That is, the image processing device 100 may, for example, convert the input image 1b into an image with a wide color gamut in the initial stage of the series of processes. As a result, the image processing device 100 can prevent the pixel values of the pixels included in the image from falling outside the color gamut in the processing that follows the conversion. For example, the color gamut targeted by each process included in the image processing device 100 may be the color gamut defined in ITU-R Recommendation BT.2020 (hereinafter referred to as Rec.2020).
For example, consider a case in which the input image 1b is an image in an RGB color space conforming to the color gamut defined in ITU-R Recommendation BT.709 (hereinafter referred to as Rec.709), and each process included in the image processing device 100 operates in an RGB color space conforming to Rec.2020. In this case, the image conversion unit 101 converts the image into an image in the XYZ color space, and then converts the converted image into an image in the RGB color space conforming to Rec.2020.
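The first half of that conversion (linear Rec.709 RGB to XYZ) can be sketched as follows, using the commonly published Rec.709/sRGB-to-XYZ matrix for the D65 white point. The second half (XYZ to Rec.2020 RGB) would apply the inverse of the analogous Rec.2020 matrix and is omitted here; the sketch assumes linear (non-gamma-encoded) values:

```python
# Linear Rec.709 RGB -> CIE XYZ matrix (D65 white point).
M_709_TO_XYZ = [
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
]

def rgb709_to_xyz(rgb):
    """Matrix-multiply one linear Rec.709 RGB pixel into XYZ."""
    return tuple(sum(m * c for m, c in zip(row, rgb))
                 for row in M_709_TO_XYZ)

# Reference white (1, 1, 1) must map to the D65 white point:
x, y, z = rgb709_to_xyz((1.0, 1.0, 1.0))
# (x, y, z) ≈ (0.9505, 1.0000, 1.0888)
```

Chaining this with the XYZ-to-Rec.2020 matrix yields the gamut conversion the image conversion unit 101 performs.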
For example, even when the input image 1b is an image in a color space other than an RGB color space (for example, the YCbCr color space) and is to be converted into an image in an RGB color space, the image conversion unit 101 can convert the image by executing similar processing.
Note that when the color gamut of the input image 1a is the same as the color gamut targeted by each process included in the image processing device 100, the image conversion unit 101 need not execute the color gamut conversion process. In that case, the image conversion unit 101 may send the input image 1a to the image storage unit 102 without converting it. Further, when the conversion process is unnecessary, for example, because the color gamut of the input image 1a is fixed in advance to one type, the image processing device 100 need not include the image conversion unit 101.
When the color gamut of the input image 1a is wider than the color gamut targeted by each process included in the image processing device 100, a process of fitting the wide-gamut image into the narrower color gamut is required. Various well-known conventional techniques can be applied to this process. In the following, unless otherwise specified, an image that has undergone the conversion process by the image conversion unit 101 is assumed to be an image in the RGB color space compliant with Rec. 2020.
(Effects related to the saturation adjustment by the color component adjusting unit 105)
As described above, the color difference value generated by the color difference information calculation unit 104 is relatively small in portions corresponding to universally distributed objects. Therefore, the color component adjusting unit 105 hardly changes the saturation of pixels corresponding to objects that would look unnatural to humans if their color components changed. On the other hand, the color difference value generated by the color difference information calculation unit 104 is relatively large in portions corresponding to objects that are not universally distributed and objects with large color variation. Therefore, the color component adjusting unit 105 greatly changes the saturation of pixels corresponding to objects for which such a change causes little discomfort. That is, for each pixel of the input image 1b, the color component adjusting unit 105 increases the saturation of the pixel as the color difference value for that pixel increases.
As a result, the image processing device 100 can execute the color component adjustment process on the input image 1a without giving viewers a sense of unnaturalness for any object in the image.
(Processing executed by the image processing device 100)
FIG. 5 is a flowchart showing an example of processing executed by the image processing device 100. As illustrated in FIG. 5, the image conversion unit 101 converts, if necessary, the input image 1a into an image in the color gamut used in the processing of the image processing device 100 (S501).
The reference image generation unit 103 generates, based on the input image 1b, a reference image 1c in which the color components of at least some of the pixels included in the input image 1b are changed (S502). The color difference information calculation unit 104 calculates color difference information 1d based on the color differences between the plurality of pixels included in the input image 1b and the corresponding plurality of pixels included in the reference image 1c (S503). The color component adjusting unit 105 generates, using the color difference information 1d, an output image 1e in which the color components of the plurality of pixels included in the input image 1b are adjusted (S504), and outputs the output image 1e (S505).
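The data flow of steps S502–S505 can be sketched as follows. Note that the patent generates the reference image 1c with a trained colorizer; here a plain grayscale image stands in for it purely to illustrate the pipeline shape, and the luma weights and `gain` parameter are illustrative assumptions.

```python
# Data-flow sketch of steps S502-S505. A grayscale image stands in for
# the reference image 1c (the patent uses a learned colorizer instead).
# Pixels are (r, g, b) tuples with channel values in [0.0, 1.0].

def make_reference(image):                     # S502 (stand-in)
    ref = []
    for r, g, b in image:
        y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # Rec.709 luma weights
        ref.append((y, y, y))
    return ref

def color_difference(image, reference):        # S503
    diffs = []
    for (r, g, b), (rr, rg, rb) in zip(image, reference):
        diffs.append(((r - rr) ** 2 + (g - rg) ** 2 + (b - rb) ** 2) ** 0.5)
    peak = max(diffs) or 1.0
    return [d / peak for d in diffs]           # normalized to [0, 1]

def adjust_colors(image, diffs, gain=0.5):     # S504: boost chroma by diff
    out = []
    for (r, g, b), d in zip(image, diffs):
        y = 0.2126 * r + 0.7152 * g + 0.0722 * b
        k = 1.0 + gain * d                     # larger diff -> larger boost
        out.append(tuple(min(1.0, max(0.0, y + k * (c - y)))
                         for c in (r, g, b)))
    return out

def process(image):                            # S502 -> S505
    ref = make_reference(image)
    diffs = color_difference(image, ref)
    return adjust_colors(image, diffs)
```

A gray pixel is identical to its reference, so its color difference is zero and it passes through unchanged; a strongly colored pixel differs from its gray reference and has its chroma boosted.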
(Process executed by the reference image generation unit 103)
FIG. 6 is a flowchart showing an example of processing executed by the reference image generation unit 103 included in the image processing device 100. FIG. 6 is a flowchart illustrating an example of step S502 included in the flowchart illustrated in FIG. 5 in more detail.
As illustrated in FIG. 6, the color reduction unit 221 generates a color-reduced image 2a in which the color components of the plurality of pixels included in the input image 1b are reduced (S601). Then, the coloring unit 222 generates, as the reference image 1c, a colored image in which color components are added to the plurality of pixels included in the color-reduced image 2a (S602).
(Process executed by the color difference information calculation unit 104)
FIG. 7 is a flowchart showing an example of the processing executed by the color difference information calculation unit 104 provided in the image processing apparatus 100. FIG. 7 is a flowchart illustrating an example of step S503 included in the flowchart illustrated in FIG. 5 in more detail.
As illustrated in FIG. 7, the first color space conversion unit 321 converts the pixel value of each pixel of the reference image 1c into a pixel value in a predetermined color space (S701). The second color space conversion unit 322 converts the pixel value of each pixel of the input image 1b into a pixel value in the predetermined color space (S702).
The color difference calculation unit 323 calculates, for each pixel position shared by the input image 1b and the reference image 1c, the distance between the two pixel values in the predetermined color space, and takes the results as a plurality of color difference values (S703). That is, the color difference calculation unit 323 calculates, in the predetermined color space, the distances between the pixel values of the plurality of pixels included in the input image 1b and the pixel values of the corresponding plurality of pixels included in the reference image 1c.
The color difference normalization unit 324 includes the plurality of normalized color difference values in the color difference information 1d and outputs the color difference information 1d (S704). At this time, the color difference normalization unit 324 may, for example, normalize the plurality of color difference values obtained by the color difference calculation unit 323. Note that the processes of S701 and S702 may be performed in reverse order or in parallel.
(Process executed by the color component adjusting unit 105)
FIG. 8 is a flowchart showing an example of processing executed by the color component adjusting unit 105 included in the image processing apparatus 100. FIG. 8 is a flowchart illustrating an example of step S504 included in the flowchart illustrated in FIG. 5 in more detail.
As illustrated in FIG. 8, the third color space conversion unit 421 converts the input image 1b into an image in a color space that has saturation as an axis component (S801). The saturation adjustment unit 422 generates an image (third image) in which the saturation components of the plurality of pixels included in the converted image (second image) are adjusted according to the color difference information 1d (S802). The color space inverse conversion unit 423 generates the output image 1e by inversely converting the third image into the color space of the input image 1b (S803).
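A minimal per-pixel sketch of S801–S803, using the standard library's HLS conversion as one example of a color space having saturation as an axis component (the `gain` parameter is an illustrative assumption, not from the patent):

```python
# Sketch of S801-S803 for a single pixel: convert to HLS (a color space
# with a saturation axis), scale the saturation component by the pixel's
# normalized color difference value, then convert back to RGB.
import colorsys

def adjust_pixel_saturation(rgb, diff, gain=1.0):
    """rgb: (r, g, b) in [0, 1]; diff: normalized color difference in [0, 1]."""
    h, l, s = colorsys.rgb_to_hls(*rgb)          # S801
    s = min(1.0, s * (1.0 + gain * diff))        # S802: larger diff, more boost
    return colorsys.hls_to_rgb(h, l, s)          # S803
```

Because only the saturation axis is modified, hue and lightness pass through unchanged, matching the description that only the saturation components are adjusted.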
(Effect of image processing device 100)
As described above, humans feel a sense of unnaturalness when the colors of familiar objects are emphasized, but are less likely to feel it when the colors of unfamiliar objects or objects with color variation are emphasized. Exploiting this property, the image processing device 100 learns to distinguish universally distributed objects from objects that are not universally distributed and objects with color variation, and adjusts the input image 1a so that only the colors of the latter are emphasized.
As a result, the image processing device 100 can realize color component adjustment with higher versatility than conventional image processing devices, which can perform color adjustment only on objects assumed in advance in the image.
<Embodiment 2>
A second embodiment (Embodiment 2) will be described with reference to FIGS. 1, 2, and 9. In each drawing, the same or equivalent elements are designated by the same reference numerals, and duplicate description will be omitted.
(Structure of reference image generation unit 903)
The control unit 110 included in the image processing device 100 may include, for example, the reference image generation unit 903 instead of the reference image generation unit 103.
FIG. 9 is a block diagram showing an example of the configuration of the main part of the reference image generation unit 903. As illustrated in FIG. 9, the reference image generation unit 903 includes a color reduction unit 921 in place of the color reduction unit 221. Further, the storage unit 224 stores a first model parameter 925 in place of the first model parameter 225.
The color reduction unit (reference image generation unit) 921 may generate, for example, a color-reduced image 2a in which the color components of the plurality of pixels included in the input image 1b input from the image storage unit 102 are reduced.
Here, in Embodiment 1, an example was described in which the color reduction unit 221 generates the color-reduced image 2a (a monochrome image) by setting the color components of the plurality of pixels included in the input image 1b to zero. In Embodiment 2, on the other hand, an example is described in which the color reduction unit 921 generates the color-reduced image 2a by leaving the color components of the plurality of pixels included in the input image 1b larger than zero. That is, the color-reduced image 2a generated by the color reduction unit 921 is an image whose colors are reduced in such a way that some color components remain. In other words, the color reduction unit 921 leaves, in the color-reduced image 2a, clues about the color components for the coloring unit 222 to use when coloring.
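This partial color reduction can be sketched as blending each pixel only part of the way toward its gray value, so that color clues remain. The `keep` factor and the luma weights are illustrative assumptions; the patent realizes the reduction with a learned model, not a fixed blend.

```python
# Sketch of the Embodiment 2 color reduction: instead of removing the
# color components entirely (Embodiment 1), blend each pixel partway
# toward its gray value so that color clues survive.

def reduce_color(pixel, keep=0.25):
    """pixel: (r, g, b) in [0, 1]; keep=0 -> monochrome, keep=1 -> unchanged."""
    r, g, b = pixel
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # gray value of the pixel
    return tuple(y + keep * (c - y) for c in (r, g, b))
```

With `keep=0` this reproduces the Embodiment 1 monochrome image; any `keep` in (0, 1) leaves a reduced but nonzero color component, as described above.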
The first model parameter 925 is optimized in the same way as the first model parameter 225, except that it is optimized using the color-reduced image 2a generated by the color reduction unit 921 as the input data. That is, the first model parameter 925 is optimized using, as teacher data, combinations of the color-reduced image 2a in which coloring clues remain and the original color image.
Therefore, the image processing device 100 can cause the first deep learner 223 defined by the first model parameter 925 to color the image correctly. That is, the image processing device 100 can reduce the probability of executing a color component adjustment process that looks unnatural to humans, and can increase the versatility of the color component adjustment process.
<Embodiment 3>
A third embodiment (Embodiment 3) will be described with reference to FIGS. 1, 2, and 10. In each drawing, the same or equivalent elements are designated by the same reference numerals, and duplicate description will be omitted.
(Structure of reference image generation unit 1003)
The control unit 110 included in the image processing device 100 may include, for example, the reference image generation unit 1003 instead of the reference image generation unit 103.
FIG. 10 is a block diagram showing an example of the configuration of the main part of the reference image generation unit 1003. As illustrated in FIG. 10, the reference image generation unit 1003 includes a color reduction unit 1021 in place of the color reduction unit 221.
The color reduction unit (reference image generation unit) 1021 may generate, for example, a color-reduced image 2d in which the color components of the plurality of pixels included in the input image 1b input from the image storage unit 102 are reduced. The color reduction unit 1021 includes a second deep learner (second trained model) 1026 defined by a second model parameter 1028, and a storage unit 1027. The storage unit 1027 stores the second model parameter 1028. In addition, the storage unit 224 stores a first model parameter 1025.
The second deep learner 1026 may be, for example, a neural network (for example, a multilayer perceptron, including deep models). The second model parameter 1028 may be, for example, data including the connection weights of the neural network constituting the second deep learner 1026.
The second model parameter 1028 is optimized, for example, as follows. First, the image processing device 100 collects, from an arbitrary information source (for example, the Internet) by an appropriate method (for example, randomly), original color images (input images 1a) showing various subjects.
Next, the image processing device 100 converts the color gamut of the original color images through the processing executed by the image conversion unit 101. The image processing device 100 then generates color-reduced images 2a by reducing the colors of these original color images through the processing executed by the color reduction unit 221. In Embodiment 3, the color-reduced image 2a may be a monochrome image, as in Embodiment 1.
As a result, combinations in which the converted original color image is the input data and the color-reduced image 2a is the correct-answer data are obtained as teacher data. The image processing device 100 then optimizes the second model parameter 1028 by training the second deep learner 1026 using this teacher data.
The first model parameter 1025 is optimized, for example, as follows. First, the image processing device 100 converts the color gamut of the original color images through the processing executed by the image conversion unit 101. Next, the image processing device 100 generates color-reduced images 2d by reducing the colors of the converted original color images through the processing executed by the second deep learner 1026 defined by the second model parameter 1028.
As a result, combinations in which the color-reduced image 2d is the input data and the converted original color image is the correct-answer data are obtained as teacher data. The image processing device 100 then optimizes the first model parameter 1025 by training the first deep learner 223 using this teacher data.
Here, the image processing device 100 may, for example, advance the above optimization until the second deep learner 1026 can perfectly infer the monochrome image, or may intentionally stop the optimization partway, before such inference becomes possible. The following description assumes the latter case.
When the learning (the optimization of the second model parameter 1028) is stopped partway, the second deep learner 1026 generates images whose colors are reduced such that some of the color components of the input image 1b remain. However, familiar (universally distributed) objects are decolored well even when the learning is shallow, whereas unfamiliar objects and objects with color variation retain their original color components.
Note that an object that the second deep learner 1026 could not fully decolor (that is, in which the original color components remain) is an object to which it is difficult for the first deep learner 223 to add color components. This is because the first model parameter 1025 and the second model parameter 1028 are parameters optimized using teacher data generated from the same original color images.
In other words, the second deep learner 1026 leaves, in the color-reduced image 2d, clues about the color components for coloring, in accordance with the distribution of objects. Therefore, the image processing device 100 can cause the first deep learner 223 defined by the first model parameter 1025 to color the image correctly. That is, the image processing device 100 can reduce the probability of executing a color component adjustment process that looks unnatural to humans, and can further increase the versatility of the color component adjustment process.
<Embodiment 4>
A fourth embodiment (Embodiment 4) will be described with reference to FIGS. 11 and 12. In each drawing, the same or equivalent elements are designated by the same reference numerals, and duplicate description will be omitted.
(Overall configuration of image processing device 1100)
FIG. 11 is a block diagram showing an example of the main configuration of the image processing device 1100. As illustrated in FIG. 11, the image processing device 1100 is a device that processes the input image 1a to generate the output image 1e, and includes, for example, a control unit 1110.
The control unit 1110 has a function of comprehensively controlling the various functions of the image processing device 1100, and includes a color component adjusting unit 1105 in place of the color component adjusting unit 105 (see FIG. 1) included in the control unit 110 (see FIG. 1). The control unit 1110 may further include, for example, a user adjustment amount input unit 1106.
The user adjustment amount input unit 1106 may acquire, for example, an input related to a user adjustment amount 11a. Here, the user adjustment amount 11a may be an amount for adjusting the color component adjustment amount, input via an input device connected to the image processing device 1100 (for example, a keyboard, a remote controller displayed on the screen, or a volume controller). The user adjustment amount 11a may be, for example, a real number in the range of 0 to 2, or a discrete value of strong (representing the real number 2), medium (representing 1), or weak (representing 0).
FIG. 12 is a block diagram showing an example of the configuration of the main part of the color component adjusting unit 1105. In addition to the units included in the color component adjusting unit 105, the color component adjusting unit 1105 may further include, for example, a color difference information adjusting unit 1224.
The color difference information adjusting unit (color component adjusting unit) 1224 further uses the user adjustment amount 11a to adjust the color components of the plurality of pixels included in the input image 1b, thereby generating the output image 1e. The color difference information adjusting unit 1224 may, for example, output to the saturation adjustment unit 422, as adjusted color difference values, values obtained by multiplying the color difference information 1d (color difference values) input from the color difference information calculation unit 104 by the user adjustment amount 11a.
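The adjustment performed by the color difference information adjusting unit 1224 can be sketched as a simple scaling of the color difference values by the user adjustment amount 11a; the clamping and the strong/medium/weak mapping follow the ranges given above.

```python
# Sketch of the color difference information adjusting unit 1224: the
# color difference values are scaled by the user adjustment amount 11a
# (a real number in [0, 2], or strong/medium/weak mapped to 2/1/0).

USER_AMOUNTS = {"strong": 2.0, "medium": 1.0, "weak": 0.0}

def adjust_color_differences(diffs, user_amount):
    if isinstance(user_amount, str):
        user_amount = USER_AMOUNTS[user_amount]
    user_amount = max(0.0, min(2.0, user_amount))   # clamp to the valid range
    return [d * user_amount for d in diffs]
```

With "weak" (0), every adjusted color difference value becomes zero, so the downstream saturation adjustment leaves the image unchanged; with "strong" (2), the adjustment effect is doubled.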
With the above configuration, the image processing device 1100 allows the user to fine-tune the color components. Therefore, the image processing device 1100 can further increase the versatility of the color component adjustment process.
<Embodiment 5>
A fifth embodiment (Embodiment 5) will be described with reference to FIGS. 1 and 13. In each drawing, the same or equivalent elements are designated by the same reference numerals, and duplicate description will be omitted.
(Structure of reference image generation unit 1303)
The control unit 110 included in the image processing device 100 may include, for example, the reference image generation unit 1303 instead of the reference image generation unit 103.
FIG. 13 is a block diagram showing an example of the configuration of the main part of the reference image generation unit 1303. As illustrated in FIG. 13, the reference image generation unit 1303 further includes a model parameter replacement unit 1329. The model parameter replacement unit 1329 may, for example, replace the first model parameter 225 with a model parameter different from the first model parameter 225 (for example, a model parameter 1330 or a model parameter 1331).
When the user inputs to the image processing device 100 an instruction or operation for replacing a model parameter, the model parameter replacement unit 1329, for example, acquires the different model parameter from a model parameter storage server 1309 communicably connected to the image processing device 100 via a network 1308, and replaces the first model parameter 225 with the different model parameter. Note that the different model parameter is a parameter optimized with teacher data different from the teacher data used for optimizing the first model parameter 225. The method of storing the different model parameters is not limited to cloud storage such as the model parameter storage server 1309, and may be, for example, a local disk.
Since human preference for color depends on one's personal environment, the image processing device 100 including the reference image generation unit 1303 can change the tendency of the color component adjustment to suit the user's taste by replacing the first model parameter 225 with a model parameter optimized using different teacher data (for example, teacher data that includes, as correct-answer data, images classified according to a predetermined criterion, or teacher data that includes, as correct-answer data, only images matching a predetermined criterion).
When the reference image generation unit 1303 further includes the color reduction unit 1021, the model parameter replacement unit 1329 may, for example, replace the second model parameter 1028 with a model parameter different from the second model parameter 1028.
With the above configuration, the image processing device 100 can realize a color component adjustment process suited to the user's taste. Therefore, the image processing device 100 can further increase the versatility of the color component adjustment process.
<Embodiment 6>
A sixth embodiment (Embodiment 6) will be described with reference to FIGS. 1 and 14 to 15. In each drawing, the same or equivalent elements are designated by the same reference numerals, and duplicate description will be omitted.
(Structure of color difference information calculation unit 1404)
The control unit 110 provided in the image processing device 100 may include, for example, the color difference information calculation unit 1404 instead of the color difference information calculation unit 104, and the color component adjustment unit 1505 instead of the color component adjustment unit 105.
FIG. 14 is a block diagram showing an example of the configuration of the main part of the color difference information calculation unit 1404. As illustrated in FIG. 14, the color difference information calculation unit 1404 may include a saturation difference calculation unit 1425 in place of the color difference calculation unit 323, and a saturation difference normalization unit 1426 in place of the color difference normalization unit 324.
The saturation difference calculation unit (color difference information calculation unit) 1425 may, for example, calculate a plurality of saturation difference values each indicating the difference between a first saturation value calculated for a pixel included in the input image 1b and a second saturation value calculated for the corresponding pixel included in the reference image 1c. Here, each of the plurality of saturation difference values can be either positive or negative.
The saturation difference normalization unit (color difference information calculation unit) 1426 may, for example, normalize each of the plurality of saturation difference values and output the normalized saturation difference values as part of the color difference information 1d. For example, the saturation difference normalization unit 1426 may normalize a saturation difference value based on a value obtained by multiplying the saturation difference value by an arbitrary constant.
FIG. 15 is a block diagram showing an example of the configuration of the main part of the color component adjusting unit 1505. As illustrated in FIG. 15, the color component adjusting unit 1505 may include a saturation adjustment unit 1522 in place of the saturation adjustment unit 422.
The saturation adjustment unit (color component adjusting unit) 1522 may generate the output image 1e based on, for example, the converted image. Specifically, the saturation adjustment unit 1522 may generate a third image in which the saturation component of each pixel of the second image input from the third color space conversion unit 421 (for example, the input image 1b converted into the HSL color space) is adjusted according to the color difference information 1d input from the color difference information calculation unit 1404.
 At this time, the saturation adjustment unit 1522 may, for example, adjust the saturation of each pixel of the second image output by the third color space conversion unit 421 according to the saturation difference value included in the color difference information 1d. Specifically, the saturation adjustment unit 1522 may change the saturation of each of the plurality of pixels included in the second image according to the sign of the corresponding saturation difference value (for example, increasing the saturation when the saturation difference value is positive and decreasing it when the value is negative).
 That is, the color component adjustment unit 1505 adjusts saturation by strengthening or weakening the saturation of each of the plurality of pixels included in the input image 1b, thereby generating the third image. This allows the image processing device 100 to perform a wide range of color component adjustment processing, further enhancing the versatility of that processing.
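The sign-based per-pixel adjustment described above can be sketched for a single pixel as follows. The gain factor and the clamping to [0, 1] are assumptions added here; the disclosure only specifies the direction of the change.

```python
def adjust_saturation(sat, diff, gain=1.0):
    """Adjust one pixel's saturation (in [0, 1]) by its saturation
    difference value: a positive difference increases the saturation,
    a negative difference decreases it, and the result is clamped to
    the valid range."""
    return max(0.0, min(1.0, sat + gain * diff))
```

A positive difference thus strengthens the pixel's color while a negative one weakens it, with saturated pixels saturating at the range limits.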
 <Embodiment 7>
 A seventh embodiment (Embodiment 7) will be described with reference to FIGS. 1, 2, 4, 16, and 17. In each drawing, identical or equivalent elements are given the same reference numerals, and duplicate description is omitted.
 (Configuration of the color difference information calculation unit 1604)
 The control unit 110 of the image processing device 100 may include, for example, a color difference information calculation unit 1604 in place of the color difference information calculation unit 104, and a color component adjustment unit 1705 in place of the color component adjustment unit 105.
 FIG. 16 is a block diagram showing an example of the main configuration of the color difference information calculation unit 1604. As illustrated in FIG. 16, the color difference information calculation unit 1604 may include a fourth color space conversion unit 1627 in place of the first color space conversion unit 321, a fifth color space conversion unit 1628 in place of the second color space conversion unit 322, a hue difference calculation unit 1629 in place of the color difference calculation unit 323, and a hue difference normalization unit 1630 in place of the color difference normalization unit 324.
 The fourth color space conversion unit 1627 and the fifth color space conversion unit 1628 may, for example, convert the reference image 1c and the input image 1b, respectively, into images in a color space having hue as an axis component. Here, the color space may be, for example, the HSL color space, the HSV color space, or the like. The conversion may be a linear or non-linear transformation set appropriately according to the purpose.
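Converting an RGB pixel into a hue-bearing color space can be illustrated with Python's standard `colorsys` module, shown here for HSV as one concrete (non-linear) choice of the conversion units 1627 and 1628 may perform; the degree units for hue are an assumption of this sketch.

```python
import colorsys

def rgb_to_hue_space(r, g, b):
    """Convert one RGB pixel (components in [0, 1]) into HSV, a color
    space that has hue as an axis component. The hue is returned in
    degrees in [0, 360); saturation and value stay in [0, 1]."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return h * 360.0, s, v
```

For example, pure red maps to hue 0° and pure green to hue 120°, with full saturation and value in both cases.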
 The hue difference calculation unit 1629 may, for example, calculate a plurality of hue difference values, each indicating the difference between a first hue value calculated for one of the plurality of pixels included in the input image 1b and a second hue value calculated for the corresponding pixel included in the reference image 1c. In other words, the hue difference calculation unit 1629 may calculate the hue difference between the reference image 1c and the input image 1b.
 The hue difference normalization unit 1630 may, for example, normalize the plurality of hue difference values calculated by the hue difference calculation unit 1629 and output the normalized hue difference values as part of the color difference information 1d. For example, when a hue difference value is outside a predetermined range (for example, larger than 6°), the hue difference normalization unit 1630 may replace that hue difference value with zero before output.
 That is, if the hue difference value of a pixel is outside the predetermined range, the coloring produced by the deep learning device 223 is interpreted as unreliable, and the image processing device 100 does not use that coloring for hue adjustment. This allows the image processing device 100 to further enhance the versatility of the color component adjustment processing.
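A minimal sketch of the hue-difference computation and the out-of-range zeroing described above. The circular (wraparound) treatment of hue and the default 6° limit are assumptions of this sketch, chosen to be consistent with the example given in the text.

```python
def hue_difference(h_input, h_reference):
    """Signed hue difference in degrees, wrapped into (-180, 180] so
    that hues on opposite sides of the 0/360 boundary compare sensibly
    (e.g. 350 deg vs 10 deg gives -20 deg, not 340 deg)."""
    d = (h_input - h_reference) % 360.0
    return d - 360.0 if d > 180.0 else d


def normalize_hue_difference(d, limit=6.0):
    """Replace a hue difference whose magnitude exceeds the limit with
    zero, so that unreliable colorization does not drive the hue
    adjustment."""
    return d if abs(d) <= limit else 0.0
```

A pixel whose colorized hue differs from the input hue by, say, 20° is thus ignored, while a 5° difference passes through unchanged.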
 FIG. 17 is a block diagram showing an example of the main configuration of the color component adjustment unit 1705. As illustrated in FIG. 17, the color component adjustment unit 1705 may include, for example, a sixth color space conversion unit 1721, a hue adjustment unit 1724, and a color space inverse conversion unit 1723.
 The sixth color space conversion unit (color component adjustment unit) 1721 may generate a second image by, for example, converting the input image 1b into an image in a color space having hue as an axis component. Here, the color space may be, for example, the HSL color space, the HSV color space, or the like.
 The hue adjustment unit (color component adjustment unit) 1724 may, for example, change the hue of each of the plurality of pixels included in the input image 1b according to the plurality of hue difference values input from the color difference information calculation unit 1604. Specifically, when the sixth color space conversion unit 1721 has converted the input image 1b, the hue adjustment unit 1724 may generate the output image 1e based on the converted image. At this time, the hue adjustment unit 1724 may, for example, generate an image in which the hue components of the plurality of pixels included in the converted image are each adjusted according to the color difference information 1d.
 The color space inverse conversion unit 1723 may, for example, inversely convert the image generated by the hue adjustment unit 1724 back into the color space of the input image 1b.
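The convert / adjust / inverse-convert flow of units 1721, 1724, and 1723 can be sketched per pixel with the standard `colorsys` module. The RGB/HSV pairing and the degree units for the hue difference are assumptions of this sketch, not requirements of the disclosure.

```python
import colorsys

def adjust_pixel_hue(r, g, b, hue_diff_deg):
    """Convert an RGB pixel (components in [0, 1]) into HSV, shift its
    hue by hue_diff_deg degrees (wrapping around the hue circle), and
    inverse-convert the result back to RGB."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    h = (h + hue_diff_deg / 360.0) % 1.0
    return colorsys.hsv_to_rgb(h, s, v)
```

Because only the hue axis is touched, the pixel's saturation and lightness survive the round trip; a zero hue difference returns the original color.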
 The hue difference calculated for pixels corresponding to a universally distributed object is smaller than that for other objects (objects that are not universally distributed, objects with large color variation, and the like). Since the color component adjustment unit 1705 adjusts the color component (specifically, the hue) based on this hue difference, the adjustment amount is small for the former and large for the latter.
 As a result, the image processing device 100 can achieve more versatile color component adjustment than a conventional image processing device that can adjust color only for objects assumed in advance within the image.
 <Embodiment 8>
 An eighth embodiment (Embodiment 8) will be described with reference to FIG. 18. In each drawing, identical or equivalent elements are given the same reference numerals, and duplicate description is omitted.
 FIG. 18 is a schematic view of an image display device 1801 provided with an image processing device according to each of the embodiments. The image display device 1801 may be any device capable of displaying an arbitrary image, for example, a liquid crystal display or an organic EL display. The image display device 1801 displays the output image 1e output by the image processing device on a display surface 1802, presenting the output image 1e to the user.
 <Example of realization by software>
 The control blocks of the image processing device according to each aspect of the present disclosure may be realized by logic circuits (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software using a controller (processor) such as a CPU (Central Processing Unit). In the latter case, each image processing device includes a CPU that executes the instructions of a control program, which is the software realizing each function; a ROM (Read Only Memory) or storage device (referred to as a "recording medium") on which the control program and various data are recorded so as to be readable by a computer (or CPU); and a RAM (Random Access Memory) into which the control program is loaded. An example of the object according to each aspect of the present disclosure is achieved by the computer (or CPU) reading the control program from the recording medium and executing it. As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. The control program may also be supplied to the computer via any transmission medium (a communication network, a broadcast wave, or the like) capable of transmitting it. Each aspect of the present disclosure can also be realized in the form of a data signal embedded in a carrier wave, in which the control program is embodied by electronic transmission.
 The control program can be implemented in any programming language. For example, it can be implemented using a scripting language such as ActionScript or JavaScript (registered trademark), an object-oriented programming language such as Objective-C or Java (registered trademark), or a markup language such as HTML5. In addition, an information processing terminal (for example, a smartphone or a personal computer) including units that realize some of the functions realized by the control program, together with a server including units that realize the remaining functions, also falls within the scope of the present disclosure.
 The present disclosure is not limited to the embodiments described above; various modifications are possible within the scope of the claims, and embodiments obtained by appropriately combining technical means disclosed in different embodiments are also included in the technical scope of the present disclosure. Furthermore, new technical features can be formed by combining the technical means disclosed in the respective embodiments.

Claims (20)

  1.  An image processing device comprising:
     a reference image generation unit that generates, based on an input image, a reference image in which the color components of at least some of a plurality of pixels included in the input image are changed;
     a color difference information calculation unit that calculates color difference information based on the color differences between the plurality of pixels included in the input image and a plurality of pixels included in the reference image; and
     a color component adjustment unit that generates, using the color difference information, an output image in which the color components of the plurality of pixels included in the input image are adjusted.
  2.  The image processing device according to claim 1, wherein the reference image generation unit generates a color-reduced image in which the color components of the plurality of pixels included in the input image are each reduced, and then generates, as the reference image, a colorized image in which color components are added to the plurality of pixels included in the color-reduced image.
  3.  The image processing device according to claim 2, wherein the reference image generation unit adds color components to the color-reduced image using a first trained model defined by first model parameters.
  4.  The image processing device according to claim 3, further comprising a model parameter replacement unit that replaces the first model parameters with model parameters different from the first model parameters.
  5.  The image processing device according to any one of claims 2 to 4, wherein the reference image generation unit generates, as the color-reduced image, an image in which, among the plurality of pixels included in the input image, the value of the color component of each pixel having a color component is set to zero or more.
  6.  The image processing device according to any one of claims 2 to 5, wherein the reference image generation unit generates the color-reduced image using a second trained model defined by second model parameters.
  7.  The image processing device according to any one of claims 1 to 6, wherein the color difference information calculation unit calculates, in a predetermined color space, the distance between the pixel value of each of the plurality of pixels included in the input image and the pixel value of the corresponding pixel among the plurality of pixels included in the reference image, and includes the distances as part of the color difference information.
  8.  The image processing device according to claim 7, wherein the color component adjustment unit increases the saturation of each of the plurality of pixels included in the input image as the corresponding distance increases.
  9.  The image processing device according to any one of claims 1 to 8, wherein the color difference information calculation unit includes, as part of the color difference information, a plurality of saturation difference values each indicating the difference between a first saturation value calculated for one of the plurality of pixels included in the input image and a second saturation value calculated for the corresponding pixel among the plurality of pixels included in the reference image.
  10.  The image processing device according to claim 9, wherein the color component adjustment unit changes the saturation of each of the plurality of pixels included in the input image according to the plurality of saturation difference values.
  11.  The image processing device according to any one of claims 1 to 10, wherein the color component adjustment unit converts the input image into an image in a color space having saturation as an axis component, and generates the output image based on the converted image.
  12.  The image processing device according to claim 11, wherein the color component adjustment unit generates an image in which the saturation components of a plurality of pixels included in the converted image are each adjusted according to the color difference information, and generates the output image by inversely converting the generated image into the color space of the input image.
  13.  The image processing device according to any one of claims 1 to 12, wherein the color difference information calculation unit includes, as part of the color difference information, a plurality of hue difference values each indicating the difference between a first hue value calculated for one of the plurality of pixels included in the input image and a second hue value calculated for the corresponding pixel among the plurality of pixels included in the reference image.
  14.  The image processing device according to claim 13, wherein the color component adjustment unit changes the hue of each of the plurality of pixels included in the input image according to the plurality of hue difference values.
  15.  The image processing device according to any one of claims 1 to 14, wherein the color component adjustment unit converts the input image into an image in a color space having hue as an axis component, and generates the output image based on the converted image.
  16.  The image processing device according to claim 15, wherein the color component adjustment unit generates an image in which the hue components of a plurality of pixels included in the converted image are each adjusted according to the color difference information, and generates the output image by inversely converting the generated image into the color space of the input image.
  17.  The image processing device according to any one of claims 1 to 16, further comprising a user adjustment amount input unit that acquires input related to a user adjustment amount, wherein the color component adjustment unit further uses the user adjustment amount to adjust the color components of the plurality of pixels included in the input image and generate the output image.
  18.  An image display device comprising the image processing device according to any one of claims 1 to 17.
  19.  A method of controlling an image processing device, comprising:
     a reference image generation step of generating, based on an input image, a reference image in which the color components of at least some of a plurality of pixels included in the input image are changed;
     a color difference information calculation step of calculating color difference information based on the color differences between the plurality of pixels included in the input image and a plurality of pixels included in the reference image; and
     a color component adjustment step of generating, using the color difference information, an output image in which the color components of the plurality of pixels included in the input image are adjusted.
  20.  A control program causing an image processing device to realize:
     a reference image generation function of generating, based on an input image, a reference image in which the color components of at least some of a plurality of pixels included in the input image are changed;
     a color difference information calculation function of calculating color difference information based on the color differences between the plurality of pixels included in the input image and a plurality of pixels included in the reference image; and
     a color component adjustment function of generating, using the color difference information, an output image in which the color components of the plurality of pixels included in the input image are adjusted.
PCT/JP2020/018560 2019-07-19 2020-05-07 Image processing device, image display device, image processing method, and program WO2021014714A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-133404 2019-07-19
JP2019133404 2019-07-19

Publications (1)

Publication Number Publication Date
WO2021014714A1 true WO2021014714A1 (en) 2021-01-28

Family

ID=74194111

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/018560 WO2021014714A1 (en) 2019-07-19 2020-05-07 Image processing device, image display device, image processing method, and program

Country Status (1)

Country Link
WO (1) WO2021014714A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005316743A (en) * 2004-04-28 2005-11-10 Toshiba Corp Image processing method and device
JP2010232773A (en) * 2009-03-26 2010-10-14 Seiko Epson Corp Image display apparatus
JP5454711B1 (en) * 2013-01-31 2014-03-26 富士ゼロックス株式会社 Image processing apparatus, color adjustment system, and program


Similar Documents

Publication Publication Date Title
CN110447051B (en) Perceptually preserving contrast and chroma of a reference scene
KR101700362B1 (en) Image processing method and image processing apparatus
US9396526B2 (en) Method for improving image quality
JP2005269639A (en) Method and system for converting color image to gray scale image, gray scale image, method for improving conversion from color to gray scale, and edge emphasizing method
US20200193249A1 (en) Learning device, print control device, and learned model
JP2007215216A (en) Monotonization processing of color image
JP2010062672A (en) Image processor and method thereof
KR20240001264A (en) Low cost color expansion module for expanding colors of an image
CN102339461A (en) Method and equipment for enhancing image
US10600170B2 (en) Method and device for producing a digital image
US8345967B2 (en) Apparatus and method of processing image, and record medium for the method
Eilertsen The high dynamic range imaging pipeline
US20130287299A1 (en) Image processing apparatus
JP2005064789A (en) Monotonization processing of color image
CN116668656B (en) Image processing method and electronic equipment
US8164650B2 (en) Image processing apparatus and method thereof
WO2021014714A1 (en) Image processing device, image display device, image processing method, and program
WO2013179937A1 (en) Image processing method, image processing device, and image processing program
CN113066020A (en) Image processing method and device, computer readable medium and electronic device
CN111292251B (en) Image color cast correction method, device and computer storage medium
US20170244972A1 (en) Methods and apparatus for mapping input image
US11388348B2 (en) Systems and methods for dynamic range compression in multi-frame processing
WO2022194345A1 (en) Modular and learnable image signal processor
EP4042405A1 (en) Perceptually improved color display in image sequences on physical displays
JP4848903B2 (en) Color processing apparatus, color processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20844304

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20844304

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP