WO2017138446A1 - Image processing apparatus, image processing method, and storage medium


Info

Publication number
WO2017138446A1
WO2017138446A1 (application PCT/JP2017/003923)
Authority
WO
WIPO (PCT)
Prior art keywords
image data
half tone
pixels
input image
observation direction
Prior art date
Application number
PCT/JP2017/003923
Other languages
English (en)
Inventor
Jun Hirabayashi
Original Assignee
Canon Kabushiki Kaisha
Priority date
Filing date
Publication date
Application filed by Canon Kabushiki Kaisha
Publication of WO2017138446A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/46 Colour picture communication systems
    • H04N 1/52 Circuits or arrangements for halftone screening
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/387 Composing, repositioning or otherwise geometrically modifying originals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/40 Picture signal circuits
    • H04N 1/405 Halftoning, i.e. converting the picture signal of a continuous-tone original into a corresponding signal showing only two levels

Definitions

  • the present invention relates to a technique to reproduce a variable reflection characteristic by which the appearance of a printed matter changes depending on the observation direction.
  • Non-Patent Literature 1 discloses a technique to reproduce a different BRDF (bidirectional reflectance distribution function) depending on each region of an output target surface. Specifically, a different color can be visually recognized depending on the observation direction by subjecting a substantially-plane surface having a cyclic and minute concave face structure to an image formation (coloring output) depending on the cyclic concave face shape.
  • Fig. 1A to Fig. 1C illustrate a technique to output the coloring onto a substantially-plane surface having a cyclic and regular minute concave face structure.
  • Fig. 2A to Fig. 2C illustrate an example in which the coloring output provided on a substantially-plane surface having a cyclic and regular minute concave face structure has a different appearance depending on the observation direction.
  • the coloring face (Fig. 1C) is configured so that a coloring pattern outputted to the cyclic concave face shape has a visually-recognized area ratio that is different depending on the inclinations of the respective regions of the cyclic concave face shape.
  • visually-recognized color tones of the respective regions of the coloring face can be changed depending on the observation direction.
  • Fig. 3A and Fig. 3B are conceptual diagrams illustrating a basic zone of a minute concave face structure and the coloring pattern, depending on the observation direction, outputted to its surface.
  • Fig. 3A is an enlarged view illustrating the basic zone of the minute structure. The basic zone is based on a region surrounded by the dotted line as a unit.
  • Fig. 3B illustrates examples of coloring patterns to fill the basic zone of the minute concave face structure in a case where the observation direction is divided into 25 directions at polar coordinates (θ, φ).
  • the basic zones of Fig. 3A are colored by the coloring patterns as shown in Fig. 3B.
  • the individual regions in Fig. 3B are colored with the individual colors to correspond to the observation direction (polar coordinate system: θ, φ).
  • a part corresponding to the minute concave face structure is easily visually recognized in the observation direction (polar coordinate system: θ, φ) (or the visually-recognized area is increased).
  • the individual color tones are expressed for each observation direction (polar coordinate system: θ, φ), thus providing a coloring output as shown in Fig. 2A to Fig. 2C.
  • the observation direction means a normal direction of each individual region.
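As a concrete illustration of this correspondence, the 25 observation directions can be binned onto the 5 × 5 cells of a basic-zone coloring pattern. The patent text does not specify the exact binning, so the uniform split of zenith (0 to 90°) and azimuth (0 to 360°) below, and the function name, are assumptions made purely for illustration:

```python
import math

def direction_bin(theta, phi, n=5):
    """Map an observation direction (theta: zenith angle, phi: azimuth,
    both in radians) to one of the n x n cells of the basic-zone
    coloring pattern. The uniform binning is an illustrative assumption."""
    t = min(int(theta / (math.pi / 2) * n), n - 1)
    p = min(int(phi / (2 * math.pi) * n), n - 1)
    return t, p

print(direction_bin(0.0, 0.0))              # (0, 0): viewed from directly above
print(direction_bin(math.pi / 3, math.pi))  # an oblique observation direction
```

Whatever binning is actually used, the key property is that each (θ, φ) bin selects exactly one cell of the basic zone, so each cell's color is seen only from its own direction range.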
  • Fig. 4 illustrates an example of the structure of the input image data describing the variable angle reflection color characteristic. For simplicity, an example of the variable angle reflection color characteristic consisting of a black monochromatic gradation is shown.
  • Reference character (a) of Fig. 4 illustrates the input image data describing the variable angle reflection color characteristic.
  • Fig. 4 illustrates the variable angle reflection color characteristic in 5 × 5 pixels showing the coloring pattern corresponding to the basic zones of a minute concave face structure.
  • the input image data as shown in Fig. 4 is received. Then, a coloring output is performed on the output target surface consisting of a cyclic and minute concave face structure so as to reproduce a different color depending on the observation direction.
  • Fig. 5A to Fig. 5D illustrate the above-described disadvantage.
  • An example will be described in which, with regard to the basic zone of the minute concave face structure shown in Fig. 5A, "the individual color tone for each observation direction (polar coordinate system: θ, φ)" shown in Fig. 5B was inputted and a printer that can provide a binary output was used to subject a minute concave face structure surface to a coloring output.
  • Fig. 5C illustrates a case where a half tone processing was not used, specifically, a case where a 2 gradation output was performed. In this case, the gradation is lost, undesirably causing such a color tone that is totally different from what should have been originally outputted.
  • Fig. 5D illustrates a case where a multi-gradation output was performed by a general half tone processing using a neighboring pixel group (the error diffusion method in this case).
  • the color wraparound into other observation directions is undesirably caused, resulting in a coloring output having a characteristic totally different from the individual color tone input that should originally be outputted for each observation direction (polar coordinate system: θ, φ).
  • An image processing apparatus has an acquisition unit for acquiring input image data for subjecting a surface consisting of a cyclic and minute three-dimensional structure to an image formation, the input image data describing the color of each pixel in association with information on the observation direction corresponding to that pixel, and a half tone processing unit for subjecting the input image data to a half tone processing for each observation direction, treating pixels that neighbor one another in the same observation direction as neighboring pixels, to thereby generate image data used for the image formation.
  • When a coloring output is performed to reproduce, on an output target surface consisting of a cyclic and minute three-dimensional structure, a different color depending on the observation direction, a high gradation number and favorable color reproducibility can be realized while preventing color wraparound into other observation directions.
  • Fig. 1A illustrates a technique to provide a coloring output on a substantially-plane surface having a cyclic and regular minute concave face structure
  • Fig. 1B illustrates a technique to provide a coloring output on a substantially-plane surface having a cyclic and regular minute concave face structure
  • Fig. 1C illustrates a technique to provide a coloring output on a substantially-plane surface having a cyclic and regular minute concave face structure
  • Fig. 2A illustrates an example in which a coloring output provided on a substantially-plane surface having a cyclic and regular minute concave face structure is differently visually recognized depending on the observation direction
  • Fig. 2B illustrates an example in which a coloring output provided on a substantially-plane surface having a cyclic and regular minute concave face structure is differently visually recognized depending on the observation direction
  • Fig. 2C illustrates an example in which a coloring output provided on a substantially-plane surface having a cyclic and regular minute concave face structure is differently visually recognized depending on the observation direction
  • Fig. 3A is a conceptual diagram illustrating a basic zone of a minute concave face structure
  • Fig. 3B is a conceptual diagram illustrating a coloring pattern depending on the observation direction that is outputted on a surface of a basic zone of a minute concave face structure
  • Fig. 4 illustrates an example of the structure of input image data describing a variable angle reflection color characteristic
  • Fig. 5A is a conceptual diagram illustrating that a reduced gradation number or the color wraparound in a different viewing field direction is caused in the prior art technique
  • Fig. 5B is a conceptual diagram illustrating that a reduced gradation number or the color wraparound in a different viewing field direction is caused in the prior art technique
  • Fig. 5C is a conceptual diagram illustrating that a reduced gradation number or the color wraparound in a different viewing field direction is caused in the prior art technique
  • Fig. 5D is a conceptual diagram illustrating that a reduced gradation number or the color wraparound in a different viewing field direction is caused in the prior art technique
  • Fig. 6 is a block diagram illustrating the configuration of the image formation system according to Embodiment 1;
  • Fig. 7 is a flowchart illustrating the outline of the operation of the image formation system according to Embodiment 1;
  • Fig. 8A illustrates a processing process example for generating, in Embodiment 1, the CMYK output integrated image data out of the input image data (RGB) describing the variable angle reflection color characteristic;
  • Fig. 8B illustrates a processing process example for generating, in Embodiment 1, the CMYK output integrated image data out of the input image data (RGB) describing the variable angle reflection color characteristic;
  • Fig. 8C illustrates a processing process example for generating, in Embodiment 1, the CMYK output integrated image data out of the input image data (RGB) describing the variable angle reflection color characteristic;
  • Fig. 8D illustrates a processing process example for generating, in Embodiment 1, the CMYK output integrated image data out of the input image data (RGB) describing the variable angle reflection color characteristic
  • Fig. 8E illustrates a processing process example for generating, in Embodiment 1, the CMYK output integrated image data out of the input image data (RGB) describing the variable angle reflection color characteristic
  • Fig. 8F illustrates a processing process example for generating, in Embodiment 1, the CMYK output integrated image data out of the input image data (RGB) describing the variable angle reflection color characteristic
  • Fig. 9 is a processing example for integrating, in Embodiment 1, the each observation direction image data to generate integrated image data
  • Fig. 10A is a schematic view illustrating the output target surface used in Embodiment 1 in an expanded manner
  • Fig. 10B is a schematic view illustrating the output target surface used in Embodiment 1 in an expanded manner
  • Fig. 11 illustrates a 2 × 2 area gradation mask used in a half tone processing component of Embodiment 1
  • Fig. 12A illustrates a processing process example in a case where the area gradation mask of the half tone processing in Embodiment 1 is different depending on RGB color
  • Fig. 12B illustrates a processing process example in a case where the area gradation mask of the half tone processing in Embodiment 1 is different depending on RGB color
  • Fig. 12C illustrates a processing process example in a case where the area gradation mask of the half tone processing in Embodiment 1 is different depending on RGB color
  • Fig. 12D illustrates a processing process example in a case where the area gradation mask of the half tone processing in Embodiment 1 is different depending on RGB color
  • Fig. 12E illustrates a processing process example in a case where the area gradation mask of the half tone processing in Embodiment 1 is different depending on RGB color
  • Fig. 12F illustrates a processing process example in a case where the area gradation mask of the half tone processing in Embodiment 1 is different depending on RGB color
  • Fig. 13A illustrates a processing process example as a comparison example in Embodiment 1 in a case where a half tone processing was performed in a CMYK color space as a subtractive color mixing
  • Fig. 13B illustrates a processing process example as a comparison example in Embodiment 1 in a case where a half tone processing was performed in a CMYK color space as a subtractive color mixing
  • Fig. 13C illustrates a processing process example as a comparison example in Embodiment 1 in a case where a half tone processing was performed in a CMYK color space as a subtractive color mixing
  • Fig. 13D illustrates a processing process example as a comparison example in Embodiment 1 in a case where a half tone processing was performed in a CMYK color space as a subtractive color mixing
  • Fig. 13E illustrates a processing process example as a comparison example in Embodiment 1 in a case where a half tone processing was performed in a CMYK color space as a subtractive color mixing
  • Fig. 14A illustrates, as a comparison example in Embodiment 1, that a desired coloring output cannot be achieved in a case where a half tone processing was performed in a CMYK color space as a subtractive color mixing
  • Fig. 14B illustrates, as a comparison example in Embodiment 1, that a desired coloring output cannot be achieved in a case where a half tone processing was performed in a CMYK color space as a subtractive color mixing.
  • An image formation system subjects an output target surface consisting of a cyclic and regular minute three-dimensional structure to a coloring output corresponding to the cyclic three-dimensional shape to thereby change, at each position on the output target surface, the color depending on the observation direction.
  • Fig. 6 is a block diagram illustrating the configuration of the image formation system according to this embodiment. As shown in Fig. 6, the image formation system includes an image processing apparatus 100 and an image formation apparatus 200.
  • the image processing apparatus 100 is, for example, a computer and includes, as a hardware configuration, a CPU, a ROM, an HDD, a RAM, a general-purpose interface (I/F), a serial ATA (SATA) I/F, and a video card (VC).
  • the respective components are connected via a system bus.
  • the CPU uses the RAM as a work memory to execute an operating system (OS) or various programs stored in the ROM or HDD, thereby controlling the entire image formation system.
  • Programs executed by the CPU include, for example, a program for the image processing described later.
  • the general-purpose I/F is a serial bus interface such as USB and is connected to an input device such as a mouse or a keyboard and to an output device (e.g., the image formation apparatus 200).
  • the SATA I/F is connected to a general-purpose drive for performing reading and writing operations to the HDD or various storage media.
  • the CPU uses the HDD or various storage media mounted on the general-purpose drive as a place where data is stored and where reading and writing operations are performed.
  • the VC is a video interface connected to a display. The CPU allows the display to display a user interface (UI) screen provided by the program and receives a user input including a user instruction via an input device.
  • the image processing apparatus 100 has, as a logic configuration, an input image data acquisition component 101, an input image data storage component 102, an each observation direction image data generation component 103, and each observation direction image data storage components 104 to 106.
  • the image processing apparatus 100 further has half tone processing components 107 to 109, after-half-tone-processing each observation direction image data storage components 110 to 112, integrated image data generation components 113 to 115, integrated image data storage components 116 to 118, a color conversion component 119, and output integrated image data storage components 120 to 123. Details of the respective processing components will be described with reference to the flowchart of Fig. 7. In this embodiment, these processing components are configured in the image processing apparatus 100 but also may be configured in the image formation apparatus 200.
  • the image formation apparatus 200 reads output integrated image data from the output integrated image data storage components 120 to 123 and subjects an output target surface to a coloring by recording material such as ink and realizes a variable angle reflection color characteristic depending on the minute three-dimensional structure of the output target surface.
  • the image formation apparatus 200 is an inkjet printer using UV-curable ink and includes four color inks of yellow (Y), magenta (M), cyan (C), and black (K). Each color can be subjected to a binary output.
  • the inks are not limited to these.
  • the image formation apparatus 200 is also not limited to an inkjet printer using UV-curable ink and may be any apparatus that can provide a coloring output to an output target surface consisting of a minute three-dimensional structure.
  • the image formation apparatus 200 may be an inkjet printer using solvent ink or may be a printer using a printing method other than the inkjet method.
  • Fig. 10A is a schematic view illustrating the output target surface used in this embodiment.
  • the output target surface is configured so that a spherical shape having a concave face (a minute three-dimensional structure) is repeatedly formed.
  • the minute three-dimensional structure is not limited to the spherical shape having a concave face and may be any shape so long as the shape can selectively change the visually-recognized area of the shape surface depending on the observation direction.
  • the concave face may be substituted with a convex face structure.
  • a concave face shape and a convex face shape may be alternately arranged.
  • Fig. 7 is a flowchart illustrating the outline of the operation of the image formation system according to this embodiment.
  • a processing process example in a case where the image data (RGB) describing the variable angle reflection color characteristic is inputted is shown in Fig. 8A to Fig. 8F.
  • In Step S201, the input image data acquisition component 101 receives an output command from a not-shown computer.
  • In Step S202, the input image data acquisition component 101 acquires input image data defining the observation directions at the respective positions on the output target surface and the presented colors (i.e., the input image data describing the variable angle reflection color characteristic) and stores the data in the input image data storage component 102.
  • Fig. 8A illustrates the input image data (RGB) describing the variable angle reflection color characteristic.
  • a region surrounded by the heavy line in Fig. 8A shows "a coloring pattern for coloring the basic zone of the minute three-dimensional structure".
  • the input image data of Fig. 8A is not monochromatic image data but RGB image data.
  • the observation direction of the basic zone shown in Fig. 10B is divided into 25 directions at polar coordinates (θ, φ). Then, the normal direction of each individual region of the basic zone is set as the observation direction corresponding to that region.
  • the variable angle reflection color characteristic includes the presented colors at the respective positions of the output target surface in which basic zones are formed cyclically (RGB values of the respective pixels) and the information for the observation directions (information for the observation directions corresponding to the pixels).
  • a part shown with a light color indicates a high RGB value while a part shown with a deep color indicates a low RGB value.
  • In Step S203, the each observation direction image data generation component 103 reads the input image data describing the variable angle reflection color characteristic from the input image data storage component 102 to thereby generate each observation direction image data (RGB).
  • the each observation direction image data is obtained by treating the pixels having the same observation direction, as shown in Fig. 8B, as single two-dimensional planar image data. Neighboring pixels in the each observation direction image data are not pixels neighboring in the layout on the output target surface but pixels neighboring in the same observation direction. In this embodiment, 25 pieces of each observation direction image data (RGB), corresponding to the number of observation directions, are generated. These pieces of image data are stored, for the respective RGB channels, in the each observation direction image data storage component 104(R), the each observation direction image data storage component 105(G), and the each observation direction image data storage component 106(B).
  • each observation direction image data generation component performs, if required, a color matching processing to match the generated each observation direction image data for the respective channels with the color reproduction range of the image formation apparatus 200.
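The regrouping in Step S203 can be sketched as follows. This is an illustrative reading of the step, assuming the output target surface is tiled with 5 × 5-pixel basic zones; numpy and the function name are my own, not from the patent:

```python
import numpy as np

def split_by_observation_direction(img, zone=5):
    """Regroup an image tiled with zone x zone basic zones into one plane
    per observation direction (an illustrative sketch of step S203)."""
    h, w = img.shape
    assert h % zone == 0 and w % zone == 0
    # pixels at the same offset (i, j) inside every basic zone share an
    # observation direction, so strided slicing collects them
    return {(i, j): img[i::zone, j::zone]
            for i in range(zone) for j in range(zone)}

# a 10x10 single-channel image holds 2x2 basic zones of size 5x5
img = np.arange(100, dtype=np.uint8).reshape(10, 10)
planes = split_by_observation_direction(img)
print(len(planes))           # 25 pieces of each observation direction data
print(planes[(0, 0)].shape)  # (2, 2): one pixel from each basic zone
```

For RGB input this would simply be applied once per channel, matching the three storage components 104 to 106.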
  • In Step S204, the half tone processing components 107, 108, and 109 for the respective RGB channels read the respective 25 pieces of each observation direction image data from the each observation direction image data storage components 104, 105, and 106 and subject the data to a half tone processing based on the area gradation.
  • the half tone processing is performed in the RGB color space in each observation direction, thereby generating each observation direction image data obtained by subjecting the RGB 3 channels × 25 pieces to the half tone processing.
  • These pieces of image data are stored, for the respective RGB channels, in after-half-tone-processing each observation direction image data storage components 110, 111, and 112.
  • Fig. 8D illustrates an example of the each observation direction image data obtained by subjecting the each observation direction image data for each channel to the half tone processing.
  • the half tone processing by the half tone processing components 107, 108, and 109 is performed by subjecting the respective RGB channels to an area gradation processing using a similar 2 × 2 area gradation mask (hereinafter referred to as a mask).
  • the same mask was used by the half tone processing components 107, 108, and 109 for the respective RGB colors.
  • the invention is not limited to this.
  • a mask obtained by rotating the mask shown in Fig. 11 by 90° may be used
  • a mask obtained by rotating the mask shown in Fig. 11 by 180° may be used.
  • Fig. 12A to Fig. 12F illustrate the processing process example in this case.
  • An example has been described in which the image formation apparatus 200 is a printer that provides a binary output (per ink color) and the half tone processing components 107, 108, and 109 perform a half tone processing for a binary output.
  • the invention is not limited to this.
  • the half tone processing components 107, 108, and 109 may perform a half tone processing for a multi-value output.
  • the half tone processing is not limited to the area gradation processing using a mask and the error diffusion method also may be used for example.
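The area gradation processing with a 2 × 2 mask can be sketched as ordered dithering over one observation-direction plane. The actual threshold values of the mask in Fig. 11 are not reproduced in this text, so the values below, and the function name, are assumed purely for illustration:

```python
import numpy as np

# assumed 2x2 area gradation mask (threshold matrix); the real mask of
# Fig. 11 is not given in this text
MASK = np.array([[32, 160],
                 [224, 96]])

def halftone(plane):
    """Binary ordered-dither halftone of one observation-direction plane.
    Each 2x2 cell of same-direction pixels yields one of 5 area-coverage
    levels (0 to 4 dots on), i.e. a 5-value gradation per ink color."""
    h, w = plane.shape
    thresholds = np.tile(MASK, (h // 2 + 1, w // 2 + 1))[:h, :w]
    return (plane >= thresholds).astype(np.uint8)  # 1 = dot on

cell = halftone(np.full((2, 2), 128, dtype=np.uint8))
print(int(cell.sum()))  # 2 of 4 dots on: about 50% area coverage
```

Because the thresholding is applied within a same-direction plane, the dots of one gradation cell land in different basic zones on the surface but are all seen from the same observation direction.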
  • In Step S205, the integrated image data generation components 113, 114, and 115 for the respective RGB channels read the respective 25 pieces of the each observation direction image data subjected to the half tone processing from the after-half-tone-processing each observation direction image data storage components 110, 111, and 112.
  • the read image data is integrated for each RGB channel to generate integrated image data corresponding to the pixel layout in the input image data.
  • integrated image data is generated that is an image pattern to perform a coloring output on a substantially-plane surface having a cyclic and regular minute concave face structure.
  • These pieces of integrated image data are stored in the integrated image data storage component 116(R), the integrated image data storage component 117(G), and the integrated image data storage component 118(B) for the respective RGB channels.
  • Fig. 9 illustrates a processing example in which each observation direction image data is integrated to generate integrated image data.
  • the integrated image data is an image pattern used to color a substantially-plane surface having a minute concave face structure. This data is obtained by converting 25 pieces of each observation direction image data as shown in Fig. 9 (reference character (a)) to one piece of integrated image data as shown in Fig. 9 (reference character (b)) to correspond to the layout on the output target surface.
  • Reference character (c) of Fig. 9 is a partially-enlarged view of the integrated image data and illustrates a 4 × 4 region based on a basic zone of a cyclic and regular minute concave face structure as a unit.
  • the integrated image data generated by the integrated image data generation component represents a color to be presented in each observation direction by the area gradation using neighboring pixels in the same observation direction.
  • the integrated image data generated by the integrated image data generation component is as shown in Fig. 8E.
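The integration of Step S205 is the inverse of the per-direction regrouping. A minimal round-trip sketch, again assuming 5 × 5 basic zones and names of my own choosing:

```python
import numpy as np

def integrate_planes(planes, zone=5):
    """Reassemble per-observation-direction planes into integrated image
    data matching the pixel layout of the input (a sketch of step S205)."""
    ph, pw = next(iter(planes.values())).shape
    out = np.empty((ph * zone, pw * zone), dtype=np.uint8)
    for (i, j), plane in planes.items():
        out[i::zone, j::zone] = plane  # inverse of the regrouping in S203
    return out

# round trip: split a 10x10 image into 25 planes, then reintegrate
img = np.arange(100, dtype=np.uint8).reshape(10, 10)
planes = {(i, j): img[i::5, j::5] for i in range(5) for j in range(5)}
print(np.array_equal(integrate_planes(planes), img))  # True
```

In the actual flow the planes fed to this step are the binary halftoned ones, so the integrated data is directly the coloring pattern for the structured surface.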
  • In Step S206, the color conversion component 119 subjects the RGB integrated image data to a color conversion from the RGB color space as an additive color mixing system to a color space for the recording material used in the coloring output.
  • the color conversion to a CMYK color space as a subtractive color mixing system is performed.
  • the color conversion component 119 reads the RGB integrated image data from the integrated image data storage components 116, 117, and 118 and subjects the data to a color conversion to thereby generate the output integrated image data for the respective CMYK channels.
  • the generated output integrated image data is stored in output integrated image data storage components 120, 121, 122, and 123 for the respective CMYK channels.
  • the output integrated image data for the respective CMYK channels is as shown in Fig. 8F.
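The RGB-to-CMYK conversion in this step is device dependent; a real system would use the printer's characterization data. As a rough illustrative stand-in only (not the patent's actual color conversion), a naive complement-with-black-extraction sketch:

```python
def rgb_to_cmyk(r, g, b):
    """Naive conversion from additive RGB (0..255) to subtractive CMYK
    (0..255). Purely illustrative; not a device-accurate conversion."""
    c, m, y = 255 - r, 255 - g, 255 - b
    k = min(c, m, y)            # extract the common black component
    if k == 255:
        return 0, 0, 0, 255     # pure black
    return c - k, m - k, y - k, k

print(rgb_to_cmyk(255, 0, 0))  # pure red prints with magenta + yellow
```

Note that, consistent with the embodiment, this conversion is applied only after the halftone processing, which was carried out entirely in the additive RGB space.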
  • In Step S207, the image formation apparatus 200 reads the output integrated image data (CMYK) from the output integrated image data storage components 120, 121, 122, and 123 for the respective CMYK channels and subjects the output target surface to a coloring output.
  • the area gradation is realized in neighboring pixels in the same observation direction to thereby prevent the color wraparound in other observation directions, realizing a multi-gradation.
  • each observation direction image data is generated and a half tone processing is performed within the each observation direction image data to thereby realize a 5-value gradation (per ink color) in spite of the use of an image formation apparatus that is binary per ink color.
  • the half tone processing (area gradation processing) in the each observation direction image data is performed in an RGB space that can be subjected to an additive color mixing. This consequently provides a color reproduction by combining regions that are in the same observation direction and that are away from one another on the output target surface.
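Why the additive (RGB) space matters can be checked numerically: with binary same-direction pixels, the visually averaged dot coverage of a 2 × 2 cell tracks the continuous input value, which is exactly the additive-mixing assumption. The threshold values below are assumed for illustration, as the mask of Fig. 11 is not reproduced here:

```python
import numpy as np

# assumed 2x2 threshold mask (the actual mask of Fig. 11 is not given here)
MASK = np.array([[32, 160],
                 [224, 96]])

def coverage(v):
    """Fraction of binary dots 'on' in a 2x2 same-direction cell for a
    uniform input value v (0..255)."""
    return float((v >= MASK).mean())

# Under additive mixing, the area-averaged result approximates the input
# value to within the quantization step of the 5-level gradation:
for v in (0, 64, 128, 191, 255):
    assert abs(coverage(v) - v / 255) <= 0.13
print("area coverage tracks the input value")
```

No such linear averaging holds after conversion to subtractive CMYK, which is why performing the halftone in CMYK (the comparison example below) degrades the reproduced color.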
  • the following section will describe the meaning and effect of the half tone processing (area gradation processing) in the each observation direction image data performed in the RGB space that can be subjected to an additive color mixing.
  • Fig. 13A to Fig. 13E illustrate, as a comparison example of this embodiment, a processing process example in a case where a half tone processing is carried out in a CMYK color space as a subtractive color mixing.
  • as in Fig. 8A to Fig. 8F showing the processing process example in this embodiment, the same processing is performed until the each observation direction image data (RGB) as in Fig. 13B is generated.
  • the each observation direction image data (RGB) is then converted to a CMYK color space. Specifically, the 25 pieces of each observation direction image data (RGB) are converted to CMYK, a subtractive color mixing system frequently used in an image formation apparatus, thereby generating the color-converted each observation direction image data for the respective CMYK colors as shown in Fig. 13C.
  • Next, the half-tone processing within each observation-direction image data is performed in the CMYK color space, i.e., under subtractive color mixing, generating the half-toned observation-direction image data (CMYK) shown in Fig. 13D.
  • Finally, the output integrated image data (CMYK) shown in Fig. 13E is generated.
  • For this half-tone processing, four masks were used, obtained by rotating the mask of Fig. 11 (the one used in the processing example of this embodiment) in 90° steps, one orientation for each of the CMYK colors.
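A set of rotated masks of this kind could be generated as follows. This is only a sketch: the base pattern below is a placeholder, since the actual mask of Fig. 11 is not reproduced in this text.

```python
import numpy as np

# Placeholder base mask; the actual pattern of Fig. 11 differs.
base_mask = np.array([[1, 0, 0],
                      [0, 1, 0],
                      [0, 0, 1]])

# One mask per ink, rotated by 0, 90, 180, and 270 degrees.
masks = {ink: np.rot90(base_mask, k) for k, ink in enumerate("CMYK")}
```

Rotating the same mask keeps the dot density identical across the four inks while decorrelating their dot positions.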
  • In this case, a desired coloring output cannot be obtained, as illustrated in Fig. 14A and Fig. 14B.
  • These figures show the arrangement of the input RGB image data describing the variable angle reflection color characteristic (Fig. 13A) and of the resulting CMYK output integrated image data (Fig. 13E).
  • In this embodiment, by contrast, the processing is performed in a color space (the RGB color space) that supports additive color mixing.
  • Therefore, no problem arises even with an area gradation based on a combination of positions that are separated from one another.
  • As a result, a coloring output can be performed in which "the individual color tone in each observation direction" differs at each position on the coloring face, while providing a high number of gradations and favorable color reproducibility.
  • The number of output gradations is thereby increased.
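The reason spatially separated regions can still mix correctly under additive color mixing can be illustrated numerically. The model below is a simplification assumed for this example: it treats the perceived color as the area-weighted RGB mean of the regions seen from one observation direction.

```python
import numpy as np

def perceived_color(regions):
    """Under additive mixing, small regions observed from the same
    direction blend toward their area-weighted mean RGB value."""
    return np.mean(np.asarray(regions, dtype=float), axis=0)

# Two regions far apart on the surface, one red and one green,
# still average predictably regardless of their separation:
red = np.array([255.0, 0.0, 0.0])
green = np.array([0.0, 255.0, 0.0])
blend = perceived_color([red, green])  # [127.5, 127.5, 0.0]
```

In a subtractive system no such position-independent averaging holds, which is why the comparative CMYK halftoning fails to reproduce the intended colors.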
  • In the above description, the half-tone processing within each observation-direction image data was performed in the RGB color space. However, the invention is not limited to this: the half-tone processing may be performed in any color space that supports additive color mixing.
  • The output target surface subjected to a coloring output is not limited to a substantially planar surface having a cyclic, minute concave structure; it may also be a three-dimensional curved surface having such a structure.
  • In that case, the input image data defines, for the three-dimensional curved surface, the observation directions and the colors presented at the respective positions on the surface (i.e., image data describing the variable angle reflection color characteristic on the three-dimensional curved surface).
  • Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a 'non-transitory computer-readable storage medium') to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
  • the computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
  • the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
  • the storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD) TM ), a flash memory device, a memory card, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Color Image Communication Systems (AREA)
  • Ink Jet (AREA)
  • Color, Gradation (AREA)
  • Image Processing (AREA)

Abstract

By performing half-tone processing that takes the observation direction into account at each position of an output target surface, a high number of gradations and good color reproducibility are obtained while preventing color wraparound into other observation directions. The present invention comprises an acquisition unit for acquiring input image data used to form an image on a surface having a cyclic, minute three-dimensional structure, the input image data describing the colors of the respective pixels in association with observation-direction information corresponding to those pixels; and a half-tone processing unit for subjecting the input image data to half-tone processing for the respective observation directions such that pixels that neighbor each other in the same observation direction are treated as neighboring pixels, thereby generating the image data used for the image formation.
PCT/JP2017/003923 2016-02-08 2017-02-03 Image processing apparatus, image processing method, and storage medium WO2017138446A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016021710A JP2017143333A (ja) 2016-02-08 2016-02-08 Image processing apparatus and image processing method
JP2016-021710 2016-02-08

Publications (1)

Publication Number Publication Date
WO2017138446A1 true WO2017138446A1 (fr) 2017-08-17

Family

ID=58185580

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/003923 WO2017138446A1 (fr) 2016-02-08 2017-02-03 Image processing apparatus, image processing method, and storage medium

Country Status (2)

Country Link
JP (1) JP2017143333A (fr)
WO (1) WO2017138446A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006041812A2 * 2004-10-05 2006-04-20 Threeflow, Inc. Method for producing improved lenticular images
WO2013160900A1 * 2012-04-25 2013-10-31 Humaneyes Technologies Ltd. Methods and systems for producing a lenticular article using a printing blanket
US20140177008A1 (en) * 2012-09-05 2014-06-26 Lumenco, Llc Pixel mapping and printing for micro lens arrays to achieve dual-axis activation of images

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Printing Reflectance Functions", ACM TRANSACTIONS ON GRAPHICS, vol. 31
TOM MALZBENDER ET AL: "Printing reflectance functions", ACM TRANSACTIONS ON GRAPHICS (TOG), vol. 31, no. 3, 1 May 2012 (2012-05-01), US, pages 1 - 11, XP055360483, ISSN: 0730-0301, DOI: 10.1145/2167076.2167078 *

Also Published As

Publication number Publication date
JP2017143333A (ja) 2017-08-17

Similar Documents

Publication Publication Date Title
JP5014475B2 (ja) Image processing apparatus and image processing method
JP5634154B2 (ja) Image processing apparatus and image processing method
US10511740B2 (en) Image processing apparatus, method thereof, and image forming apparatus that determine a dot arrangement of printing material by halftone processing
JP2007306550A (ja) Data processing apparatus, data processing method, and program
US9652699B2 (en) Image processing apparatus and image processing method
US9147140B2 (en) Image processing apparatus, method, and product for converting image data into fewer gradations based on total value of pixels in a group except for a detected pixel having a specific value
WO2017138446A1 (fr) Image processing apparatus, image processing method, and storage medium
US9989875B2 (en) Image processing apparatus, image processing method, and storage medium
US10308040B2 (en) Image processing apparatus, image processing method, and storage medium
US9144994B2 (en) Printing device and printing method
US9851651B2 (en) Image processing apparatus and image processing method to control the gloss of an image to be printed
JP2015023378A (ja) Image processing apparatus, image processing method, and program
US11508034B2 (en) White background protection in SRGAN based super resolution
JP6171727B2 (ja) Image processing apparatus, sheet, and computer program
JP4720123B2 (ja) Print control apparatus, print control method, and print control program
US10070014B2 (en) Print data processing method and apparatus reducing ink applied in erosion region among white-plate data
JP6765103B2 (ja) Image processing apparatus, image processing program, and image processing method
JP2014128901A (ja) Image processing apparatus and image processing method
US8482825B2 (en) Image processing apparatus and image processing method
JP2012045785A (ja) Image forming apparatus and image forming method
JP5697425B2 (ja) Image processing apparatus and image processing method
US20180297385A1 (en) Apparatus, method, and storage medium for determining recording amount of recording material
JP2017016648A (ja) Image processing apparatus, image processing method, and program
JP2014236483A (ja) Image processing apparatus, image processing method, and program
JP6701845B2 (ja) Image processing apparatus and image processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17707687

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17707687

Country of ref document: EP

Kind code of ref document: A1