JP2014178211A - Colorimetric device and image forming apparatus - Google Patents

Colorimetric device and image forming apparatus

Info

Publication number
JP2014178211A
JP2014178211A (application JP2013052540A)
Authority
JP
Japan
Prior art keywords
color
unit
colorimetric
patch
light source
Prior art date
Legal status
Granted
Application number
JP2013052540A
Other languages
Japanese (ja)
Other versions
JP6094286B2 (en)
Inventor
Nobuyuki Sato
Yasuyuki Suzuki
Hideaki Suzuki
Satoshi Iwanami
Satoshi Hirata
Original Assignee
Ricoh Co Ltd
Priority date
Filing date
Publication date
Application filed by Ricoh Co Ltd
Priority to JP2013052540A
Publication of JP2014178211A
Application granted
Publication of JP6094286B2
Application status: Active
Anticipated expiration

Abstract

Provided are a color measuring device and an image forming apparatus capable of performing color measurement with high accuracy in a wide color gamut matching the spectral sensitivity of the human eye.
A colorimetric camera 20 according to an embodiment includes: an illumination light source 30 that illuminates a patch 200, which is a color measurement object; a two-dimensional image sensor 28 that captures an image of the patch 200 illuminated by the illumination light source 30; a filter unit 31 disposed at a position through which both the illumination light directed from the illumination light source 30 toward the patch 200 and the reflected light reflected by the patch 200 toward the two-dimensional image sensor 28 pass; and a colorimetric calculation unit that calculates a colorimetric value of the patch 200 based on the image data of the patch 200 output from the two-dimensional image sensor 28. The filter unit 31 has three filter regions, and each of these three filter regions has a spectral transmittance that is the square root of a spectral transmittance having a linear transformation relationship with the color matching functions.
[Selection] Figure 4-2

Description

  The present invention relates to a color measuring device and an image forming apparatus.

  In an image forming apparatus such as a printer, processing called color management is performed to suppress variations in output caused by device-specific characteristics and thereby improve the reproducibility of the output with respect to the input. Color management improves the reproducibility of an output image by performing color conversion between a standard color space and device-dependent colors based on a device profile (ICC profile) that describes the device-specific characteristics. When generating or modifying a device profile, the image forming apparatus actually forms, on a recording medium, a test pattern in which a large number of reference-color charts (patches) are arranged, and colorimetry is performed on each patch included in the test pattern.
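As an illustration of the color conversion that such a device profile drives, the following sketch (not part of the patent; the function name is hypothetical) maps linear device RGB to CIE XYZ with a single 3x3 matrix, which is the simplest form of profile-based conversion; a real ICC profile also carries tone curves and multi-dimensional lookup tables.

```python
import numpy as np

# Hypothetical 3x3 device matrix. The values below are the standard
# sRGB-to-XYZ primaries, used here purely as illustrative numbers for a
# matrix-only profile model.
DEVICE_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

def device_rgb_to_xyz(rgb):
    """Map linear device RGB in [0, 1] to CIE XYZ via the matrix profile."""
    return DEVICE_TO_XYZ @ np.asarray(rgb, dtype=float)
```

With this matrix, device white (1, 1, 1) maps to the XYZ of the profile's white point, which is the usual sanity check for a matrix profile.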

  A spectrocolorimeter is widely used as a color measuring device for measuring the color of a patch. Since a spectrocolorimeter can obtain the spectral reflectance at each wavelength, it can perform highly accurate color measurement. However, because a spectrocolorimeter is an expensive device, it is desirable to perform high-precision colorimetry with a cheaper device.

  One known approach to realizing high-precision color measurement at low cost is to capture an image of the color measurement object and convert the RGB values obtained by the imaging into color values in a standard color space. For example, Patent Document 1 describes a colorimetric device in which a two-dimensional image sensor simultaneously images a colorimetric target patch, which is the color measurement object, and a reference chart unit including a plurality of patches whose colorimetric values are known in advance, and the colorimetric value of the target patch is calculated based on the RGB values of the target patch obtained by the imaging and the RGB values of the patches included in the reference chart unit.
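The reference-chart idea can be sketched as a least-squares fit of a linear transformation matrix from the camera RGB values of the known patches to their known colorimetric values. This is an illustrative sketch only; the function names and data shapes are assumptions, not taken from Patent Document 1.

```python
import numpy as np

def fit_linear_matrix(ref_rgb, ref_xyz):
    """Fit a 3x3 matrix M by least squares so that ref_xyz ≈ ref_rgb @ M.T.

    ref_rgb, ref_xyz: (N, 3) arrays of reference RGB and colorimetric values.
    """
    m_t, *_ = np.linalg.lstsq(np.asarray(ref_rgb, dtype=float),
                              np.asarray(ref_xyz, dtype=float), rcond=None)
    return m_t.T

def apply_matrix(m, rgb):
    """Convert one measured RGB triple to an estimated colorimetric value."""
    return m @ np.asarray(rgb, dtype=float)
```

With exact (noise-free) reference data the fit recovers the underlying matrix; with real measurements it gives the best linear approximation over the reference patches.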

  However, in the color measurement device described in Patent Document 1, it is difficult to perform color measurement in a wide color gamut that matches the spectral sensitivity of the human eye, and improvement has been demanded.

  As a technique for expanding the color gamut of an imaging apparatus, it is known, for example, to attach a filter having the spectral characteristics described in Patent Document 2 to the imaging apparatus. However, the technique of Patent Document 2 is premised on imaging a subject illuminated by external light, and cannot be applied to a colorimetric device that illuminates the color measurement object with an internal light source, as in the device described in Patent Document 1.

  The present invention has been made in view of the above, and an object thereof is to provide a color measurement device and an image forming apparatus capable of performing colorimetry with high accuracy in a wide color gamut matching the spectral sensitivity of the human eye.

  In order to solve the above-described problems and achieve the object, the present invention provides: a light source that illuminates a color object to be measured; a two-dimensional image sensor that images the color object illuminated by the light source; a filter unit disposed at a position through which both the illumination light directed from the light source toward the color object and the reflected light reflected by the color object toward the two-dimensional image sensor pass; and an arithmetic unit that calculates a colorimetric value of the color object based on the image data of the color object output from the two-dimensional image sensor. The filter unit includes at least three filter regions, and the three filter regions have a spectral transmittance that is the square root of a spectral transmittance having a linear transformation relationship with the color matching functions.

  According to the present invention, colorimetry can be performed with high accuracy in a wide color gamut matching the spectral sensitivity of the human eye.

FIG. 1 is a perspective view illustrating the inside of the image forming apparatus. FIG. 2 is a top view showing the internal mechanical configuration of the image forming apparatus. FIG. 3 is a diagram for explaining an arrangement example of the recording heads mounted on the carriage. FIG. 4-1 is a longitudinal sectional view of the colorimetric camera of the first embodiment (a cross-sectional view taken along line X1-X1 in FIG. 4-3). FIG. 4-2 is a longitudinal sectional view of the colorimetric camera of the first embodiment (a cross-sectional view taken along line X2-X2 in FIG. 4-3). FIG. 4-3 is a top view of the colorimetric camera of the first embodiment viewed from the substrate side. FIG. 4-4 is a plan view of the bottom surface portion of the housing of the colorimetric camera of the first embodiment viewed in the X3 direction in FIG. 4-1. FIG. 5 is a perspective view of the dust-proof glass on which the filter portion is formed. FIG. 6 is a diagram for explaining the spectral transmittance of the filter unit. FIG. 7 is a control block diagram of the image forming apparatus. FIG. 8 is a control block diagram of the colorimetric camera of the first embodiment. FIG. 9 is a diagram illustrating the filter illumination effective range. FIG. 10 is a diagram for explaining the processing for obtaining reference colorimetric values and reference RGB values and the processing for generating a reference value linear transformation matrix. FIG. 11 is a diagram for explaining the outline of the color measurement process. FIG. 12 is a diagram for explaining basic colorimetry processing. FIG. 13 is a diagram for explaining basic colorimetry processing. FIG. 14-1 is a longitudinal sectional view of the colorimetric camera of the second embodiment (a cross-sectional view at the same position as FIG. 4-1). FIG. 14-2 is a longitudinal sectional view of the colorimetric camera of the second embodiment (a cross-sectional view at the same position as FIG. 4-2).
FIG. 14-3 is a plan view of the bottom surface of the housing of the colorimetric camera of the second embodiment viewed in the same direction as FIG. 4-4. FIG. 15 is a control block diagram of the colorimetric camera of the second embodiment. FIG. 16 is a plan view of the bottom surface of the housing of the colorimetric camera of the third embodiment viewed in the same direction as FIG. 4-4. FIG. 17 is a perspective view of the dust-proof glass on which the filter portion is formed. FIG. 18 is a diagram for explaining the spectral transmittance of the filter unit. FIG. 19 is a control block diagram of the colorimetric camera of the third embodiment. FIG. 20 is a diagram illustrating processing by the spectral estimation calculation unit. FIG. 21 is a plan view of the bottom surface of the housing of the colorimetric camera of the fourth embodiment viewed in the same direction as FIG. 4-4. FIG. 22 is a control block diagram of the colorimetric camera of the fourth embodiment. FIG. 23 is a control block diagram of the colorimetric camera of the fifth embodiment. FIG. 24-1 is a longitudinal sectional view of the colorimetric camera of the sixth embodiment (a cross-sectional view at the same position as FIG. 4-1). FIG. 24-2 is a plan view of the bottom surface of the housing of the colorimetric camera of the sixth embodiment viewed in the same direction as FIG. 4-4. FIG. 25 is a control block diagram of the colorimetric camera of the sixth embodiment. FIG. 26-1 is a longitudinal sectional view of the colorimetric camera of the seventh embodiment (a cross-sectional view at the same position as FIG. 4-1). FIG. 26-2 is a plan view of the bottom surface portion of the housing of the colorimetric camera of the seventh embodiment viewed in the same direction as FIG. 4-4. FIG. 27 is a control block diagram of the colorimetric camera of the seventh embodiment.

  Exemplary embodiments of a color measuring device and an image forming apparatus according to the present invention will be described below in detail with reference to the accompanying drawings.

<First Embodiment>
First, the mechanical configuration of the image forming apparatus 100 of the present embodiment will be described with reference to FIGS. 1 to 3. FIG. 1 is a perspective view illustrating the inside of the image forming apparatus 100 according to the present embodiment. FIG. 2 is a top view illustrating the internal mechanical configuration of the image forming apparatus 100 according to the present embodiment. FIG. 3 is a diagram for explaining an example of the arrangement of the recording heads 6 mounted on the carriage 5.

  As shown in FIG. 1, the image forming apparatus 100 according to the present embodiment includes a carriage (support) 5 that reciprocates in the main scanning direction (arrow A direction in the figure) and forms an image on the recording medium P, which is intermittently conveyed in the sub-scanning direction (arrow B direction in the figure). The carriage 5 is supported by a main guide rod 3 extending along the main scanning direction. The carriage 5 is provided with a connecting piece 5a, which engages with a sub guide member 4 provided in parallel with the main guide rod 3 to stabilize the posture of the carriage 5.

  As shown in FIG. 2, the carriage 5 carries a plurality of recording heads: a recording head 6y that discharges yellow (Y) ink, a recording head 6m that discharges magenta (M) ink, a recording head 6c that discharges cyan (C) ink, and a recording head 6k that discharges black (Bk) ink (hereinafter, the recording heads 6y, 6m, 6c, and 6k are collectively referred to as recording heads 6). The recording heads 6 are supported by the carriage 5 such that their ejection surfaces (nozzle surfaces) face downward (toward the recording medium P).

  A cartridge 7, which is an ink supply body that supplies ink to the recording head 6, is not mounted on the carriage 5 but is disposed at a predetermined position in the image forming apparatus 100. The cartridge 7 and the recording head 6 are connected by a pipe (not shown), and ink is supplied from the cartridge 7 to the recording head 6 through this pipe.

  The carriage 5 is connected to a timing belt 11 stretched between a driving pulley 9 and a driven pulley 10. The driving pulley 9 is rotated by driving the main scanning motor 8. The driven pulley 10 has a mechanism for adjusting its distance to the driving pulley 9 and serves to apply a predetermined tension to the timing belt 11. The carriage 5 reciprocates in the main scanning direction when the timing belt 11 is fed by driving the main scanning motor 8. The movement of the carriage 5 in the main scanning direction is controlled based on encoder values obtained when the encoder sensor 13 provided on the carriage 5 detects marks on the encoder sheet 14 shown in FIG. 2.

  In addition, the image forming apparatus 100 according to the present embodiment includes a maintenance mechanism 15 for maintaining the reliability of the recording head 6. The maintenance mechanism 15 performs cleaning and capping of the ejection surface of the recording head 6 and discharging unnecessary ink from the recording head 6.

  As shown in FIG. 2, a platen 16 is provided at a position facing the ejection surface of the recording head 6. The platen 16 is for supporting the recording medium P when ink is ejected from the recording head 6 onto the recording medium P. The image forming apparatus 100 according to the present embodiment is a wide-width machine in which the moving distance of the carriage 5 in the main scanning direction is long. For this reason, the platen 16 is configured by connecting a plurality of plate-like members in the main scanning direction (movement direction of the carriage 5). The recording medium P is sandwiched between transport rollers driven by a sub scanning motor (not shown), and is intermittently transported on the platen 16 in the sub scanning direction.

  The recording head 6 includes a plurality of nozzle arrays, and forms an image on the recording medium P by ejecting ink from the nozzle arrays onto the recording medium P conveyed on the platen 16. In this embodiment, in order to secure a large width of the image that can be formed on the recording medium P by one scan of the carriage 5, upstream recording heads 6 and downstream recording heads 6 are mounted on the carriage 5 as shown in FIG. 3. Further, the recording heads 6k that discharge black ink are mounted on the carriage 5 in the same number as the recording heads 6y, 6m, and 6c that discharge color ink. The recording heads 6y and 6m are arranged separately on the left and right so that the color stacking order, adjusted by the reciprocating operation of the carriage 5, does not change between the forward pass and the return pass. The arrangement of the recording heads 6 shown in FIG. 3 is an example, and the arrangement is not limited thereto.

  Each of the above-described constituent elements constituting the image forming apparatus 100 of the present embodiment is disposed inside the exterior body 1. The exterior body 1 is provided with a cover member 2 that can be opened and closed. At the time of maintenance of the image forming apparatus 100 or when a jam occurs, the cover member 2 can be opened to perform work on each component provided inside the exterior body 1.

  The image forming apparatus 100 according to the present embodiment intermittently conveys the recording medium P in the sub-scanning direction and, while the conveyance of the recording medium P is stopped, moves the carriage 5 in the main scanning direction and ejects ink from the nozzle rows of the recording head 6 mounted on the carriage 5 onto the recording medium P on the platen 16, thereby forming an image on the recording medium P.

  In particular, when adjusting the colors of the image forming apparatus 100, ink is actually ejected from the nozzle rows of the recording head 6 mounted on the carriage 5 onto the recording medium P on the platen 16 to form a test pattern in which a large number of patches (color measurement objects) 200 are arranged. Color measurement is then performed on each patch 200 included in the test pattern. Each patch 200 included in the test pattern is an image obtained by outputting a reference color patch from the image forming apparatus 100, and reflects characteristics unique to the image forming apparatus 100. Accordingly, a device profile describing the characteristics unique to the image forming apparatus 100 can be generated or corrected using the colorimetric values of the patches 200. By performing color conversion between a standard color space and device-dependent colors based on this device profile, the image forming apparatus 100 can output images with high reproducibility.

  The image forming apparatus 100 according to the present embodiment includes a color measurement camera (color measurement device) 20 for performing color measurement on each patch 200 included in a test pattern formed on a recording medium P. As shown in FIG. 2, the colorimetric camera 20 is fixed to the carriage 5 and reciprocates in the main scanning direction together with the carriage 5. When the color measurement of the patch 200 is performed, the recording medium P on which the test pattern is formed is set on the platen 16. In this state, the carriage 5 moves in the main scanning direction on the recording medium P, and the colorimetric camera 20 moved to a position facing the patch 200 as the carriage 5 moves takes an image of the patch 200. Then, the colorimetric camera 20 calculates the colorimetric value of the patch 200 based on the image data (RGB value) of the patch 200 obtained by imaging.

  Next, the mechanical configuration of the colorimetric camera 20 of this embodiment will be described. FIGS. 4-1 to 4-4 are diagrams illustrating an example of the mechanical configuration of the colorimetric camera 20. FIG. 4-1 is a longitudinal sectional view of the colorimetric camera 20 (a cross-sectional view taken along line X1-X1 in FIG. 4-3), FIG. 4-2 is a longitudinal sectional view of the colorimetric camera 20 (a cross-sectional view taken along line X2-X2 in FIG. 4-3), FIG. 4-3 is a top view of the colorimetric camera 20 viewed from the substrate side, and FIG. 4-4 is a plan view of the bottom surface of the housing viewed in the X3 direction in FIG. 4-1. FIG. 5 is a perspective view of the dust-proof glass on which the filter portion is formed.

  As shown in FIGS. 4-1 and 4-2, the colorimetric camera 20 includes a housing 23 configured by combining a frame body 21 and a substrate 22. The frame body 21 is formed in a bottomed cylindrical shape that is open at one end, which forms the upper surface of the housing 23. The substrate 22 is fastened to the frame body 21 by fastening members (not shown) and integrated with the frame body 21 so as to close the open end of the frame body 21 and form the upper surface of the housing 23.

  The housing 23 is fixed to the carriage 5 so that its bottom surface portion 23a faces the recording medium P on the platen 16 with a predetermined gap d. The direction of arrow A in FIG. 4-1 is the main scanning direction. An opening 25 that enables the patch 200 formed on the recording medium P to be imaged from the inside of the housing 23 is provided in the bottom surface portion 23a of the housing 23 facing the recording medium P. The opening 25 is closed with, for example, a highly transparent dust-proof glass 26, which prevents foreign matter from entering the housing 23 through the opening 25. The size of the gap d is, for example, about 1 to 2 mm.

  A sensor unit 27 that captures images is provided inside the housing 23. The sensor unit 27 includes a two-dimensional image sensor 28 such as a CCD or CMOS sensor, and an imaging lens 29 that forms an optical image of the imaging range of the sensor unit 27 on the sensor surface of the two-dimensional image sensor 28. The two-dimensional image sensor 28 is mounted, for example, on the inner surface (component mounting surface) of the substrate 22 so that the sensor surface faces the bottom surface portion 23a of the housing 23. The imaging lens 29 is positioned and fixed with respect to the two-dimensional image sensor 28 so as to maintain a positional relationship determined by its optical characteristics.

  An illumination light source 30 that illuminates the patch 200 outside the housing 23 when the sensor unit 27 images the patch 200 through the opening 25 is provided inside the housing 23. For example, LEDs (Light Emitting Diodes) are used as the illumination light source 30; in the present embodiment, two LEDs are used. As shown in FIG. 4-3, these two LEDs are arranged at positions symmetrical about the two-dimensional image sensor 28 of the sensor unit 27, and are mounted, for example, on the inner surface of the substrate 22 together with the two-dimensional image sensor 28. The illumination light source 30 only needs to be disposed at a position where it can illuminate the patch 200 outside the housing 23 through the opening 25, and does not necessarily have to be mounted directly on the substrate 22. Moreover, although LEDs are used as the illumination light source 30 in this embodiment, the type of light source is not limited to LEDs. For example, an organic EL light source may be used as the illumination light source 30. When an organic EL light source is used, illumination light close to the spectral distribution of sunlight can be obtained, so an improvement in colorimetric accuracy can be expected.

  In addition, the colorimetric camera 20 of the present embodiment includes a filter unit 31 disposed at a position through which both the illumination light directed from the illumination light source 30 toward the patch 200, which is the color measurement object outside the housing 23, and the reflected light reflected by the patch 200 toward the two-dimensional image sensor 28 of the sensor unit 27 pass. The filter unit 31 is formed, for example, by depositing a multilayer film on the dust-proof glass 26 that closes the opening 25 of the housing 23. Alternatively, the filter unit 31 may be formed separately from the dust-proof glass 26 and attached to the dust-proof glass 26 by adhesion or the like. The filter unit 31 is formed on the dust-proof glass 26 so that both the illumination light and the reflected light pass through it when the dust-proof glass 26 is attached to the housing 23. Note that the filter unit 31 only needs to be disposed at a position through which both the illumination light and the reflected light pass, and does not necessarily have to be formed on the dust-proof glass 26.

  As shown in FIGS. 4-4 and 5, the filter unit 31 has three filter regions 31R, 31G, and 31B corresponding to the three RGB colors. A mask portion 32 is provided at the boundaries between the three filter regions 31R, 31G, and 31B so that the illumination light from the illumination light source 30 does not mix between the filter regions 31R, 31G, and 31B.

  The three filter regions 31R, 31G, and 31B of the filter unit 31 have a spectral transmittance that is the square root of a spectral transmittance having a linear transformation relationship with the color matching functions.

  FIG. 6 is a diagram for explaining the spectral transmittance of the filter unit 31. The broken-line graphs (B′, G′, R′) in the figure indicate spectral transmittances having a linear transformation relationship with the color matching functions. The solid-line graphs (B, G, R) indicate spectral transmittances that are the square roots of the spectral transmittances having a linear transformation relationship with the color matching functions. In the colorimetric camera 20 of the present embodiment, the filter unit 31 having the spectral transmittance shown by the solid-line graphs in this figure is disposed at a position through which both the illumination light and the reflected light pass.

  In the colorimetric camera 20 of the present embodiment, the illumination light from the illumination light source 30 passes through the filter regions 31R, 31G, and 31B of the filter unit 31 and irradiates the patch 200, which is the color measurement object. The reflected light from the patch 200 then passes through the filter regions 31R, 31G, and 31B of the filter unit 31 again, and forms an image on the sensor surface of the two-dimensional image sensor 28 through the imaging lens 29 of the sensor unit 27. In this way, the light imaged on the sensor surface of the two-dimensional image sensor 28 has passed through the filter regions 31R, 31G, and 31B of the filter unit 31 twice, so the effective spectral transmittance it experiences is the square of the spectral transmittance of each filter region 31R, 31G, 31B. That is, the light from the patch 200 illuminated by the illumination light source 30 enters the two-dimensional image sensor 28 as light having the characteristics shown by the broken-line graphs in FIG. 6. The spectral characteristics shown by the broken-line graphs in FIG. 6 have a linear transformation relationship with the color matching functions (that is, they satisfy the Luther condition). Therefore, by calculating the colorimetric value of the patch 200 using the image data output from the two-dimensional image sensor 28, colorimetry can be performed accurately in a wide color gamut matching the sensitivity of the human eye.
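The double-pass design can be checked numerically: if the filter is fabricated with the square root of a target transmittance, two passes (source to patch, patch to sensor) reproduce the target exactly. The Gaussian curve below is a hypothetical stand-in for one of the target transmittances, not data from the patent.

```python
import numpy as np

# Hypothetical target: a transmittance curve assumed to be a linear
# transform of the color matching functions (i.e. Luther-condition form).
wavelengths = np.linspace(400, 700, 31)                  # nm, example grid
T_target = np.exp(-((wavelengths - 550.0) / 60.0) ** 2)  # stand-in G' curve

T_filter = np.sqrt(T_target)  # transmittance the filter is fabricated with
T_effective = T_filter ** 2   # two passes: illumination and reflection
```

Since squaring undoes the square root, `T_effective` equals the target curve, which is the point of giving the filter regions root-of-target transmittances.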

  In the present embodiment, it is assumed that a monochrome image sensor is used as the two-dimensional image sensor 28. However, when a color image sensor is used as the two-dimensional image sensor 28, the data of the four pixels constituting one unit of the Bayer array may, for example, be summed and treated as one pixel of data.
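A minimal sketch of this handling of a color sensor, assuming a standard 2x2 Bayer unit; the function name is hypothetical:

```python
import numpy as np

def bin_bayer_2x2(raw):
    """Sum each 2x2 Bayer unit (R, G, G, B) into a single pixel value."""
    raw = np.asarray(raw, dtype=np.int64)
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "image must tile into 2x2 units"
    return (raw[0::2, 0::2] + raw[0::2, 1::2]
            + raw[1::2, 0::2] + raw[1::2, 1::2])
```

The binned image has half the resolution in each direction, with each output pixel carrying the combined response of one Bayer unit.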

  Next, a schematic configuration of the control mechanism of the image forming apparatus 100 of the present embodiment will be described with reference to FIG. FIG. 7 is a control block diagram of the image forming apparatus 100.

  As shown in FIG. 7, the image forming apparatus 100 according to the present embodiment includes a CPU 101, a ROM 102, a RAM 103, a recording head driver 104, a main scanning driver 105, a sub-scanning driver 106, a control FPGA (Field-Programmable Gate Array) 110, the recording head 6, the colorimetric camera 20, the encoder sensor 13, the main scanning motor 8, and the sub-scanning motor 12. The CPU 101, the ROM 102, the RAM 103, the recording head driver 104, the main scanning driver 105, the sub-scanning driver 106, and the control FPGA 110 are mounted on a main control board 120. The recording head 6, the encoder sensor 13, and the colorimetric camera 20 are mounted on the carriage 5 as described above.

  The CPU 101 governs overall control of the image forming apparatus 100. For example, the CPU 101 uses the RAM 103 as a work area, executes various control programs stored in the ROM 102, and outputs control commands for controlling various operations in the image forming apparatus 100.

  The recording head driver 104, the main scanning driver 105, and the sub scanning driver 106 are drivers for driving the recording head 6, the main scanning motor 8, and the sub scanning motor 12, respectively.

  The control FPGA 110 controls various operations in the image forming apparatus 100 in cooperation with the CPU 101. The control FPGA 110 includes, for example, a CPU control unit 111, a memory control unit 112, an ink ejection control unit 113, a sensor control unit 114, and a motor control unit 115 as functional components.

  The CPU control unit 111 communicates with the CPU 101 to transmit various information acquired by the control FPGA 110 to the CPU 101 and inputs a control command output from the CPU 101.

  The memory control unit 112 performs memory control for the CPU 101 to access the ROM 102 and the RAM 103.

  The ink discharge control unit 113 controls the operation of the recording head driver 104 in accordance with control commands from the CPU 101, thereby controlling the timing of ink ejection from the recording head 6 driven by the recording head driver 104.

  The sensor control unit 114 performs processing on sensor signals such as encoder values output from the encoder sensor 13.

  The motor control unit 115 controls the operation of the main scanning driver 105 in accordance with control commands from the CPU 101, thereby controlling the main scanning motor 8 driven by the main scanning driver 105 and controlling the movement of the carriage 5 in the main scanning direction. In addition, the motor control unit 115 controls the operation of the sub-scanning driver 106 in accordance with control commands from the CPU 101, thereby controlling the sub-scanning motor 12 driven by the sub-scanning driver 106 and controlling the movement of the recording medium P on the platen 16 in the sub-scanning direction.

  Each of the above-described units is an example of a control function realized by the control FPGA 110, and various other control functions may be realized by the control FPGA 110. Alternatively, all or part of the above control functions may be realized by a program executed by the CPU 101 or another general-purpose CPU. Further, part of the control functions may be realized by dedicated hardware such as another FPGA different from the control FPGA 110 or an ASIC (Application Specific Integrated Circuit).

  The recording head 6 is driven by a recording head driver 104 whose operation is controlled by the CPU 101 and the control FPGA 110, and ejects ink onto the recording medium P on the platen 16 to form an image.

As described above, when performing colorimetry on each patch 200 included in the test pattern formed on the recording medium P, the colorimetric camera 20 images the patch 200 outside the housing 23 with the two-dimensional image sensor 28 inside the housing 23, and calculates the colorimetric values of the patch 200 (color values in a standard color space, for example, L*a*b* values in the L*a*b* color space; hereinafter, L*a*b* is written as Lab) based on the image data of the patch 200 (the RGB values of the patch 200). The colorimetric values of the patch 200 calculated by the colorimetric camera 20 are sent to the CPU 101 via the control FPGA 110.
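For reference, the standard CIE 1976 conversion from XYZ to the Lab values mentioned above can be sketched as follows. The reference white used here (approximately D65) is illustrative; the patent does not specify which illuminant the colorimetric values assume.

```python
import numpy as np

def xyz_to_lab(xyz, white=(0.9505, 1.0, 1.089)):
    """Convert CIE XYZ to L*a*b* under a given reference white (CIE 1976)."""
    t = np.asarray(xyz, dtype=float) / np.asarray(white, dtype=float)
    delta = 6.0 / 29.0
    # Cube root above the linearity threshold, linear segment below it.
    f = np.where(t > delta ** 3, np.cbrt(t), t / (3 * delta ** 2) + 4.0 / 29.0)
    return 116.0 * f[1] - 16.0, 500.0 * (f[0] - f[1]), 200.0 * (f[1] - f[2])
```

By construction, the reference white maps to L = 100 with a = b = 0, and XYZ = (0, 0, 0) maps to L = 0.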

  The encoder sensor 13 outputs an encoder value obtained by detecting the mark on the encoder sheet 14 to the control FPGA 110. This encoder value is sent from the control FPGA 110 to the CPU 101 and used, for example, to calculate the position and speed of the carriage 5. The CPU 101 generates and outputs a control command for controlling the main scanning motor 8 based on the position and speed of the carriage 5 calculated from the encoder value.

  Next, a configuration example of the control mechanism of the colorimetric camera 20 will be specifically described with reference to FIG. FIG. 8 is a control block diagram of the colorimetric camera 20.

  As shown in FIG. 8, the colorimetric camera 20 includes a two-dimensional image sensor 28, an illumination light source 30, a light source control unit 34, a timing signal generation unit 35, a frame memory 36, an area selection unit 37, an averaging processing unit 38, a colorimetric calculation unit 39, and a nonvolatile memory 40. Each of these units is mounted, for example, on the substrate 22 that constitutes the upper surface of the housing 23 of the colorimetric camera 20.

  The two-dimensional image sensor 28 converts light incident through the imaging lens 29 into an electrical signal and outputs image data of the imaging range. In particular, in the colorimetric camera 20 of the present embodiment, when the patch 200 is measured, the illumination light from the illumination light source 30 passes through the filter unit 31 and irradiates the patch 200, which is the color object, and the reflected light again passes through the filter unit 31 and enters the two-dimensional image sensor 28. By photoelectrically converting this light into image data, the two-dimensional image sensor 28 can output high-accuracy image data that reproduces the color of the patch 200 in a wide color gamut matching the sensitivity of the human eye.

  The two-dimensional image sensor 28 has a built-in function of converting the analog signal obtained by photoelectric conversion into digital image data and outputting the image data after various kinds of image processing such as shading correction, white balance correction, γ correction, and image data format conversion. Note that all or part of this image processing may be performed outside the two-dimensional image sensor 28.

  The light source control unit 34 generates a light source drive signal for driving the illumination light source 30 and supplies the light source drive signal to the illumination light source 30.

  The timing signal generator 35 generates a timing signal for controlling the timing of starting imaging by the two-dimensional image sensor 28 and supplies the timing signal to the two-dimensional image sensor 28.

  The frame memory 36 temporarily stores the image data output from the two-dimensional image sensor 28.

  The area selection unit 37 selects, from the image data of the patch 200 output from the two-dimensional image sensor 28 and stored in the frame memory 36, the pixel data of the areas in which the filter illumination effective ranges corresponding to the filter regions 31R, 31G, and 31B of the filter unit 31 are captured. The filter illumination effective range is the area on the patch 200 that is irradiated with both the illumination light emitted from one of the two LEDs used as the illumination light source 30 and passed through the filter regions 31R, 31G, and 31B, and the illumination light emitted from the other LED and passed through the filter regions 31R, 31G, and 31B.

  FIG. 9 is a diagram illustrating the filter illumination effective range. Let one of the illumination light sources 30 be 30a and the other be 30b; let the range on the patch 200 irradiated with light emitted from the illumination light source 30a through the filter regions 31R, 31G, and 31B be the illumination range Ra, and the range irradiated with light emitted from the illumination light source 30b through the filter regions 31R, 31G, and 31B be the illumination range Rb. The filter illumination effective range Rab is the area where the illumination range Ra and the illumination range Rb overlap on the patch 200. There are three filter illumination effective ranges Rab: one corresponding to the filter region 31R of the filter unit 31, one corresponding to the filter region 31G, and one corresponding to the filter region 31B. The filter unit 31 is provided with the above-described mask unit 32 so that these three filter illumination effective ranges Rab do not interfere with each other. The area selection unit 37 selects the pixel data of the areas in which these filter illumination effective ranges Rab are captured.

  When a monochrome (black-and-white) image sensor is used as the two-dimensional image sensor 28, the filter illumination effective range Rab may be of any size such that the corresponding image area is at least one pixel. When a color image sensor is used as the two-dimensional image sensor 28, the filter illumination effective range Rab may be of any size such that the corresponding image area is at least the four pixels constituting one unit of the Bayer array of the two-dimensional image sensor 28. In other words, the size of each of the filter regions 31R, 31G, and 31B of the filter unit 31 only needs to be determined so that a filter illumination effective range Rab of the above-described size is ensured. In addition, in order to suppress the influence of noise components and the like, it is desirable that each of the filter regions 31R, 31G, and 31B of the filter unit 31 be sufficiently large. Further, the filter unit 31 can be fabricated easily if each of the filter regions 31R, 31G, and 31B is made somewhat large.

  The averaging processing unit 38 averages the pixel data of the areas selected by the area selection unit 37. The value obtained by averaging the pixel data of the area in which the filter illumination effective range Rab corresponding to the filter region 31R is captured is the R value of the patch 200. Likewise, the value obtained by averaging the pixel data of the area in which the filter illumination effective range Rab corresponding to the filter region 31G is captured is the G value of the patch 200, and the value obtained by averaging the pixel data of the area in which the filter illumination effective range Rab corresponding to the filter region 31B is captured is the B value of the patch 200. The averaging processing unit 38 outputs these RGB values of the patch 200 to the colorimetric calculation unit 39.
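As a minimal sketch of this averaging step, assuming a monochrome sensor and NumPy arrays (the region coordinates and pixel values below are hypothetical, not taken from the document):

```python
import numpy as np

def average_region(image, region):
    """Average the pixel data inside one filter illumination effective range.

    image  : 2-D (monochrome) array of sensor pixel values
    region : (row_start, row_end, col_start, col_end) of the selected area
    """
    r0, r1, c0, c1 = region
    return float(image[r0:r1, c0:c1].mean())

# Hypothetical image data: three areas imaged through filter regions 31R, 31G, 31B.
image = np.zeros((60, 60))
image[0:20, 0:20] = 100    # area corresponding to filter region 31R
image[20:40, 0:20] = 200   # area corresponding to filter region 31G
image[40:60, 0:20] = 50    # area corresponding to filter region 31B

rgb = [average_region(image, r)
       for r in [(0, 20, 0, 20), (20, 40, 0, 20), (40, 60, 0, 20)]]
# rgb now holds the R, G, and B values of the patch
```

In practice the region coordinates would be the ones determined by the area selection unit 37 from the geometry of the mask unit 32.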

  The colorimetric calculation unit 39 converts the RGB values of the patch 200 input from the averaging processing unit 38 into colorimetric values such as Lab values using, for example, a conversion matrix (referred to as a reference value linear conversion matrix) stored in advance in the nonvolatile memory 40. The reference value linear conversion matrix is created, for example, by obtaining the relationship between the RGB values and the colorimetric values of a large number of reference patches by the least squares method or the like, and is stored in the nonvolatile memory 40. A specific example of the color measurement method for the patch 200 using the reference value linear conversion matrix will be described later in detail. Instead of using the reference value linear conversion matrix, the colorimetric values of the patch 200 may be calculated using, for example, a conversion table.
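Applying such a linear conversion matrix amounts to a single matrix–vector product. A minimal sketch follows; the matrix entries are hypothetical placeholders (an actual reference value linear conversion matrix would come from the calibration described later), so the resulting numbers are illustrative only:

```python
import numpy as np

# Hypothetical 3x3 reference value linear conversion matrix (RGB -> XYZ).
ref_matrix = np.array([[0.4, 0.3, 0.2],
                       [0.2, 0.7, 0.1],
                       [0.0, 0.1, 0.9]])

rgb = np.array([3.0, 200.0, 5.0])   # RGB values of the patch
xyz = ref_matrix @ rgb              # converted XYZ values
```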

  The nonvolatile memory 40 stores various data necessary for the colorimetric calculation unit 39 to calculate the colorimetric values of the patch 200.

  Here, an outline of the operation of the colorimetric camera 20 configured as described above will be given. When color measurement of the patch 200 is performed, the colorimetric camera 20 is first positioned so that the opening 25 of the housing 23 faces the patch 200. Then, the illumination light source 30 inside the housing 23 is driven by the light source control unit 34. The illumination light emitted from the illumination light source 30 passes through the filter regions 31R, 31G, and 31B of the filter unit 31 and irradiates the patch 200, and the reflected light passes again through the filter regions 31R, 31G, and 31B of the filter unit 31 and enters the two-dimensional image sensor 28 through the imaging lens 29 of the sensor unit 27.

  At this time, the timing signal is supplied from the timing signal generation unit 35 to the two-dimensional image sensor 28, whereby the two-dimensional image sensor 28 images the patch 200 and outputs image data. Image data output from the two-dimensional image sensor 28 is temporarily stored in the frame memory 36.

  Next, the area selection unit 37 selects, from the image data stored in the frame memory 36, the pixel data of the areas in which the filter illumination effective ranges corresponding to the filter regions 31R, 31G, and 31B of the filter unit 31 are captured. Then, the averaging processing unit 38 averages the pixel data selected by the area selection unit 37 to obtain the RGB values of the patch 200. Finally, the colorimetric calculation unit 39 converts the RGB values of the patch 200 into colorimetric values using the reference value linear conversion matrix stored in advance in the nonvolatile memory 40, thereby obtaining the colorimetric values of the patch 200.

  Next, a specific example of the color measurement method for the patch 200 will be described with reference to FIGS. 10 to 13. The color measurement method described below consists of preprocessing performed when the image forming apparatus 100 is in an initial state (for example, immediately after manufacture or overhaul) and colorimetric processing performed at the time of adjustment for performing color adjustment of the image forming apparatus 100.

  FIG. 10 is a diagram for explaining the process of acquiring the reference colorimetric value and the reference RGB value and the process of generating the reference value linear transformation matrix described above. These processes shown in FIG. 10 are implemented as pre-processing. In the preprocessing, a reference sheet KS on which a large number of reference patches KP are arranged is used.

  First, at least one of the Lab values and the XYZ values (both, in the example of FIG. 10), which are the colorimetric values of each reference patch KP included in the reference sheet KS, is stored in association with the patch number in a memory table Tb1 provided in, for example, the nonvolatile memory 40 mounted on the substrate 22 of the colorimetric camera 20. The colorimetric values of the reference patches KP are values obtained in advance by colorimetry using the spectrometer BS or the like. If the colorimetric values of the reference patches KP are already known, those values may be used. Hereinafter, the colorimetric values of the reference patches KP stored in the memory table Tb1 are referred to as "reference colorimetric values".

  Next, by controlling the movement of the carriage 5 in a state where the reference sheet KS is set on the platen 16, the colorimetric camera 20 performs imaging using a plurality of reference patches KP of the reference sheet KS as subjects. Then, the RGB values of the reference patch KP obtained by imaging with the colorimetric camera 20 are stored in the memory table Tb1 of the nonvolatile memory 40 in correspondence with the patch numbers. That is, in the memory table Tb1, the colorimetric values and the RGB values of the large number of reference patches KP arranged on the reference sheet KS are stored in correspondence with the patch numbers of the respective reference patches KP. Hereinafter, the RGB values of the reference patch KP stored in the memory table Tb1 are referred to as “reference RGB values”. The reference RGB value is a value reflecting the characteristics of the colorimetric camera 20.

  When the reference colorimetric values and the reference RGB values of the reference patches KP have been stored in the memory table Tb1 of the nonvolatile memory 40, the CPU 101 of the image forming apparatus 100 generates a reference value linear conversion matrix for mutually converting the XYZ values, which are the reference colorimetric values, and the reference RGB values of the same patch number, and stores it in the nonvolatile memory 40. When only Lab values are stored as the reference colorimetric values in the memory table Tb1, the reference value linear conversion matrix may be generated after converting the Lab values into XYZ values using a known conversion formula.
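A minimal sketch of this matrix generation by least squares, assuming the reference RGB values and reference XYZ values are already paired in the memory table (all numeric values below are illustrative, and for compactness the "true" relation is simulated so that the fit can be checked):

```python
import numpy as np

# Hypothetical paired data from memory table Tb1: one row per reference patch.
ref_rgb = np.array([[10.,  20.,  30.],
                    [200., 40.,  10.],
                    [50.,  180., 60.],
                    [90.,  90.,  200.]])
true_matrix = np.array([[0.5, 0.2, 0.1],     # simulated device relation
                        [0.1, 0.8, 0.1],
                        [0.0, 0.1, 0.9]])
ref_xyz = ref_rgb @ true_matrix.T            # reference colorimetric values (XYZ)

# Least-squares fit of a 3x3 matrix M such that ref_xyz ≈ ref_rgb @ M.T;
# M is the reference value linear conversion matrix.
M, *_ = np.linalg.lstsq(ref_rgb, ref_xyz, rcond=None)
ref_value_matrix = M.T
```

With real calibration data the fit would not be exact; the residual returned by `lstsq` indicates how well a single linear matrix models the camera.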

  After the above preprocessing is completed, the image forming apparatus 100 drives the main scanning motor 8, the sub-scanning motor 12, and the recording head 6 under the control of the CPU 101, based on image data input from the outside, print settings, and the like, and forms an image on the recording medium P by ejecting ink from the recording head 6 while intermittently transporting the recording medium P in the sub-scanning direction and moving the carriage 5 in the main scanning direction. At this time, the amount of ink ejected from the recording head 6 may change depending on characteristics inherent to the device or changes over time. When the ink ejection amount changes, the formed color differs from the color of the image intended by the user, and the color reproducibility deteriorates. Therefore, the image forming apparatus 100 performs, at a predetermined timing, colorimetric processing for obtaining the colorimetric values of the patches 200 formed on the recording medium P for color adjustment. A device profile is generated or corrected based on the colorimetric values of the patches 200 obtained by the colorimetric processing, and color adjustment is performed based on the device profile, thereby improving the color reproducibility of the output image.

  FIG. 11 is a diagram for explaining the outline of the colorimetric processing. At the time of adjustment for performing color adjustment, the image forming apparatus 100 first ejects ink from the recording head 6 onto the recording medium P set on the platen 16 to form a test pattern in which the patches 200, which are the color objects to be measured, are arranged. Hereinafter, the recording medium P on which the test pattern is formed is referred to as an "adjustment sheet CS". The patches 200 formed on the adjustment sheet CS reflect the output characteristics of the image forming apparatus 100 at the time of adjustment, in particular the output characteristics of the recording head 6. Note that the image data for forming the test pattern is stored in advance in the nonvolatile memory 40 or the like.

  Next, as shown in FIG. 11, the image forming apparatus 100 sets the adjustment sheet CS on the platen 16, or holds the adjustment sheet CS on the platen 16 without discharging it after the adjustment sheet CS is created, and in this state the patch 200 on the adjustment sheet CS is illuminated by the illumination light source 30 of the colorimetric camera 20 and imaged by the two-dimensional image sensor 28. Then, from the image data output from the two-dimensional image sensor 28, the pixel data of the areas in which the filter illumination effective ranges are captured is selected by the area selection unit 37, and the pixel data is averaged by the averaging processing unit 38 to obtain the RGB values of the patch 200. Hereinafter, the RGB values of the patch 200, which is the color measurement object, are referred to as "colorimetry target RGB values".

  Next, the colorimetric calculation unit 39 performs basic colorimetry processing described later on the colorimetry target RGB values (step S20), thereby acquiring Lab values that are colorimetric values of the patch 200.

  FIGS. 12 and 13 are diagrams for explaining the basic colorimetric processing. The colorimetric calculation unit 39 first reads the reference value linear conversion matrix generated in the preprocessing and stored in the nonvolatile memory 40, converts the colorimetry target RGB values into first XYZ values using the reference value linear conversion matrix, and stores them in the nonvolatile memory 40 (step S21). FIG. 12 shows an example in which the colorimetry target RGB values (3, 200, 5) are converted into the first XYZ values (20, 80, 10) by the reference value linear conversion matrix.

  Next, the colorimetric calculation unit 39 converts the first XYZ values, obtained from the colorimetry target RGB values in step S21, into first Lab values using a known conversion formula, and stores them in the nonvolatile memory 40 (step S22). FIG. 12 shows an example in which the first XYZ values (20, 80, 10) are converted into the first Lab values (75, −60, 8) by the known conversion formula.
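The "known conversion formula" referred to here is the standard CIE XYZ-to-Lab conversion. A sketch follows, assuming a D50 reference white (the white point actually used by the device is not stated in the document):

```python
def xyz_to_lab(x, y, z, white=(96.42, 100.0, 82.49)):
    """CIE 1976 L*a*b* from XYZ, relative to a reference white (D50 here)."""
    def f(t):
        # Piecewise cube-root function of the CIE Lab definition.
        if t > (6.0 / 29.0) ** 3:
            return t ** (1.0 / 3.0)
        return t / (3.0 * (6.0 / 29.0) ** 2) + 4.0 / 29.0

    fx, fy, fz = (f(v / w) for v, w in zip((x, y, z), white))
    L = 116.0 * fy - 16.0
    a = 500.0 * (fx - fy)
    b = 200.0 * (fy - fz)
    return L, a, b
```

For the reference white itself the formula returns (100, 0, 0), which is a quick sanity check on the implementation.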

  Next, the colorimetric calculation unit 39 searches the plurality of reference colorimetric values (Lab values) stored in the memory table Tb1 of the nonvolatile memory 40 in the preprocessing, and selects from among them a set of patches (neighboring color patches) whose reference colorimetric values (Lab values) are close to the first Lab values in the Lab space (step S23). As a method of selecting patches at a short distance, for example, the distance from the first Lab values is calculated for all the reference colorimetric values (Lab values) stored in the memory table Tb1, and a plurality of patches having Lab values close to the first Lab values (the hatched Lab values in FIG. 12) are selected.
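The neighborhood search of step S23 can be sketched as a Euclidean distance sort in Lab space; the reference patch data below is hypothetical:

```python
import numpy as np

def select_neighbor_patches(first_lab, ref_labs, n=4):
    """Return indices of the n reference patches whose Lab values are
    closest (Euclidean distance in Lab space) to first_lab."""
    d = np.linalg.norm(np.asarray(ref_labs) - np.asarray(first_lab), axis=1)
    return list(np.argsort(d)[:n])

# Hypothetical reference colorimetric values from memory table Tb1.
ref_labs = [(75., -60., 8.),   # patch 0: identical to the first Lab value
            (10., 5., 5.),     # patch 1: far away
            (74., -58., 9.),   # patch 2: a close neighbor
            (50., 0., 0.)]     # patch 3: far away
nearest = select_neighbor_patches((75., -60., 8.), ref_labs, n=2)
```

The number of neighbors `n` is a free parameter here; the document does not specify how many neighboring color patches are selected.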

  Next, as shown in FIG. 13, the colorimetric calculation unit 39 refers to the memory table Tb1, extracts, for each of the neighboring color patches selected in step S23, the RGB values (reference RGB values) and the XYZ values paired with the Lab values, and selects a combination of RGB values and XYZ values from among them (step S24). Then, the colorimetric calculation unit 39 obtains, using the least squares method or the like, a selected RGB value linear conversion matrix for converting the RGB values of the selected combination (selected set) into XYZ values, and stores the selected RGB value linear conversion matrix in the nonvolatile memory 40 (step S25).

  Next, the colorimetric calculation unit 39 converts the colorimetry target RGB values (RsGsBs) into second XYZ values using the selected RGB value linear conversion matrix generated in step S25 (step S26). Further, the colorimetric calculation unit 39 converts the second XYZ values obtained in step S26 into second Lab values using the known conversion formula (step S27), and takes the obtained second Lab values as the final colorimetric values of the patch 200. The image forming apparatus 100 can improve the color reproducibility of the output image by generating or correcting a device profile based on the colorimetric values obtained by the above colorimetric processing and performing color adjustment based on the device profile.
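Steps S24 to S26 can be sketched end to end as follows. All numeric values are hypothetical; for compactness the XYZ values of the neighboring color patches are simulated from a known local relation so the result can be checked:

```python
import numpy as np

# Hypothetical neighboring-color-patch data extracted from memory table Tb1.
sel_rgb = np.array([[3., 198., 6.],
                    [5., 202., 4.],
                    [2., 195., 7.],
                    [4., 205., 5.]])
A = np.array([[0.1, 0.4, 0.2],    # simulated local RGB->XYZ relation
              [0.0, 0.4, 0.1],
              [0.1, 0.05, 0.8]])
sel_xyz = sel_rgb @ A.T

# Step S25: selected RGB value linear conversion matrix by least squares.
M, *_ = np.linalg.lstsq(sel_rgb, sel_xyz, rcond=None)
sel_matrix = M.T

# Step S26: second XYZ values from the colorimetry target RGB values.
target_rgb = np.array([3., 200., 5.])
second_xyz = sel_matrix @ target_rgb
# Step S27 would then convert second_xyz into the second Lab values using
# the known XYZ-to-Lab conversion formula.
```

Because the matrix is fitted only to patches near the target color, this local fit can be more accurate in that neighborhood than the single global reference value linear conversion matrix.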

  As described above in detail with reference to specific examples, the colorimetric camera 20 of the present embodiment includes the filter unit 31 at a position through which both the illumination light traveling from the illumination light source 30 toward the patch 200, which is the color object, and the reflected light reflected by the patch 200 toward the two-dimensional image sensor 28 pass. The filter unit 31 has three filter regions 31R, 31G, and 31B corresponding to the three colors of RGB, and these three filter regions have spectral transmittances that are the square roots of spectral transmittances having a linear conversion relationship with the color matching functions. Therefore, in the colorimetric camera 20 of the present embodiment, the light from the patch 200 illuminated by the illumination light source 30 enters the two-dimensional image sensor 28 as light having spectral characteristics in a linear conversion relationship with the color matching functions. Thus, by calculating the colorimetric values of the patch 200 based on the image data of the patch 200 output from the two-dimensional image sensor 28, the color of the patch 200 can be measured accurately in a wide color gamut matching the spectral sensitivity of the human eye.
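The role of the square-root spectral transmittance can be checked numerically: because the light passes through the filter twice (once as illumination, once as reflected light), the effective transmittance at each wavelength is the square of the filter transmittance, which recovers the target transmittance that has the linear conversion relationship with the color matching function. A minimal sketch with hypothetical transmittance samples:

```python
import numpy as np

# Hypothetical target spectral transmittance T(λ), sampled at a few
# wavelengths (the transmittance in a linear conversion relationship
# with a color matching function).
target_T = np.array([0.04, 0.25, 0.81, 0.49, 0.09])

# The filter region is fabricated with the square root of the target.
filter_T = np.sqrt(target_T)

# Light traverses the filter twice, so the effective transmittance is
# the product of the two passes, i.e. the square of filter_T.
effective_T = filter_T * filter_T
```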

  Further, the image forming apparatus 100 according to the present embodiment generates or corrects a device profile based on the colorimetric values of the patch 200 calculated using the colorimetric camera 20 at the time of adjustment, and performs color adjustment based on the device profile, thereby enabling image formation with high color reproducibility.

<Second Embodiment>
Next, the colorimetric camera 20 of the second embodiment will be described. Hereinafter, the colorimetric camera 20 of the second embodiment is referred to as a colorimetric camera 20A, and is distinguished from the colorimetric camera 20 of the first embodiment. Further, in the colorimetric camera 20A of the second embodiment, the same components as those of the colorimetric camera 20 of the first embodiment are denoted by the same reference numerals, and redundant description is omitted as appropriate. Note that the configuration of the image forming apparatus 100 is the same as that of the first embodiment, and a description thereof will be omitted.

  FIGS. 14-1 to 14-3 are diagrams illustrating an example of the mechanical configuration of the colorimetric camera 20A according to the second embodiment. FIG. 14-1 is a longitudinal sectional view of the colorimetric camera 20A (a cross-sectional view at the same position as FIG. 4-1), FIG. 14-2 is a longitudinal sectional view of the colorimetric camera 20A (a cross-sectional view at the same position as FIG. 4-2), and FIG. 14-3 is a plan view of the bottom surface 23a of the housing 23 of the colorimetric camera 20A viewed in the same direction as FIG. 4-4.

  In the colorimetric camera 20A of the second embodiment, a reference chart unit 41 having a large number of reference patches whose colorimetric values are known in advance is provided inside the housing 23. That is, in the colorimetric camera 20A of the second embodiment, as shown in FIGS. 14-1 and 14-3, the opening 25 is formed in approximately half of the entire area of the bottom surface 23a of the housing 23, and the reference chart unit 41 is arranged adjacent to the opening 25 in the main scanning direction. The reference chart unit 41 is imaged by the two-dimensional image sensor 28 of the sensor unit 27 together with the patch 200 when the patch 200 is colorimetrically measured. As described later, the RGB values of the reference patches of the reference chart unit 41 are used to correct variations in the RGB values of the patch 200 caused by changes in the light amount of the illumination light source 30 and changes in the sensitivity of the two-dimensional image sensor 28 over time.

  As illustrated in FIGS. 14-1 and 14-2, the opening 25 formed in the bottom surface 23a of the housing 23 is closed by an optical path length changing member 42. The optical path length changing member 42 is an optical member that makes the optical path length of the light emitted from the illumination light source 30, reflected by the patch 200, which is the color object to be measured, and incident on the two-dimensional image sensor 28 of the sensor unit 27 substantially equal to the optical path length of the light emitted from the illumination light source 30, reflected by the reference patches of the reference chart unit 41, and incident on the two-dimensional image sensor 28, so that the two-dimensional image sensor 28 can image both the patch 200 and the reference patches of the reference chart unit 41 in focus. As the reference chart unit 41 and the optical path length changing member 42, those disclosed in, for example, Patent Document 1 can be used.

  In the colorimetric camera 20A of the second embodiment, the filter unit 31 is formed, for example, by depositing a multilayer film on the optical path length changing member 42 that closes the opening 25. The filter unit 31 may also be formed separately from the optical path length changing member 42 and attached to it by bonding or the like. The filter unit 31 is formed on the optical path length changing member 42 so that, when the optical path length changing member 42 is mounted on the housing 23, the filter unit 31 is located at a position through which both the illumination light traveling from the illumination light source 30 toward the patch 200, which is the color object to be measured, and the reflected light reflected by the patch 200 toward the two-dimensional image sensor 28 of the sensor unit 27 pass. Note that the filter unit 31 only needs to be disposed at a position through which both the illumination light and the reflected light pass, and does not necessarily have to be formed on the optical path length changing member 42.

  In the colorimetric camera 20A of the second embodiment, it is assumed that a color image sensor is used as the two-dimensional image sensor 28. The filter unit 31 need not be provided between the reference chart unit 41 and the two-dimensional image sensor 28.

  The reference chart unit 41 is imaged by the two-dimensional image sensor 28 in the above-described initial processing. The RGB value of each reference patch of the reference chart unit 41 obtained by this imaging is stored in the memory table Tb1 of the nonvolatile memory 40 in association with the patch number. Hereinafter, the RGB value of each reference patch of the reference chart unit 41 acquired in this initial process is referred to as an initial reference RGB value.

  FIG. 15 is a control block diagram of the colorimetric camera 20A of the second embodiment. As shown in FIG. 15, the colorimetric camera 20A according to the second embodiment includes, in addition to the configuration of the colorimetric camera 20 according to the first embodiment, a patch area selection unit 45, an averaging processing unit 46, a correction matrix generation unit 47, and a correction calculation unit 49.

  The patch area selection unit 45 selects, for each reference patch, the pixel data of the patch area in which that reference patch of the reference chart unit 41 is captured, from the image data output from the two-dimensional image sensor 28 and stored in the frame memory 36. Since the position of each reference patch in the reference chart unit 41 is fixed with respect to the two-dimensional image sensor 28, the patch area selection unit 45 can select the pixel data at predetermined positions in the image data as the pixel data of the patch area of each reference patch.

  The averaging processing unit 46 averages the pixel data of each patch region selected by the patch region selecting unit 45 for each patch region, and acquires the RGB value of each reference patch. The averaging processor 46 outputs the RGB values of these reference patches to the correction matrix generator 47.

  The correction matrix generation unit 47 generates a correction matrix (correction data) for correcting the RGB values of the patch 200, which is the color object to be measured, using the initial reference RGB values stored in advance in the nonvolatile memory 40 and the RGB values of the reference patches input from the averaging processing unit 46. That is, the correction matrix generation unit 47 generates, using an estimation method such as the least squares method, a conversion matrix that converts the RGB values of the reference patches input from the averaging processing unit 46 into the initial reference RGB values read from the nonvolatile memory 40. The conversion matrix obtained by this processing is passed to the correction calculation unit 49 as the correction matrix for correcting the RGB values of the patch 200.
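A minimal sketch of this correction idea, assuming a least-squares fit over the reference-patch RGB values; all numeric values are illustrative, with the adjustment-time readings simulated as a uniform 10% drop in light amount so the correction can be checked:

```python
import numpy as np

# Hypothetical RGB values of the reference patches of the reference chart
# unit 41: at the initial processing, and at the time of adjustment after
# a simulated 10% drop in illumination light amount.
initial_ref_rgb = np.array([[100., 50., 20.],
                            [30., 160., 80.],
                            [200., 90., 40.],
                            [60., 60., 180.]])
current_ref_rgb = initial_ref_rgb * 0.9

# Correction matrix: converts the current reference RGB values back into
# the initial reference RGB values (3x3 least-squares fit).
M, *_ = np.linalg.lstsq(current_ref_rgb, initial_ref_rgb, rcond=None)
correction = M.T

# Applying it to a patch-200 measurement undoes the light-amount change.
measured = np.array([90., 45., 18.])       # 0.9 * (100, 50, 20)
corrected = correction @ measured
```

Because the reference chart is imaged in the same frame as the patch 200, the same drift affects both, which is what makes this correction valid.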

  The correction calculation unit 49 corrects the RGB values of the patch 200 input from the averaging processing unit 38 using the correction matrix generated by the correction matrix generation unit 47. As described above, this correction matrix is a conversion matrix that converts the RGB values of the reference patches of the reference chart unit 41 at the time of colorimetry (adjustment) of the patch 200 into the initial reference RGB values measured in the initial processing. Therefore, by correcting the RGB values of the patch 200 using this correction matrix, the influence of changes in the light amount of the illumination light source 30 and changes in the sensitivity of the two-dimensional image sensor 28 over time, from the initial processing to the adjustment, can be mitigated.

  In the colorimetric camera 20A of the second embodiment, the colorimetric calculation unit 39 calculates the colorimetric value of the patch 200 from the RGB value of the patch 200 corrected by the correction calculation unit 49. Therefore, it is possible to reduce the influence of the light amount change of the illumination light source 30 and the sensitivity change of the two-dimensional image sensor 28 over time, and to calculate a highly accurate colorimetric value.

  As described above in detail, the colorimetric camera 20A of the second embodiment includes the filter unit 31 like the colorimetric camera 20 of the first embodiment, and obtains the RGB values of the patch 200 from light that passes through the filter unit 31 twice before entering the two-dimensional image sensor 28. Further, the colorimetric camera 20A of the second embodiment generates a correction matrix using the RGB values of the reference patches of the reference chart unit 41 imaged by the two-dimensional image sensor 28 together with the patch 200, and calculates the colorimetric values of the patch 200 from the RGB values of the patch 200 corrected using this correction matrix. Accordingly, like the colorimetric camera 20 of the first embodiment, the colorimetric camera 20A of the second embodiment can accurately measure the color of the patch 200 in a wide color gamut matching the spectral sensitivity of the human eye, and can additionally perform highly accurate colorimetry by reducing the influence of changes in the light amount of the illumination light source 30 and changes in the sensitivity of the two-dimensional image sensor 28 over time.

<Third Embodiment>
Next, the colorimetric camera 20 of the third embodiment will be described. Hereinafter, the colorimetric camera 20 of the third embodiment is referred to as a colorimetric camera 20B, and is distinguished from the colorimetric cameras 20 and 20A of the other embodiments. Further, in the colorimetric camera 20B of the third embodiment, the same reference numerals are given to the same components as those of the colorimetric cameras 20 and 20A of the other embodiments, and a duplicate description will be omitted as appropriate. Note that the configuration of the image forming apparatus 100 is the same as that of the first embodiment, and a description thereof will be omitted.

  FIG. 16 is a plan view of the bottom surface 23a of the housing 23 of the colorimetric camera 20B according to the third embodiment when viewed in the same direction as FIG. 4-4. FIG. 17 is a perspective view of the dust-proof glass 26 on which the filter unit 50 is formed.

  A colorimetric camera 20B according to the third embodiment includes a filter unit 50 instead of the filter unit 31 described in the first embodiment. As shown in FIGS. 16 and 17, the filter unit 50 has four filter regions, obtained by adding a filter region 50K corresponding to metallic black to the three filter regions 50R, 50G, and 50B corresponding to the three colors of RGB. Between the four filter regions 50R, 50G, 50B, and 50K, a mask portion 32 similar to that of the filter unit 31 of the first embodiment is provided in order to prevent illumination light from the illumination light source 30 from overlapping between the filter regions.

  In the filter unit 50, the three filter regions 50R, 50G, and 50B corresponding to the three colors of RGB have, as in the filter unit 31 of the first embodiment, spectral transmittances that are the square roots of spectral transmittances having a linear transformation relationship with the color matching functions. The filter region 50K corresponding to metallic black has a spectral transmittance that is the square root of a spectral transmittance for extracting metallic black.

  FIG. 18 is a diagram illustrating the spectral transmittance of the filter region 50K of the filter unit 50. The broken-line graph in the figure indicates the spectral transmittance for extracting metallic black; a filter having this transmittance is disclosed in, for example, Japanese Patent Application Laid-Open No. 2010-122080. The solid-line graph in the figure shows the spectral transmittance that is the square root of the spectral transmittance for extracting metallic black, that is, the transmittance given to the filter region 50K. Note that the vertical axis of the figure is a relative value, and the peak value need not be 1.

  The colorimetric camera 20B of the third embodiment thus includes the filter unit 50 having the three filter regions 50R, 50G, and 50B, whose spectral transmittances are the square roots of spectral transmittances having a linear transformation relationship with the color matching functions, and the filter region 50K, whose spectral transmittance is, as shown by the solid-line graph in FIG. 18, the square root of the spectral transmittance for extracting metallic black. Similar to the filter unit 31 of the first embodiment, the filter unit 50 is formed on, for example, the dust-proof glass 26 that covers the opening 25 of the housing 23, and is disposed at a position where both the illumination light traveling from the illumination light source 30 toward the patch 200, which is the color object to be measured, and the reflected light that is reflected by the patch 200 and travels toward the two-dimensional image sensor 28 of the sensor unit 27 pass.

  In the colorimetric camera 20B of the third embodiment, the light that has passed through the filter unit 50 having the four filter regions 50R, 50G, 50B, and 50K twice enters the two-dimensional image sensor 28, yielding four types of signal values. Based on these signal values, the spectral reflectance of the patch 200, which is the color object to be measured, is calculated using a known Wiener estimation method or the like. Then, the colorimetric value of the patch 200 is calculated from the spectral reflectance of the patch 200.

  FIG. 19 is a control block diagram of the colorimetric camera 20B of the third embodiment. As shown in FIG. 19, the colorimetric camera 20B of the third embodiment further includes a spectral estimation calculation unit 51 in addition to the configuration of the colorimetric camera 20 of the first embodiment. The colorimetric camera 20B of the third embodiment includes a colorimetric calculation unit 52 instead of the colorimetry calculation unit 39 provided in the colorimetric camera 20 of the first embodiment.

  In the colorimetric camera 20B of the third embodiment, the region selection unit 37 selects, from the image data stored in the frame memory 36, not only the pixel data of the regions showing the filter illumination effective ranges corresponding to the filter regions 50R, 50G, and 50B of the filter unit 50, but also the pixel data of the region showing the filter illumination effective range corresponding to the filter region 50K. Then, the averaging processing unit 38 averages the pixel data of the four regions selected by the region selection unit 37 to obtain four types of signal values consisting of the RGB values and the K value of the patch 200.

  The spectral estimation calculation unit 51 estimates the spectral reflectance of the reflected light incident on the two-dimensional image sensor 28 using the Wiener estimation method or the like, based on the four types of signal values of the patch 200 obtained by the processing of the averaging processing unit 38.

  FIG. 20 is a diagram for explaining the processing by the spectral estimation calculation unit 51, and shows an example of a calculation result of estimating the spectrum of the Japan Color chart No. 2. The broken-line graph in FIG. 20A is the spectral distribution of the LED used as the illumination light source 30; in the present embodiment, a high color rendering LED is used as the illumination light source 30. The dashed-dotted-line graph in FIG. 20A shows the spectral reflectance of the Japan Color chart No. 2 imaged as a subject by the two-dimensional image sensor 28, and the solid-line graph in FIG. 20A shows the spectral distribution of the reflected light calculated by multiplying the broken-line graph by the dashed-dotted-line graph.

  When light having the spectral distribution shown by the solid-line graph in FIG. 20A passes through the four filter regions 50R, 50G, 50B, and 50K of the filter unit 50 twice, spectral characteristics as shown in FIG. 20B are obtained. R, G, B, and K in the figure correspond to the four filter regions 50R, 50G, 50B, and 50K of the filter unit 50, respectively, and integrating each of them from the image data output from the two-dimensional image sensor 28 yields four signal values. In this example, [0.155 0.072 0.203 0.303] is obtained as the integral values.

  FIG. 20C shows the result of calculating the spectral reflectance by the Wiener estimation method using the above integral values and a covariance matrix obtained from the spectral reflectances of all the patches of Japan Color and the emission spectrum of the light source. In the figure, the solid-line graph is the estimated spectral reflectance, the broken-line graph is the true value, and the dashed-dotted-line graph is the error between the estimated value and the true value. From the result shown in FIG. 20C, it can be seen that the spectral reflectance is accurately estimated by the processing of the spectral estimation calculation unit 51. Further, the color difference between the colorimetric value obtained from the true spectral reflectance and the colorimetric value obtained from the estimated spectral reflectance is ΔE = 0.47, which is a good value as a colorimetric result.
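The Wiener estimation described above reduces to a few lines of linear algebra: an estimation matrix is built from the a priori statistics of a reflectance ensemble and the system's spectral responsivities, and the full spectrum is recovered from the four channel values. The following Python/NumPy fragment is an illustrative sketch only; the wavelength sampling, the random stand-ins for the training reflectances (in practice, e.g. the Japan Color patches) and for the system matrix are hypothetical, not values from the embodiment.

```python
import numpy as np

# Illustrative sketch of Wiener estimation (hypothetical data throughout):
# 36 wavelength samples (e.g. 380-730 nm at 10 nm pitch), 4 channels (R, G, B, K).
n_wl, n_ch = 36, 4
rng = np.random.default_rng(0)

# Stand-in for the training set of spectral reflectances used to build the
# a priori statistics (in practice, e.g. all Japan Color patches).
train = rng.uniform(0.0, 1.0, size=(200, n_wl))

# Stand-in system matrix: each row is one channel's effective spectral
# responsivity (illuminant spectrum x squared filter transmittance x sensor
# sensitivity, since the light passes the filter twice).
S = rng.uniform(0.0, 1.0, size=(n_ch, n_wl))

# A priori autocorrelation matrix of the reflectance ensemble.
R_ss = train.T @ train / len(train)

# Wiener estimation matrix: W = R_ss S^T (S R_ss S^T)^-1
W = R_ss @ S.T @ np.linalg.inv(S @ R_ss @ S.T)

# Forward model: four integrated signal values, then the spectral estimate.
r_true = train[0]
v = S @ r_true          # the four signal values (R, G, B, K)
r_est = W @ v           # estimated 36-sample spectral reflectance
```

By construction, re-projecting the estimate through the system matrix reproduces the measured channel values exactly, which is a useful sanity check on any implementation.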

  The colorimetric calculation unit 52 obtains the spectral reflectance of the patch 200, which is the color object to be measured, by dividing the estimated spectral reflectance of the reflected light obtained by the processing of the spectral estimation calculation unit 51 by the spectral distribution (relative spectral distribution) of the illumination light source 30 stored in advance in the nonvolatile memory 40. Further, the colorimetric calculation unit 52 calculates, by a known method, the colorimetric value of the patch 200 when illuminated by a specific light source (for example, a D50 light source) from the spectral reflectance of the patch 200. The spectral reflectance and the colorimetric value of the patch 200 calculated by the colorimetric calculation unit 52 are sent to the CPU 101.
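The two steps performed by the colorimetric calculation unit 52, dividing the reflected-light spectrum by the stored illuminant spectrum and then evaluating the colorimetric value under a specific light source, can be sketched as below. All curves here are hypothetical stand-ins; an actual implementation would use the measured LED spectrum, the CIE 1931 color matching functions, and the tabulated D50 spectral power distribution.

```python
import numpy as np

# Hypothetical wavelength grid: 380-730 nm in 10 nm steps (36 samples).
wl = np.arange(380.0, 740.0, 10.0)

def patch_colorimetry(reflected, illum, d50, xbar, ybar, zbar):
    """Divide the reflected-light spectrum by the illuminant to obtain the patch
    reflectance, then integrate against D50 and the color matching functions."""
    r = reflected / illum                 # spectral reflectance of the patch
    k = 100.0 / np.sum(d50 * ybar)        # normalize so a perfect white gives Y = 100
    X = k * np.sum(d50 * r * xbar)
    Y = k * np.sum(d50 * r * ybar)
    Z = k * np.sum(d50 * r * zbar)
    return X, Y, Z

# Illustrative stand-in curves (all hypothetical shapes, not measured data).
illum = 0.5 + 0.5 * np.exp(-(((wl - 550.0) / 80.0) ** 2))
d50   = 0.8 + 0.2 * np.sin(wl / 60.0)
xbar  = np.exp(-(((wl - 600.0) / 40.0) ** 2))
ybar  = np.exp(-(((wl - 555.0) / 45.0) ** 2))
zbar  = np.exp(-(((wl - 450.0) / 30.0) ** 2))

# Sanity case: if the reflected light equals the illuminant (a perfect white
# patch), the reflectance is 1 everywhere and Y must come out as 100.
X, Y, Z = patch_colorimetry(illum, illum, d50, xbar, ybar, zbar)
```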

  As described above, the colorimetric camera 20B of the third embodiment includes the filter unit 50 having the four filter regions 50R, 50G, 50B, and 50K, and obtains the four types of signal values, the RGB values and the K value of the patch 200, from the reflected light that passes through the filter unit 50 twice and enters the two-dimensional image sensor 28. The spectral reflectance of the reflected light is then estimated using these four types of signal values, and the spectral reflectance of the patch 200 illuminated by the illumination light source 30 is obtained from this estimated value and the spectral distribution of the illumination light source 30 stored in advance in the nonvolatile memory 40. Further, the colorimetric value of the patch 200 is calculated from the spectral reflectance of the patch 200. Therefore, the colorimetric camera 20B of the present embodiment can, like the colorimetric camera 20 of the first embodiment, accurately measure the color of the patch 200 in a wide color gamut that matches the spectral sensitivity of the human eye, and can also obtain the spectral reflectance of the patch 200.

<Fourth embodiment>
Next, the colorimetric camera 20 of the fourth embodiment will be described. Hereinafter, the colorimetric camera 20 of the fourth embodiment is referred to as a colorimetric camera 20C, and is distinguished from the colorimetric cameras 20, 20A, 20B of the other embodiments. Further, in the colorimetric camera 20C of the fourth embodiment, the same reference numerals are given to the same components as those of the colorimetric cameras 20, 20A, and 20B of the other embodiments, and a duplicate description will be omitted as appropriate. Note that the configuration of the image forming apparatus 100 is the same as that of the first embodiment, and a description thereof will be omitted.

  A colorimetric camera 20C according to the fourth embodiment includes a reference chart unit 41 inside the housing 23, similarly to the colorimetric camera 20A according to the second embodiment. Further, the colorimetric camera 20C of the fourth embodiment includes a filter unit 50 having the four filter regions 50R, 50G, 50B, and 50K, similarly to the colorimetric camera 20B of the third embodiment. The filter unit 50 is disposed at a position where both the illumination light directed from the illumination light source 30 toward the patch 200, which is the color object to be measured, and the reflected light reflected by the patch 200 toward the two-dimensional image sensor 28 of the sensor unit 27 pass.

  FIG. 21 is a plan view of the bottom surface 23a of the housing 23 of the colorimetric camera 20C according to the fourth embodiment when viewed in the same direction as FIG. 4-4. In the colorimetric camera 20C of the fourth embodiment, as shown in FIG. 21, the reference chart unit 41 is disposed on the bottom surface 23a of the housing 23 so as to be adjacent to the opening 25. In addition, the filter unit 50 having the four filter regions 50R, 50G, 50B, and 50K is formed on the optical path length changing member 42 covering the opening 25.

  FIG. 22 is a control block diagram of the colorimetric camera 20C of the fourth embodiment. As shown in FIG. 22, the colorimetric camera 20C of the fourth embodiment includes, in addition to the configuration of the colorimetric camera 20 of the first embodiment, a patch region selection unit 45, an averaging processing unit 46, a correction matrix generation unit 47, and a correction calculation unit 49 similar to those of the colorimetric camera 20A of the second embodiment. In addition, the colorimetric camera 20C of the present embodiment includes a spectral estimation calculation unit 51 and a colorimetric calculation unit 52 similar to those of the colorimetric camera 20B of the third embodiment.

  In the colorimetric camera 20C of the fourth embodiment, as with the colorimetric camera 20A of the second embodiment, the correction matrix generation unit 47 generates, during colorimetry of the patch 200, a correction matrix for reducing the influence of the change over time in the light amount of the illumination light source 30 and in the sensitivity of the two-dimensional image sensor 28. The correction calculation unit 49 then corrects the signal values obtained by the processing of the averaging processing unit 38 using this correction matrix. In the colorimetric camera 20C of this embodiment, the signal values to be corrected by the correction calculation unit 49 are the four types of signal values consisting of the RGB values and the K value of the patch 200, as in the colorimetric camera 20B of the third embodiment.
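As a hedged illustration of this correction step, the fragment below builds a least-squares correction matrix from current versus initial reference-patch signal values and applies it to a freshly measured patch signal. The 24-patch data, the simulated drift, and the least-squares formulation are assumptions for illustration; the embodiment does not specify the exact computation of the correction matrix.

```python
import numpy as np

# Hypothetical current vs. initial 4-channel (R, G, B, K) signal values for 24
# reference patches; the drift matrix simulates a change in lamp/sensor response.
rng = np.random.default_rng(2)
initial = rng.uniform(0.1, 0.9, size=(24, 4))   # stored at initial calibration
drift = np.diag([0.95, 1.02, 0.97, 1.01])       # simulated change over time
current = initial @ drift                        # what the camera measures now

# Least-squares correction matrix M such that current @ M ~ initial.
M, _, _, _ = np.linalg.lstsq(current, initial, rcond=None)

# Applying M to a freshly measured patch signal undoes the simulated drift.
patch_true = np.array([0.4, 0.3, 0.5, 0.2])
patch_measured = patch_true @ drift
patch_corrected = patch_measured @ M
```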

  Thereafter, similarly to the colorimetric camera 20B according to the third embodiment, the colorimetric camera 20C according to the fourth embodiment estimates, by the spectral estimation calculation unit 51 using the corrected four types of signal values, the spectral reflectance of the reflected light of the patch 200 incident on the two-dimensional image sensor 28, and the colorimetric calculation unit 52 calculates the spectral reflectance and the colorimetric value of the patch 200 based on this estimated value and the spectral distribution (relative spectral distribution) of the illumination light source 30 stored in advance in the nonvolatile memory 40.

  As described above, the colorimetric camera 20C according to the fourth embodiment includes the same filter unit 50 as the colorimetric camera 20B according to the third embodiment, obtains the four types of signal values of the patch 200 from the reflected light that passes through the filter unit 50 twice and enters the two-dimensional image sensor 28, and calculates the spectral reflectance and the colorimetric value of the patch 200 using these four types of signal values. Therefore, the colorimetric camera 20C of the fourth embodiment can, like the colorimetric camera 20B of the third embodiment, accurately measure the color of the patch 200 in a wide color gamut that matches the spectral sensitivity of the human eye, and can also obtain the spectral reflectance of the patch 200.

  In addition, the colorimetric camera 20C of the fourth embodiment generates a correction matrix using the RGB values of each reference patch of the reference chart unit 41 as in the colorimetric camera 20A of the second embodiment, and calculates the spectral reflectance and the colorimetric value of the patch 200 from the signal values corrected using this correction matrix. Therefore, the colorimetric camera 20C of the fourth embodiment can, like the colorimetric camera 20A of the second embodiment, perform highly accurate color measurement by reducing the influence of the change over time in the light amount of the illumination light source 30 and in the sensitivity of the two-dimensional image sensor 28.

<Fifth Embodiment>
Next, the colorimetric camera 20 of the fifth embodiment will be described. Hereinafter, the colorimetric camera 20 of the fifth embodiment is referred to as a colorimetric camera 20D, and is distinguished from the colorimetric cameras 20, 20A, 20B, and 20C of other embodiments. Further, in the colorimetric camera 20D of the fifth embodiment, the same reference numerals are given to the same components as those of the colorimetric cameras 20, 20A, 20B, and 20C of the other embodiments, and the repeated description is omitted as appropriate. Note that the configuration of the image forming apparatus 100 is the same as that of the first embodiment, and a description thereof will be omitted.

  The mechanical configuration of the colorimetric camera 20D of the fifth embodiment is the same as that of the colorimetric camera 20C of the fourth embodiment. However, whereas the colorimetric camera 20C of the fourth embodiment corrects the signal values of the patch 200 using a correction matrix generated based on the RGB values of each reference patch of the reference chart unit 41 and the initial reference RGB values stored in advance in the nonvolatile memory 40, the colorimetric camera 20D of the fifth embodiment corrects the light quantity of the illumination light source 30.

  FIG. 23 is a control block diagram of the colorimetric camera 20D of the fifth embodiment. As shown in FIG. 23, the colorimetric camera 20D according to the fifth embodiment includes a light source correction amount calculation unit 53 instead of the correction matrix generation unit 47 and the correction calculation unit 49 included in the colorimetric camera 20C according to the fourth embodiment. The other configurations are the same as those of the colorimetric camera 20C of the fourth embodiment.

  The light source correction amount calculation unit 53 calculates, based on the difference between the RGB values of each reference patch of the reference chart unit 41 input from the averaging processing unit 46 and the initial reference RGB values stored in advance in the nonvolatile memory 40, a correction amount of the light emission intensity (light quantity of illumination light) of the illumination light source 30 for canceling the light quantity fluctuation of the illumination light source 30 over time. That is, the light source correction amount calculation unit 53 calculates the correction amount of the light emission intensity of the illumination light source 30 so that the patch 200 is illuminated by the corrected illumination light source 30 in the same manner as in the initial state. The correction amount calculated by the light source correction amount calculation unit 53 is fed back to the light source control unit 34, and the light emission intensity of the illumination light source 30 is corrected.
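A minimal sketch of this feedback computation might look as follows, assuming the emission intensity scales all channels together; the RGB values and the single-gain model are hypothetical stand-ins, since the embodiment does not specify the exact form of the correction amount.

```python
# Hypothetical sketch: derive a single emission-intensity gain that cancels the
# drop in the reference-patch RGB values relative to the stored initial values.
initial_rgb = {"R": 200.0, "G": 190.0, "B": 180.0}  # initial reference RGB values
current_rgb = {"R": 180.0, "G": 171.0, "B": 162.0}  # current averaged RGB values

# Assume the emission intensity scales all channels equally; average the ratios.
ratios = [initial_rgb[c] / current_rgb[c] for c in "RGB"]
gain = sum(ratios) / len(ratios)

# The gain would be fed back to the light source control unit, e.g. by scaling
# the LED drive current: new_drive = gain * old_drive.
```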

  In the colorimetric camera 20D of the fifth embodiment, the four types of signal values consisting of the RGB values and the K value of the patch 200 are obtained by the processing of the region selection unit 37 and the averaging processing unit 38 from the image data captured under illumination by the illumination light source 30 whose emission intensity has been corrected. Then, similarly to the colorimetric camera 20B according to the third embodiment, the colorimetric camera 20D according to the fifth embodiment estimates, by the spectral estimation calculation unit 51 using the four types of signal values, the spectral reflectance of the reflected light of the patch 200 incident on the two-dimensional image sensor 28, and the colorimetric calculation unit 52 calculates the spectral reflectance and the colorimetric value of the patch 200 based on this estimated value and the spectral distribution (relative spectral distribution) of the illumination light source 30 stored in advance in the nonvolatile memory 40.

  As described above, the colorimetric camera 20D according to the fifth embodiment includes the same filter unit 50 as the colorimetric camera 20B according to the third embodiment, obtains the four types of signal values of the patch 200 from the reflected light that passes through the filter unit 50 twice and enters the two-dimensional image sensor 28, and calculates the spectral reflectance and the colorimetric value of the patch 200 using these four types of signal values. Therefore, the colorimetric camera 20D of the fifth embodiment can, like the colorimetric camera 20B of the third embodiment, accurately measure the color of the patch 200 in a wide color gamut that matches the spectral sensitivity of the human eye, and can also obtain the spectral reflectance of the patch 200.

  In addition, the colorimetric camera 20D of the fifth embodiment corrects the light emission intensity of the illumination light source 30 based on the difference between the RGB values of each reference patch of the reference chart unit 41 and the initial reference RGB values stored in advance in the nonvolatile memory 40. Therefore, the colorimetric camera 20D of the fifth embodiment can, like the colorimetric camera 20A of the second embodiment and the colorimetric camera 20C of the fourth embodiment, perform highly accurate color measurement by mitigating the influence of the change in the light amount of the illumination light source 30 over time.

<Sixth Embodiment>
Next, the colorimetric camera 20 of the sixth embodiment will be described. Hereinafter, the colorimetric camera 20 of the sixth embodiment is referred to as a colorimetric camera 20E, and is distinguished from the colorimetric cameras 20, 20A, 20B, 20C, and 20D of the other embodiments. Further, in the colorimetric camera 20E of the sixth embodiment, the same reference numerals are given to the same components as those of the colorimetric cameras 20, 20A, 20B, 20C, and 20D of the other embodiments, and a duplicate description is omitted as appropriate. Note that the configuration of the image forming apparatus 100 is the same as that of the first embodiment, and a description thereof will be omitted.

  FIG. 24-1 is a longitudinal sectional view of the colorimetric camera 20E of the sixth embodiment (a sectional view at the same position as FIG. 4-1), and FIG. 24-2 is a plan view of the bottom surface of the housing 23 of the colorimetric camera 20E of the sixth embodiment when viewed in the same direction as FIG. 4-4.

  The colorimetric camera 20E of the sixth embodiment includes a reference chart unit 55 inside the housing 23, as shown in FIGS. 24-1 and 24-2. In addition to having reference patches in the same manner as the reference chart unit 41 of the second embodiment, the reference chart unit 55 has a white reference, which is a reference white region. The spectral reflectance of the white reference of the reference chart unit 55 is stored in the nonvolatile memory 40 in advance.

  In addition, the colorimetric camera 20E of the sixth embodiment includes two filter units 50a and 50b. These filter units 50a and 50b have the same configuration as the filter unit 50 of the third embodiment. Similarly to the filter unit 50 of the third embodiment, the filter unit 50a is formed on, for example, the optical path length changing member 42, and is disposed at a position where both the illumination light directed from the illumination light source 30 toward the patch 200, which is the color object to be measured, and the reflected light reflected by the patch 200 toward the two-dimensional image sensor 28 of the sensor unit 27 pass. The filter unit 50b is disposed at a position where both the illumination light directed from the illumination light source 30 toward the white reference of the reference chart unit 55 and the reflected light reflected by the white reference toward the two-dimensional image sensor 28 of the sensor unit 27 pass.

  Specifically, in the colorimetric camera 20E of the sixth embodiment, a support glass 56 having high transparency is disposed at a position facing the region of the reference chart unit 55 where the white reference is formed. The filter unit 50b is formed by depositing a multilayer film on the support glass 56. Alternatively, the filter unit 50b may be formed separately from the support glass 56 and attached to it by adhesion or the like. The filter unit 50b is formed on the support glass 56 such that, when the support glass 56 is attached to the housing 23, it is disposed at a position where both the illumination light directed from the illumination light source 30 toward the white reference of the reference chart unit 55 and the reflected light reflected by the white reference toward the two-dimensional image sensor 28 of the sensor unit 27 pass. Note that the filter unit 50b does not necessarily have to be formed on the support glass 56, as long as it is disposed at such a position.

  FIG. 25 is a control block diagram of the colorimetric camera 20E of the sixth embodiment. As shown in FIG. 25, the colorimetric camera 20E according to the sixth embodiment includes a white reference region selection unit 60, an averaging processing unit 61, a spectral estimation calculation unit 62, and a light source spectral estimation calculation unit 63 in addition to the configuration of the colorimetric camera 20C according to the fourth embodiment.

  The white reference region selection unit 60 selects, from the image data output from the two-dimensional image sensor 28 through the filter unit 50b and stored in the frame memory 36, the pixel data of the region showing the white reference of the reference chart unit 55, for each of the four filter regions of the filter unit 50b. Since the position of the white reference corresponding to each filter region of the filter unit 50b is fixed with respect to the two-dimensional image sensor 28, the white reference region selection unit 60 need only select pixel data at predetermined positions in the image data.

  The averaging processing unit 61 averages the white reference pixel data for each of the four filter regions selected by the white reference region selection unit 60 to obtain four types of signal values consisting of the RGB values and the K value of the white reference.

  Based on the four types of white reference signal values of the reference chart unit 55 obtained by the processing of the averaging processing unit 61, the spectral estimation calculation unit 62 estimates, using the Wiener estimation method or the like, the spectral reflectance of the reflected light that is reflected by the white reference and enters the two-dimensional image sensor 28.

  The light source spectral estimation calculation unit 63 obtains the spectral distribution of the illumination light source 30 by dividing the estimated spectral reflectance of the reflected light obtained by the processing of the spectral estimation calculation unit 62 by the spectral reflectance of the white reference stored in advance in the nonvolatile memory 40. The spectral distribution of the illumination light source 30 obtained by the light source spectral estimation calculation unit 63 is sent to the colorimetric calculation unit 52.
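The division performed by the light source spectral estimation calculation unit 63 can be sketched as follows. The flat 0.95 white-reference reflectance and the linear illuminant spectrum are hypothetical stand-ins chosen only to illustrate the recovery.

```python
import numpy as np

# Hypothetical stand-ins: a nearly flat white reference and an unknown
# illuminant spectrum over 36 wavelength samples.
white_ref = np.full(36, 0.95)            # stored spectral reflectance of the white reference
illum_true = np.linspace(0.4, 1.0, 36)   # actual spectrum of illumination light source 30

# Spectrum that the spectral estimation calculation unit 62 would recover from
# the light reflected by the white reference.
reflected_est = illum_true * white_ref

# Light source spectral estimation: divide out the known white-reference
# reflectance to recover the illuminant's spectral distribution.
illum_est = reflected_est / white_ref
```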

  In the colorimetric camera 20E of the sixth embodiment, the processing by the region selection unit 37, the averaging processing unit 38, the patch region selection unit 45, the averaging processing unit 46, the correction matrix generation unit 47, the correction calculation unit 49, the spectral estimation calculation unit 51, and the colorimetric calculation unit 52 is basically the same as that of the colorimetric camera 20C of the fourth embodiment. However, in the colorimetric camera 20C of the fourth embodiment, the colorimetric calculation unit 52 calculates the spectral reflectance and the colorimetric value of the patch 200 based on the estimated spectral reflectance obtained by the processing of the spectral estimation calculation unit 51 and the spectral distribution of the illumination light source 30 stored in advance in the nonvolatile memory 40. In contrast, in the colorimetric camera 20E of the sixth embodiment, the colorimetric calculation unit 52 calculates the spectral reflectance and the colorimetric value of the patch 200 based on the estimated spectral reflectance obtained by the processing of the spectral estimation calculation unit 51 and the spectral distribution of the illumination light source 30 obtained by the processing of the light source spectral estimation calculation unit 63.

  As described above, the colorimetric camera 20E of the sixth embodiment estimates the spectral distribution of the illumination light source 30 in real time and calculates the spectral reflectance and colorimetric value of the patch 200 using the estimated values. Therefore, the spectral reflectance and the colorimetric value of the patch 200 can be calculated with high accuracy regardless of the spectral fluctuation of the illumination light source 30.

<Seventh embodiment>
Next, the colorimetric camera 20 of the seventh embodiment will be described. Hereinafter, the colorimetric camera 20 of the seventh embodiment is referred to as a colorimetric camera 20F, and is distinguished from the colorimetric cameras 20, 20A, 20B, 20C, 20D, and 20E of other embodiments. Further, in the colorimetric camera 20F of the seventh embodiment, the same components as those of the colorimetric cameras 20, 20A, 20B, 20C, 20D, and 20E of the other embodiments are denoted by the same reference numerals, and redundant description is given. Omitted where appropriate. Note that the configuration of the image forming apparatus 100 is the same as that of the first embodiment, and a description thereof will be omitted.

  FIG. 26A is a longitudinal sectional view of the colorimetric camera 20F according to the seventh embodiment (a sectional view at the same position as FIG. 4-1), and FIG. 26B is a plan view of the bottom surface of the housing of the colorimetric camera 20F according to the seventh embodiment when viewed in the same direction as FIG. 4-4.

  The mechanical configuration of the colorimetric camera 20F of the seventh embodiment is basically the same as that of the colorimetric camera 20E of the sixth embodiment. However, in the colorimetric camera 20F of the seventh embodiment, as shown in FIGS. 26A and 26B, an external light window 65 is formed on the side surface portion of the frame body 21 constituting the housing 23. Then, when estimating the spectral distribution of the external light, the external light window 65 is opened so that the external light is taken into the housing 23.

  In the colorimetric camera 20F of the seventh embodiment, the external light window 65 is blocked by the shutter member 66 during colorimetry of the patch 200 so that external light does not enter the housing 23. The shutter member 66 is driven by an external light window opening/closing mechanism described later, and moves to either a position where the external light window 65 is opened or a position where it is closed.

  FIG. 27 is a control block diagram of the colorimetric camera 20F of the seventh embodiment. As shown in FIG. 27, the colorimetric camera 20F according to the seventh embodiment further includes an external light window opening/closing mechanism 67 and an external light window opening/closing control unit 68 in addition to the configuration of the colorimetric camera 20E according to the sixth embodiment.

  As described above, the external light window opening/closing mechanism 67 is a mechanism for driving the shutter member 66 that opens and closes the external light window 65.

  When estimating the spectral distribution of external light, the external light window opening/closing control unit 68 outputs a control signal for turning off the illumination light source 30 to the light source control unit 34, and outputs a control signal for moving the shutter member 66 to the position where the external light window 65 is opened to the external light window opening/closing mechanism 67.

  In the colorimetric camera 20F of the seventh embodiment, when estimating the spectral distribution of external light, the illumination light source 30 is turned off and external light is taken into the housing 23 under the control of the external light window opening/closing control unit 68, so that the inside of the housing 23 is illuminated only by external light. In this state, the image data output from the two-dimensional image sensor 28 is stored in the frame memory 36, and the white reference region selection unit 60, the averaging processing unit 61, the spectral estimation calculation unit 62, and the light source spectral estimation calculation unit 63 perform the same processing as in the colorimetric camera 20E of the sixth embodiment, whereby the spectral distribution of external light is obtained. The spectral distribution of external light obtained by the colorimetric camera 20F of the seventh embodiment is output to, for example, the CPU 101 or an external device.

  As described above, the colorimetric camera 20F of the seventh embodiment can not only calculate the colorimetric values and spectral reflectance of the patch 200 with high accuracy over a wide color gamut matched to the spectral sensitivity of the human eye, but can also obtain the spectral distribution of external light. The spectral distribution of external light can be used, for example, by an application that simulates on a computer the difference in color appearance caused by the difference between the external light conditions at the time of printing by the image forming apparatus 100 and the external light conditions in the environment where the printed material is actually used.
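  This appearance-simulation use case amounts to a tristimulus integration of the same spectral reflectance under two illuminants. The sketch below uses crude Gaussian stand-ins for the color matching functions and simple synthetic illuminants (none of these curves are CIE data or values from the patent); it only illustrates that the two lighting conditions yield different tristimulus values for the same patch.

```python
import numpy as np

wl = np.arange(400, 701, 10)  # illustrative 400-700 nm grid

def gauss(center, width):
    """Gaussian bump on the wavelength grid (illustrative helper)."""
    return np.exp(-((wl - center) / width) ** 2)

# Rough stand-ins for the x-bar, y-bar, z-bar colour-matching functions.
xbar = 1.06 * gauss(599, 38) + 0.36 * gauss(446, 20)
ybar = 1.00 * gauss(556, 45)
zbar = 1.78 * gauss(449, 25)
cmf = np.stack([xbar, ybar, zbar])  # shape (3, n)

reflectance = 0.2 + 0.6 * gauss(620, 60)  # synthetic reddish patch

print_light = np.ones(wl.size)            # flat light at printing time
use_light = 0.4 + (wl - 400) / 300.0      # warmer light in the usage environment

def xyz(illuminant, refl):
    """Tristimulus integration, normalised so Y = 100 for a perfect white."""
    k = 100.0 / (cmf[1] @ illuminant)
    return k * (cmf @ (illuminant * refl))

xyz_print = xyz(print_light, reflectance)
xyz_use = xyz(use_light, reflectance)
```

Converting both XYZ triples to CIELAB and taking a colour difference would then quantify the appearance shift between the two external-light conditions.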

  Although specific embodiments of the present invention have been described above, the present invention is not limited to the embodiments as described, and can be embodied at the implementation stage with various modifications that do not depart from the scope of the invention. For example, in the embodiments described above, the calculation of the colorimetric value of the patch 200 is performed inside the colorimetric camera 20, but this calculation may instead be performed outside the colorimetric camera 20, for example by the CPU 101 of the image forming apparatus 100.

20, 20A, 20B, 20C, 20D, 20E, 20F Colorimetric camera; 28 Two-dimensional image sensor; 30 Illumination light source; 31 Filter unit; 31R, 31G, 31B Filter regions; 39 Colorimetric calculation unit; 41 Reference chart unit; 47 Correction matrix generation unit; 49 Correction calculation unit; 50 Filter unit; 50R, 50G, 50B, 50K Filter regions; 51 Spectral estimation calculation unit; 52 Colorimetric calculation unit; 53 Light source correction amount calculation unit; 55 Reference chart unit; 63 Light source spectral estimation calculation unit; 100 Image forming apparatus; 101 CPU; 104 Recording head driver; 113 Ink ejection control unit; 200 Patch; P Recording medium

JP 2012-63270 A
Japanese Patent No. 4797289

Claims (8)

  1. A colorimetric device comprising:
    a light source that illuminates a color object to be measured;
    a two-dimensional image sensor that images the color object illuminated by the light source;
    a filter unit disposed at a position through which both the illumination light traveling from the light source toward the color object and the reflected light reflected by the color object toward the two-dimensional image sensor pass; and
    a calculation unit that calculates a colorimetric value of the color object based on image data of the color object output from the two-dimensional image sensor,
    wherein the filter unit has at least three filter regions, and the three filter regions have spectral transmittances equal to the square roots of spectral transmittances having a linear transformation relationship with color matching functions.
  2. The colorimetric device according to claim 1, further comprising:
    a reference chart unit that is illuminated by the light source together with the color object and imaged by the two-dimensional image sensor;
    a generation unit that generates correction data based on a difference between image data of the reference chart unit output from the two-dimensional image sensor and image data of the reference chart unit stored in advance; and
    a correction unit that corrects the image data of the color object based on the correction data,
    wherein the calculation unit calculates the colorimetric value of the color object based on the corrected image data of the color object.
  3. The colorimetric device according to claim 1, wherein the filter unit has four filter regions, three of which have spectral transmittances equal to the square roots of spectral transmittances having a linear transformation relationship with the color matching functions, and the remaining filter region has a spectral transmittance equal to the square root of a spectral transmittance for extracting a metameric black component,
    the device further comprising a first estimation unit that estimates a spectral distribution of reflected light from the color object based on the image data of the color object output from the two-dimensional image sensor,
    wherein the calculation unit calculates a spectral reflectance of the color object based on the estimated spectral distribution of the reflected light and a spectral distribution of the light source, and calculates the colorimetric value of the color object based on the calculated spectral reflectance.
  4. The colorimetric device according to claim 3, further comprising:
    a reference chart unit that is illuminated by the light source together with the color object and imaged by the two-dimensional image sensor;
    a generation unit that generates correction data based on a difference between image data of the reference chart unit output from the two-dimensional image sensor and image data of the reference chart unit stored in advance; and
    a correction unit that corrects the image data of the color object based on the correction data,
    wherein the first estimation unit estimates the spectral distribution of reflected light from the color object based on the corrected image data of the color object.
  5. The colorimetric device according to claim 3, further comprising:
    a reference chart unit that is illuminated by the light source together with the color object and imaged by the two-dimensional image sensor; and
    a correction unit that corrects the emission intensity of the light source based on a difference between image data of the reference chart unit output from the two-dimensional image sensor and image data of the reference chart unit stored in advance,
    wherein the first estimation unit estimates the spectral distribution of reflected light from the color object based on image data of the color object illuminated by the light source whose emission intensity has been corrected.
  6. The colorimetric device according to claim 3, further comprising:
    a white reference unit having a known spectral reflectance, which is illuminated by the light source together with the color object and imaged by the two-dimensional image sensor; and
    a second estimation unit that estimates the spectral distribution of the light source based on image data of the white reference unit output from the two-dimensional image sensor and the spectral reflectance of the white reference unit,
    wherein the calculation unit calculates the spectral reflectance of the color object based on the estimated spectral distribution of reflected light from the color object and the estimated spectral distribution of the light source.
  7. The colorimetric device according to claim 3, further comprising:
    a white reference unit having a known spectral reflectance, which is illuminated by external light and imaged by the two-dimensional image sensor; and
    a third estimation unit that estimates the spectral distribution of the external light based on image data of the white reference unit output from the two-dimensional image sensor and the spectral reflectance of the white reference unit.
  8. An image forming apparatus comprising:
    the colorimetric device according to any one of claims 1 to 7; and
    an image forming unit that forms an image on a recording medium,
    wherein the color object to be measured is an image formed by the image forming unit.
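
An editorial note on the square-root condition in claim 1 (a reading of the claim, not text from the patent): because the filter unit sits where both the illumination light and the reflected light pass, each ray crosses a filter region twice, so the effective transmittance of region $i$ is the square of its physical transmittance. Writing the CMF-linear target transmittance as $t_i(\lambda)$ with an illustrative mixing matrix $(m_{ij})$ that the patent does not name:

```latex
% Two passes through filter region i give an effective transmittance
T_i^{\mathrm{eff}}(\lambda) = \bigl(T_i(\lambda)\bigr)^{2} = t_i(\lambda),
\qquad
t_i(\lambda) = m_{i1}\,\bar{x}(\lambda) + m_{i2}\,\bar{y}(\lambda) + m_{i3}\,\bar{z}(\lambda),
\qquad
T_i(\lambda) = \sqrt{t_i(\lambda)}.
```

With this choice, each channel's overall response is linear in the color matching functions, so tristimulus values can be recovered by inverting $(m_{ij})$.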
JP2013052540A 2013-03-14 2013-03-14 Color measuring device and image forming apparatus Active JP6094286B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013052540A JP6094286B2 (en) 2013-03-14 2013-03-14 Color measuring device and image forming apparatus


Publications (2)

Publication Number Publication Date
JP2014178211A true JP2014178211A (en) 2014-09-25
JP6094286B2 JP6094286B2 (en) 2017-03-15

Family

ID=51698315

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013052540A Active JP6094286B2 (en) 2013-03-14 2013-03-14 Color measuring device and image forming apparatus

Country Status (1)

Country Link
JP (1) JP6094286B2 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10170716A (en) * 1996-12-13 1998-06-26 Kyodo Printing Co Ltd Color filter and reflection type color liquid crystal display device, and evaluating method therefor
JP2000131684A (en) * 1998-10-22 2000-05-12 Toshiba Corp Liquid crystal display element
JP2010122080A (en) * 2008-11-20 2010-06-03 Hiroaki Kodera Method and device for estimating spectral image
JP2010190672A (en) * 2009-02-17 2010-09-02 Ricoh Co Ltd Image evaluating apparatus and method, and image forming apparatus
JP2012063270A (en) * 2010-09-16 2012-03-29 Ricoh Co Ltd Imaging device and recording apparatus


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3112827A1 (en) * 2015-06-02 2017-01-04 X-Rite Switzerland GmbH Sample target for improved accuracy of color measurements and color measurements using the same
US9823131B2 (en) 2015-06-02 2017-11-21 X-Rite Switzerland GmbH Sample target for improved accuracy of color measurements and color measurements using the same


Similar Documents

Publication Publication Date Title
US7433096B2 (en) Scanning device calibration system and method
EP1457335A1 (en) Control system for a printing press
JP3749552B2 (en) Method and apparatus for printing an image on a substrate and method for printing a color image on a substrate
US20020054292A1 (en) Process and apparatus for the colorimetric measurement of a two-dimensional original
US7315394B2 (en) Calibration method for an imaging device
JP2008518218A (en) Measuring device and scanning device for photoelectrically measuring a measurement object based on pixels
US10282644B2 (en) Remote adjustment of print settings
US7286261B2 (en) Color calibration color value correction
US7602532B2 (en) Highly accurate and rapid scanning by a simple scanning device and color correction technology for a printing device
JP5244952B2 (en) Image sensor unit and image reading apparatus
CN103297643B (en) Image capturing unit, colour measuring device, image processing system, color measurement systems and color measurement method
US9129196B2 (en) Image capturing device and recording apparatus
JP2012063270A (en) Imaging device and recording apparatus
US20060227397A1 (en) Gradation conversion calibration method and gradation conversion calibration module using the same
JP5715229B2 (en) A device for measuring color, comprising two measuring devices that operate differently
JP5887998B2 (en) Color measuring device, recording device, color measuring method and program
US7692832B2 (en) Method for correcting scanner non-uniformity
US8743433B2 (en) Color measuring device, image forming apparatus and computer program product
US8982408B2 (en) Color image capturing, measuring, and formation using capture unit with specular reflection preventing member
US8902466B2 (en) Color measuring device, image forming apparatus, image forming method, and computer-readable storage medium
US9118874B2 (en) Image capturing device, color measuring device, color measuring system, image forming apparatus, and color measuring method
US8817329B2 (en) Color measuring device, image forming apparatus, color measuring method, and color measuring system
US9070052B2 (en) Image capturing unit, color measuring device, image forming device, color measuring system, and color measuring method
US8947731B2 (en) Imaging unit, color measuring device, image forming apparatus, color measuring system, and color measuring method
US9270836B2 (en) Color measurement system to correct color data corresponding to a ratio of detected distance to a reference distance

Legal Events

A621  Written request for application examination (JAPANESE INTERMEDIATE CODE: A621); effective date: 20160212
A977  Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007); effective date: 20161227
TRDD  Decision of grant or rejection written
A01   Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01); effective date: 20170117
A61   First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61); effective date: 20170130
R151  Written notification of patent or utility model registration (JAPANESE INTERMEDIATE CODE: R151); ref document number: 6094286; country of ref document: JP