EP2230830B1 - Image processing apparatus and control method of image forming apparatus - Google Patents
- Publication number
- EP2230830B1 (application EP10156958.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- trapping
- processing
- pixel
- edge smoothing
- density
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/58—Edge or detail enhancement; Noise or error suppression, e.g. colour misregistration correction
Definitions
- As a concrete example of limiting the execution conditions, when the trapping application density range is set to the range from 60% to 100% on the trapping UI, the edge smoothing application density values from 70% to 100% are grayed out on the edge smoothing UI so that they cannot be selected. On the display this seems to leave an overlapping range from 60% to 70%, but that is only a limitation of the display, since the density values are shown in increments of 10%. Strictly speaking, it is desirable to execute trapping when the density of the pixel of interest falls within the range from 61% to 100%, and to execute edge smoothing when it falls within the range from 31% to 60%.
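- To make the boundary arithmetic concrete, the following minimal Python sketch, assuming integer percent densities and the hypothetical function name strict_ranges, converts the two displayed 10%-step ranges into the strictly non-overlapping internal ranges of the example above.

```python
# Minimal sketch, assuming integer percent densities selected in 10%
# steps on both UIs. The shared boundary (60%) is displayed on both
# windows, so each internal range treats its displayed lower bound as
# exclusive, giving truly non-overlapping conditions.
def strict_ranges(trap_display, smooth_display):
    t_lo, t_hi = trap_display      # e.g. (60, 100) chosen on the trapping UI
    s_lo, s_hi = smooth_display    # e.g. (30, 60) chosen on the smoothing UI
    assert s_hi <= t_lo, "smoothing range must end where trapping begins"
    return (t_lo + 1, t_hi), (s_lo + 1, s_hi)

print(strict_ranges((60, 100), (30, 60)))  # ((61, 100), (31, 60))
```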
- The type of an object to be processed may be limited in place of, or in addition to, the density. For example, a text attribute set as a trapping processing execution condition may be inhibited from being selected in the detailed settings on the edge smoothing UI.
- In the above description the user selects conditions by means of buttons, but numerical values may also be input directly. The aforementioned settings may be made either by an administrator as default settings of the image forming apparatus at the time of initial setup, or by the user every time image formation is performed.
- The UI unit 107 displays a preview image during, or upon completion of, the settings in Figs. 6A to 7B. The settings are reflected in the preview so that the regions to which trapping and edge smoothing are to be applied are identifiably displayed: these regions may be displayed with different patterns so as to be easily distinguished, or trapping and edge smoothing simulation images reflecting the settings may be displayed. A sketch of such a preview follows.
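- As one possible illustration of such an identifiable preview, the following sketch, assuming hypothetical per-pixel boolean masks for the two processes, tints each region with a distinct marker; a real UI would use patterns or colors instead.

```python
# Illustrative preview marking; mask names and marker values are made up.
def preview_overlay(page, trap_mask, smooth_mask):
    """Copy the page, marking trapping regions 'T' and smoothing regions 'S'."""
    out = [row[:] for row in page]
    for y, row in enumerate(out):
        for x, _ in enumerate(row):
            if trap_mask[y][x]:
                out[y][x] = "T"     # e.g. shown hatched in the real preview
            elif smooth_mask[y][x]:
                out[y][x] = "S"     # e.g. shown dotted; exclusive by elif
    return out
```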
- The second embodiment explains a system which executes exclusive control automatically, so that the user benefits even without setting up exclusive control in the detailed settings. The arrangements and operations of the first embodiment, except for the sequence explained using Fig. 4, are common to this embodiment; the sequence in Fig. 4 may or may not be executed. The difference from the first embodiment lies in the operations of the trapping processor and edge smoothing processor, whose sequences are shown in Figs. 5A and 5B.
- The trapping processing will be described below with reference to Fig. 5A. This sequence corresponds to the detailed processes, in this embodiment, of step S206 shown in Fig. 2. First, normal trapping processing is applied to a pixel of interest (S501). Then, a trapping completion flag is generated in association with the pixel of interest, so as to pass this information to the edge smoothing processor (S502). In place of directly passing control, the generated flag may be temporarily saved in, for example, a memory as partial information of the aforementioned attribute map.
- The edge smoothing will be described below with reference to Fig. 5B. This sequence corresponds to the detailed processes, in this embodiment, of step S304 shown in Fig. 3. The trapping completion flag associated with a portion where trapping was applied, for example a pixel or region, is referred to (S511). If the flag is set, edge smoothing is skipped; otherwise, the edge smoothing processing is applied (S512).
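- A minimal sketch of this flag mechanism, with a set of pixel coordinates standing in for the completion flags and stubbed processing functions (all names are assumptions; as noted above, the flag could equally be carried inside the attribute map):

```python
# Sketch of S501-S502 / S511-S512. The coordinate set and the two stubbed
# value functions are illustrative assumptions.
trap_done = set()                          # trapping completion flags

def trapped_value(v):
    return v                               # placeholder for real trapping

def smoothed_value(v):
    return v                               # placeholder for real smoothing

def trap_pixel(pos, page):
    page[pos] = trapped_value(page[pos])   # S501: normal trapping
    trap_done.add(pos)                     # S502: record the completion flag

def smooth_pixel(pos, page):
    if pos in trap_done:                   # S511: flag set -> skip smoothing
        return
    page[pos] = smoothed_value(page[pos])  # S512: apply edge smoothing

page = {(0, 0): 200, (0, 1): 120}
trap_pixel((0, 0), page)
smooth_pixel((0, 0), page)                 # skipped: flag is set
smooth_pixel((0, 1), page)                 # applied: no flag
```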
- In the example of Fig. 9, the trapping processor 1023 applies trapping to boundaries 902 between the edges of the character objects "C" and "B" and a background object 904. That is, trapping processing is applied to a boundary where the density of at least one of the background and foreground (characters in this case) whose boundaries contact each other is equal to or higher than, for example, a predetermined density. The trapping processing in this case can be implemented by expanding the object with the predetermined density (for example, the background object 904) toward the other object (for example, the character object) at their boundary. These roles may also be exchanged when applying the processing. As a result, the edge of even one object may include both trapped and non-trapped portions depending on the density of the neighboring object. Trapping is not applied to a character object portion without any background, nor to a character object 903 on a low-density background. That is, when the densities of both objects whose boundaries contact each other are equal to or lower than the predetermined density, trapping is skipped.
- The edge smoothing processor 1025 does not apply edge smoothing to the portions 902 to which trapping was applied. Instead, the edge smoothing processor 1025 applies edge smoothing to the non-trapped portions 901 and 903, that is, to portions where the density of at least one of the background and foreground (characters in this case) whose boundaries contact each other is lower than, for example, the predetermined density. In this way, pixels or regions which have undergone the trapping processing are controlled not to be selected as edge smoothing processing targets, thus limiting the edge processes applied to any one portion to one process. As a result, edge smoothing can be effectively applied to portions where screen jaggies stand out.
- This embodiment inhibits trapping and edge smoothing from being applied to one portion at the same time. Alternatively, image deterioration at an edge portion can also be prevented by applying edge smoothing weakly rather than skipping it: for a portion where the trapping completion flag is set, the intensity of rimming an object in the edge smoothing UI settings is set to "1" irrespective of the user settings, while edge smoothing with an intensity equivalent to, for example, "4" is applied to non-trapped portions according to the user settings. A sketch of this intensity selection follows.
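- The relative-intensity variant then reduces to a one-line selection; the following sketch assumes the 1-to-5 scale of the selection buttons 702 and a hypothetical function name.

```python
# Sketch of the relative-intensity control, on the assumed 1-5 UI scale.
def smoothing_intensity(trap_flag_set, user_intensity=4):
    """Weakest rimming (1) where trapping completed, else the user's setting."""
    return 1 if trap_flag_set else user_intensity

assert smoothing_intensity(True) == 1
assert smoothing_intensity(False) == 4
```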
- In the above description the trapping processing is executed in preference to the edge smoothing processing, but the edge smoothing processing may be executed preferentially instead. In the first embodiment, when the edge smoothing processing execution conditions have already been set, the same conditions cannot then be set as the trapping processing execution conditions; the corresponding conditions are grayed out on, for example, the UI. In the second embodiment, the order of the trapping and edge smoothing processes is reversed so that the edge smoothing processing is executed first, and a flag is generated in association with each processed pixel. Then, at the time of the trapping processing, that flag is referred to, and when the flag is set, the trapping processing is skipped even when its conditions are satisfied. The exclusive control can also be implemented in this way.
- The first and second embodiments may also be combined. Since the trapping processing is executed before dithering and the edge smoothing processing after dithering, the trapping processing is preferentially executed in the description above, but the edge smoothing processing can be preferentially executed, as described here. The first and second embodiments have exemplified the case in which the edge smoothing processing is applied to an image after dithering; however, the exclusive control may equally be executed upon applying trapping and edge smoothing to an image after error diffusion, a halftoning method different from dithering. The preview function described above for the first embodiment may also be executed in the second embodiment.
- The image forming apparatus shown in Fig. 8 adopts a so-called tandem-drum engine. However, the present invention can also be applied to an image forming apparatus which adopts a single-drum engine, in which image forming units sequentially contact an intermediate transfer member to transfer and overlay toner images of the respective color components. Moreover, the present invention is not limited to an electrophotographic system, and can also be applied to image forming apparatuses of other systems.
- In the above embodiments, the execution conditions of the trapping and edge smoothing processes are determined for respective pixels, but they may instead be determined for each region having a given size. In this case, the execution conditions, or an execution completion flag of, for example, the trapping processing, are set in association with that region.
- Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer, for example, via a network or from a recording medium of various types serving as the memory device (e.g., a computer-readable medium).
- An embodiment of the invention can provide an image processing apparatus for generating image data, wherein an image is formed by overlapping color plates of at least two colors, characterized by comprising: trapping processing means for applying trapping processing to an object which belongs to either of the color plates of the at least two colors; and smoothing processing means for applying edge smoothing processing to an object which belongs to either of the color plates of the at least two colors, wherein the trapping processing by said trapping processing means 1023 and the edge smoothing processing by said smoothing processing means 1025 are exclusively controlled for respective pixels or regions, or a degree of intensity of the trapping processing by said trapping processing means 1023 and a degree of intensity of the edge smoothing processing by said smoothing processing means 1025 are relatively controlled for respective pixels or regions.
- Another embodiment of the invention can provide a method of controlling an image forming apparatus, which forms an image by overlapping color plates of at least two colors, characterized by comprising: applying trapping processing to an object which belongs to either of the color plates of the at least two colors; and applying edge smoothing processing to an object which belongs to either of the color plates of the at least two colors, wherein the trapping processing and the edge smoothing processing are exclusively controlled for respective pixels or regions, or a degree of intensity of the trapping processing and a degree of intensity of the edge smoothing processing are relatively controlled for respective pixels or regions.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Facsimile Image Signal Circuits (AREA)
- Image Processing (AREA)
Description
- The present invention relates to an image processing apparatus, which is used to execute, for example, edge processing of an image object, and a control method of an image forming apparatus.
- Some electrophotographic image forming apparatuses include a function of executing trapping processing for slightly overlapping the boundaries of objects such as images, graphics, characters, and the like on different color plates, and edge smoothing processing for smoothing the edge of an object after dithering. The trapping processing is executed to prevent a blank (called a boundary gap) in the boundaries between objects formed by different color plates, by expanding the edge of at least one of the objects whose boundaries contact each other by a predetermined amount. Then, the objects overlap each other to prevent any boundary gap due to displacement of plates. The trapping processing is not executed unconditionally, and it is a common practice to decide whether or not to apply the processing according to the densities of the objects whose boundaries contact each other. This is because when the density of at least one of the objects whose boundaries contact each other is low, a boundary gap is not readily conspicuous if it is generated, and when the densities of both objects are high, a boundary gap tends to be conspicuous. In addition, a technique for designating trapping for each page is known (for example, see Japanese Patent Laid-Open Nos. 2003-87548 and 2004-237584).
- The edge smoothing processing is executed to smooth jaggies of the edge (step-like edge) due to a screen structure visualized by dithering. In the edge smoothing processing, pixels are interpolated with reference to, for example, an original object so as to smoothly connect pixels of the edge of a target object after dithering. Since the edge smoothing processing is also executed for the density at which the screen structure tends to be conspicuous, whether or not to apply the processing is determined based on edge information of an object and, especially, density information of an edge portion. For example, when a density ranges from 0 to 255, since a screen structure due to dithering is conspicuous in a halftone object having a density = 125, edge smoothing is applied to such an object. By contrast, in case of an object whose density is low (e.g., a density = 50), if the edge smoothing processing is applied after the dithering, the object is rimmed by the edge smoothing processing, and the density of that portion rises. In particular, when the original density is low, that density rise becomes conspicuous. Also, the jaggies of the edge portion of an object originally having a low density are not so conspicuous. Therefore, it is desirable not to apply edge smoothing to an object having a low density. For this reason, whether or not to apply edge smoothing is determined based on the density information of an object, as in trapping (for example, see Japanese Patent No. 4137067).
- In this way, both the trapping and edge smoothing processes are applied to enhance the image quality. However, when these processes are used together, various adverse effects occur. For example, by applying the trapping processing, the edges of objects whose boundaries contact each other shift to intrude into each other, thus overlapping the objects. This result is inevitably generated by the trapping processing. When the edge smoothing processing is applied to these objects, the edges of the two objects whose boundaries contact each other undergo the edge smoothing processing. For this reason, double edges appear, and these are further emphasized by the edge smoothing processing. Even if the type of a target object of the edge smoothing processing is limited to a character, the edge portion inevitably deteriorates.
- Fig. 10 shows this example. A character object 1001 and a background object 1002 overlap each other near their boundaries as a result of the trapping processing. Then, the edge smoothing processing interpolates pixels 1003 at the edge of the character object that has undergone dithering. In this manner, the edges of regions to which both the trapping and edge smoothing processes are applied are excessively emphasized, resulting in considerable image deterioration.
- US 2008/0055654 discloses an apparatus and method for processing an image. When image data is input, it is determined whether to perform trapping at the position of the pixel of interest of the input image data, and trapping is performed for the input image data. In addition, spatial filter processing is performed for the pixel of interest of the input data. An output of the trapping and that of the filter processing are weighted, and the image data to be output is calculated.
- The present invention has been made in consideration of the aforementioned related arts, and provides an image processing apparatus which can prevent image quality from deteriorating, by selectively applying one of the trapping and edge smoothing processes to an edge or a portion thereof even when both processes are applied to one image, and a control method of an image forming apparatus.
- The present invention in its first aspect provides an image processing apparatus as specified in claim 1.
- The present invention in its second aspect provides an image forming method as specified in claim 11.
- The present invention in its third aspect provides a computer program as specified in claim 12. Such a program can be provided by itself or carried by a carrier medium. The carrier medium may be a recording or other storage medium, or a transmission medium such as a signal.
- According to the present invention, even when the trapping and edge smoothing processes are applied to one image, a satisfactory image which takes advantage of the image enhancements of the respective processes can be obtained.
- Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
- Fig. 1 is a schematic block diagram of an image forming apparatus according to an embodiment of the present invention;
- Fig. 2 is a flowchart of a trapping processor;
- Fig. 3 is a flowchart of an edge smoothing processor;
- Fig. 4 is a flowchart of setting acceptance processing required to implement exclusive control of trapping and edge smoothing processes according to the first embodiment;
- Figs. 5A and 5B are flowcharts of trapping and edge smoothing processes required to implement exclusive control of the trapping and edge smoothing processes according to the second embodiment;
- Figs. 6A and 6B are views showing trapping UIs;
- Figs. 7A and 7B are views showing edge smoothing UIs;
- Fig. 8 is a sectional view illustrating the hardware arrangement of an image reading unit 100 and image output unit 103 of the image forming apparatus shown in Fig. 1;
- Fig. 9 is a view exemplifying portions having good effects of the present invention; and
- Fig. 10 is an enlarged view of an image that has undergone trapping and edge smoothing processes.
- Details of image processing in an image forming apparatus according to an embodiment of the present invention will be described hereinafter with reference to the drawings.
- Fig. 1 is a schematic block diagram of an image forming apparatus according to the first embodiment of the present invention. Although this embodiment assumes a digital multifunction peripheral as the image forming apparatus, the present invention can be applied to other print devices such as a color printer in addition to a copying machine. The structure of the image forming apparatus according to this embodiment will be described first. As shown in Fig. 1, an image forming apparatus 800 includes an image reading unit 100, image reception unit 101, UI unit 107, image processing unit 102 which executes various image processes, storage unit 105, CPU 106, and image output unit 103. Note that the image forming apparatus 800 can be connected to a server which manages image data, a personal computer (PC) which issues a print execution instruction to this image forming apparatus, and the like via a network such as a LAN or the Internet. Also, the image forming apparatus 800 can be connected to an external communication unit 104 via the image reception unit 101. The image processing unit 102 can be configured as an independent image processing apparatus having only the image processing functions, using a computer or the like.
Fig. 1 will be described below. Theimage reading unit 100 has, for example, an image scanner, and reads an image from, for example, a paper document. For example, theimage reading unit 100 reads an RGB color image. The read RGB data is sent to theimage processing unit 102. Ascanner image processor 1020 applies image processes such as shading correction, block separation processing, and color conversion to the RGB color image data read by theimage reading unit 100. - The
image reception unit 101 receives image data described in a page description language (PDL) (PDL image data) via, for example, a communication line. The PDL image data is a set of commands which describe objects that form an image. Note that theimage forming apparatus 800 can receive image data expressed by a command group associated with individual objects that form an image and can form an image of that image data, in addition to the PDL image data. A case will be exemplified below wherein the PDL image data is received. The PDL image data input to theimage reception unit 101 is sent to aprinter image processor 1021. An interpreter included in theprinter image processor 1021 interprets a command group of the PDL image data, and outputs intermediate code data. A RIP (raster image processor) included in theprinter image processor 1021 rasterizes the intermediate code data to a bitmap image. The bitmap image data generated by the RIP has tones before quantization, and is called a continuous tone image or contone image. On the other hand, theprinter image processor 1021 rasterizes pieces of attribute information (graphic attribute, color attribute, picture attribute, text attribute, thin line attribute, etc.) for respective pixels from attribute information included in the command group. That is, each attribute information indicates a type of an object, and an attribute map indicating the types of objects to each of which a pixel of interest belongs for respective pixels is generated in association with the bitmap image data. Images obtained by thescanner image processor 1020 andprinter image processor 1021 can be temporarily stored in the storage unit 105 (to be described later) (BOX function). When the user selects a desired image from the stored image group using theUI unit 107, subsequent processes are applied to the selected image, and the processed image is output from theimage output unit 103. - A
color processor 1022 accepts data from theimage reading unit 100 orimage reception unit 101, and executes, for example, RGB → CMYK color conversion processing. That is, thecolor processor 1022 converts a color system according to an input image into that of image data, an image of which is formed by theimage forming apparatus 800. Furthermore, atrapping processor 1023 applies trapping to CMYK image data, and animage forming processor 1024 then applies dithering. In the trapping and dithering, pixels which belong to a target object are specified with reference to the attribute map, and the processes are applied to the specified pixels. Finally, anedge smoothing processor 1025 executes edge smoothing processing based on contone image data before the dithering and screen image data after the dithering. The contone image data is input from the trappingprocessor 1023, and the screen image data is input from theimage forming processor 1024. - The arrangements and functions of the
storage unit 105,CPU 106, andimage output unit 103 of the image forming apparatus shown inFig. 1 will be described below. Thestorage unit 105 includes various storage media such as a random access memory (RAM), read-only memory (ROM), and hard disk. For example, the RAM is used as an area for storing data and various kinds of information, and also as a work area of theCPU 106. Furthermore, thestorage unit 105 often includes a nonvolatile rewritable memory. For example, this memory includes a trapping settingsave area 1051 which saves trapping settings, and an edge smoothing settingsave area 1052 which saves edge smoothing settings. The trapping settings include conditions required upon application of trapping processing to a pixel of interest. The edge smoothing settings include conditions required upon application of edge smoothing processing to a pixel of interest. Note that these save areas may be included in the trapping processor and edge smoothing processor, which refer to these areas. On the other hand, the ROM is used as an area for storing various control programs. TheCPU 106 is used to judge and control various kinds of processes according to the programs stored in the ROM. Theimage output unit 103 has a function of outputting an image (for example, a function of forming an image on a print medium such as a print paper sheet and outputting the image). An image deformation processing method for bitmap image data after the RIP processing has been described. Note that the invention of this embodiment can be applied to, for example, the trapping processing which is applied to object data in the RIP. When the trapping processing is applied to object data in the RIP, theprinter image processor 1021 executes that processing. -
Fig. 8 is a view illustrating the hardware arrangement of theimage reading unit 100,image reception unit 101, andimage output unit 103 of theimage forming apparatus 800 such as a digital multifunction equipment shown inFig. 1 , that is, a sectional view of the image forming apparatus. Acontroller 801 includes theimage processing unit 102 inFig. 1 , and also has a function of controlling the overall image forming apparatus. Theimage forming apparatus 800 has copy, printer, and FAX functions. Theimage forming apparatus 800 has a structure for forming an image by overlapping color plates of at least two colors. For this purpose, theimage forming apparatus 800 hasimage forming units 802 for respective color components. Eachimage forming unit 802 forms a toner image developed by a color recording material such as toner for each color component of image data. The formed toner images of the respective color components are transferred onto anintermediate transfer belt 803. For this reason, the images of the respective color components (i.e., color plates) are overlaid with each other on theintermediate transfer belt 803 to form a full-color image. The number of color components of an image formed by theimage forming units 802 are often two or three depending on apparatuses, and five or more colors may be employed. Color image data formed in this way is transferred onto a paper sheet conveyed from eachtray 804, and is heated and fixed on the sheet by a fixingdevice 805. Then, the paper sheet is exhausted on an exhaust tray. - Processing steps from the trapping
processor 1023 to theedge smoothing processor 1025 inFig. 1 will be described below. Details of the processing by the trappingprocessor 1023 shown inFig. 1 include processing steps shown inFig. 2 . A trapping ON/OFF determination process (S201) determines for image data input from thecolor processor 1022 whether or not a trapping setting is ON. This setting is input using theUI unit 107, and is saved in the trapping settingsave area 1051. As a result of determination in step S201, if the trapping setting is OFF, the trapping processing ends (S208). On the other hand, if the trapping setting is ON, the subsequent processes are executed while sequentially updating a pixel of interest pixel by pixel. That is, the process advances to a pixel of interest attribute determination process (S202) first. The pixel of interest attribute determination process (S202) determines whether or not the attribute of a pixel of interest is an attribute which is set by theUI unit 107 and is saved in the trapping settingsave area 1051. Note that the attribute of the pixel of interest can be obtained by acquiring an attribute value associated with that pixel with reference to the attribute map. Note that the attribute can also be called a type of an object. If it is determined that the two attributes match, the process advances to a next surrounding pixel attribute determination process (S203). Note that the trapping processing is processing which copies a surrounding pixel, which belongs to a neighboring object of an object to which the pixel of interest belongs, to the position of the pixel of interest. In this sequence, a surrounding pixel is also called a reference pixel. - The surrounding pixel attribute determination process (S203) determines whether or not a surrounding pixel of the pixel of interest has the same attribute as that which is set by the
UI unit 107 and is saved in the trapping settingsave area 1051. If it is determined that the surrounding pixel has the same attribute, the process advances to a next pixel of interest density determination process (S204). The pixel of interest density determination process (S204) determines whether or not the density of the pixel of interest falls within a first reference density range, which is set by theUI unit 107 and is saved in the trapping settingsave area 1051. If the density of the pixel of interest falls within the first reference density range, the process advances to a next reference pixel density determination process (S205). The reference pixel density determination process (S205) determines if the density of the reference pixel falls within a second reference density range which is set by theUI unit 107 and is saved in the trapping settingsave area 1051. If the density of the reference pixel falls within the second reference density range, the process advances to a trapping process (S206) to apply trapping to the pixel of interest. That is, only when the pixel of interest satisfies all trapping processing execution conditions, the trapping processing is executed. When any of the execution conditions is not satisfied in the determination processes from the pixel of interest attribute determination process (S202) to the reference pixel density determination process (S205), the process jumps to a pixel update process (S207) without executing the trapping process (S206). In the pixel update process (S207), the pixel of interest is moved to the next pixel if the next pixel is available, and the process returns to the pixel of interest attribute determination process (S202). If no next pixel is available, the processing ends. The next pixel can be decided in, for example, a raster order. - After the
trapping processor 1023, theimage forming processor 1024 applies dithering, and the control then transitions to the edge smoothing processor. - Details of the processing by the edge smoothing processor include processing steps shown in
Fig. 3 . Output image data are respectively input from the trappingprocessor 1023 andimage forming processor 1024, and an edge smoothing ON/OFF determination process (S301) determines whether or not an edge smoothing setting is ON. This setting is input by theUI unit 107, and is saved in the edge smoothing settingsave area 1052. As a result of determination, if the edge smoothing setting is OFF, the edge smoothing processing ends (S306). On the other hand, if the edge smoothing setting is ON, the subsequent processes are executed while sequentially updating a pixel of interest pixel by pixel. That is, the process advances to a pixel of interest attribute determination process (S302). The pixel of interest attribute determination process (S302) determines whether or not the attribute of the pixel of interest is the same attribute as that which is set by theUI unit 107 and is saved in the edge smoothing settingsave area 1052. If it is determined that the pixel of interest has the same attribute, the process advances to a next pixel of interest density determination process (S303). The pixel of interest density determination process (S303) determines whether or not the density of the pixel of interest falls within a third reference density range which is set by theUI unit 107 and is saved in the edge smoothing settingsave area 1052. If the density of the pixel of interest falls within the third reference density range, the process advances to an edge smoothing process (S304) to apply edge smoothing to the pixel of interest. That is, only when the pixel of interest satisfies all the edge smoothing processing execution conditions, the edge smoothing processing is executed. If any of execution conditions is not satisfied in the determination processes of the pixel of interest attribute determination process (S302) and the pixel of interest density determination process (S303), the process jumps to a pixel update process (S305) without executing the edge smoothing process (S304). After that, in the pixel update process (S305), the pixel of interest is moved to the next pixel if the next pixel is available, and the process returns to the pixel of interest attribute determination process (S302). If no next pixel is available, the processing ends. - The processing steps from the trapping
processor 1023 to theedge smoothing processor 1025 inFig. 1 have been described. - The basic and detailed setting methods of the trapping and edge smoothing functions by the
UI unit 107 will be described below. Trapping basic settings will be described first with reference toFig. 6A . - When the user inputs a trapping processing setting instruction, a
user interface 600 shown inFig. 6A is displayed. On thisuser interface 600, the user selects trapping ON or OFF from trapping ON/OFF buttons 601. The user selects a trapping width from trappingwidth selection buttons 602, and also selects from trapping density buttons 603 a density of a region to be extended by the trapping processing, that is, a density of a region that may overlap a neighboring object. Furthermore, when the user selects adetailed setting button 604, a UI window shown inFig. 6B is displayed, and he or she makes detailed settings on this window. - The trapping detailed settings will be described below with reference to
Fig. 6B . Note that button selections are made by the user in the following description. Attributes to which trapping is to be applied to a pixel of interest and reference pixels are respectively selected usingselection buttons Fig. 2 ) of the pixel of interest to which trapping is applied is designated usingbuttons 613. In this case, by selecting a minimum density (0) and maximum density (100), it is set to apply trapping to densities between these densities. A density range (i.e., the second density range inFig. 2 ) of the reference pixel is similarly set usingbuttons 614. - The trapping processing execution conditions set by the user via the aforementioned user interfaces are saved in the trapping setting
- The trapping processing execution conditions set by the user via the aforementioned user interfaces are saved in the trapping setting save area 1051. This corresponds to first condition acceptance means, which accepts and saves the trapping processing execution conditions.
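- As an illustration of how the saved conditions might drive the per-pixel determination of Fig. 2 (S202-S205), the following hypothetical sketch models the trapping setting save area 1051 as a simple module-level slot; all field and function names are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Pixel:
    attribute: str
    density: int  # %

@dataclass
class TrappingSettings:
    enabled: bool
    target_attribute: str     # attribute required of the pixel of interest
    reference_attribute: str  # attribute required of the reference pixel
    target_density: tuple     # first density range (min, max), buttons 613
    reference_density: tuple  # second density range (min, max), buttons 614

trapping_save_area = None  # stands in for the trapping setting save area 1051

def accept_trapping_conditions(settings: TrappingSettings):
    """First condition acceptance means: accept and save the conditions."""
    global trapping_save_area
    trapping_save_area = settings

def trapping_applies(target: Pixel, reference: Pixel) -> bool:
    s = trapping_save_area
    if s is None or not s.enabled:
        return False
    t_lo, t_hi = s.target_density
    r_lo, r_hi = s.reference_density
    return (target.attribute == s.target_attribute            # cf. S202
            and reference.attribute == s.reference_attribute  # cf. S203
            and t_lo <= target.density <= t_hi                # cf. S204
            and r_lo <= reference.density <= r_hi)            # cf. S205
```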
- Edge smoothing basic settings will be described below with reference to Fig. 7A. Note that the selections are made by the user. Edge smoothing ON or OFF is selected using the edge smoothing ON/OFF buttons 701. An edge smoothing intensity, which decides the border density, is selected using the selection buttons 702. In Fig. 7A, assume that a value "1" sets the lowest density (i.e., the weakest smoothing), and a value "5" sets the highest density (i.e., the strongest smoothing). Furthermore, upon selection of a detailed setting button 703, the UI window in Fig. 7B appears and allows the user to make detailed settings on that window.
- Edge smoothing detailed settings will be described below with reference to Fig. 7B. An attribute to which edge smoothing is to be applied is selected using the selection buttons 711, which select an object type such as text, graphic, or image. Next, an edge smoothing application density range, that is, a density range (i.e., the third density range in Fig. 3) within which edge smoothing is executed, is designated using buttons 712. In this case as well, by selecting minimum and maximum densities, edge smoothing is set to be applied to densities between these two values. This corresponds to second condition acceptance means, which accepts and saves the edge smoothing processing execution conditions.
- The processing sequence executed upon selection of the button 703 in Fig. 7A will be described below with reference to Fig. 4. Initially, the user interface window shown in Fig. 7B is displayed (S400). At this time, if any settings have already been made, those setting values are read out from the edge smoothing setting save area 1052 and displayed. It is then checked whether both the trapping and edge smoothing functions are selected simultaneously (S401). This determination is implemented by referring to the trapping setting save area 1051 and the edge smoothing setting save area 1052; if settled values have not been saved yet, temporarily saved settings are referred to. If both functions are set to be applied, the setting values of the respective trapping detailed setting items, that is, the execution conditions, are read from the trapping setting save area (S402). For example, the density range of the pixel of interest to which trapping is to be applied is read. Then, those items and values of the edge smoothing conditions on the user interface window shown in Fig. 7B which overlap the trapping execution conditions read in step S402 are re-displayed in a selection-disabled state, for example, grayed out. For example, when the density is limited as an execution condition, the density range set as the trapping execution condition is displayed on the user interface so that it cannot be selected as an edge smoothing execution condition. In this state, user settings are accepted, and the accepted setting values are saved (S404), as sketched below. - In this manner, by limiting the trapping or edge smoothing application conditions so as not to overlap each other at the time these execution conditions are set, trapping and edge smoothing can be prevented from being doubly applied to the same pixel or region. That is, exclusive control can be executed, and image quality deterioration due to application of both processes can be prevented. Furthermore, the trapping effect is generally less conspicuous at a portion where the pixel of interest has a higher density, whereas edge smoothing is generally effective at a portion where the pixel of interest has a low to middle density. By selectively using the processes depending on density ranges, that is, trapping for the middle-to-high density range and edge smoothing for the low-to-middle density range, a high-quality image which makes good use of their respective strengths can be generated.
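- The exclusive-control logic of Fig. 4 can be sketched as follows, under assumed data structures; the function returns the density values that must be grayed out on the edge smoothing detailed-settings window when both functions are enabled (cf. S401, S402):

```python
# Hypothetical sketch of the exclusive control in Fig. 4: density values
# already claimed by the trapping execution conditions are disabled
# (grayed out) on the edge smoothing detailed-settings window.
def grayed_out_density_values(trapping_enabled: bool,
                              smoothing_enabled: bool,
                              trapping_density: tuple,
                              step: int = 10):
    """Return the density values (in display increments) that must not be
    selectable as edge smoothing execution conditions."""
    if not (trapping_enabled and smoothing_enabled):  # cf. S401
        return set()                                  # no restriction needed
    lo, hi = trapping_density                         # cf. S402: read saved conditions
    # Values strictly inside the trapping range are disabled; the boundary
    # value itself stays selectable because, with 10% increments, a smoothing
    # range ending at `lo` does not actually overlap (see the note below).
    return {v for v in range(0, 101, step) if lo < v <= hi}

# Example: trapping set to 60-100% -> 70..100 grayed out on the smoothing UI.
assert grayed_out_density_values(True, True, (60, 100)) == {70, 80, 90, 100}
```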
- Note that the execution conditions are limited in the following manner: for example, when the trapping application density range is set to 60% to 100% on the trapping UI, the edge smoothing application density values from 70% to 100% are grayed out so that they cannot be selected on the edge smoothing UI. At first glance, this seems to allow an overlapping range from 60% to 70% on the display. However, this is merely a limitation of the display, since the density values are presented in increments of 10%. Strictly speaking, it is desirable to execute trapping when the density of the pixel of interest falls within the range from 61% to 100%, and to execute edge smoothing when the density of the pixel of interest falls within the range from 31% to 60%, as illustrated below.
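- A small hypothetical helper illustrating these strict, non-overlapping runtime ranges:

```python
TRAPPING_RANGE = (61, 100)   # density of the pixel of interest, in %
SMOOTHING_RANGE = (31, 60)

def process_for_density(density: int) -> str:
    if TRAPPING_RANGE[0] <= density <= TRAPPING_RANGE[1]:
        return "trapping"
    if SMOOTHING_RANGE[0] <= density <= SMOOTHING_RANGE[1]:
        return "edge smoothing"
    return "neither"

# Exclusive control holds: no density value falls into both ranges.
for d in range(0, 101):
    in_trap = TRAPPING_RANGE[0] <= d <= TRAPPING_RANGE[1]
    in_smooth = SMOOTHING_RANGE[0] <= d <= SMOOTHING_RANGE[1]
    assert not (in_trap and in_smooth)

assert process_for_density(60) == "edge smoothing"
assert process_for_density(61) == "trapping"
```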
- As an execution condition of the edge smoothing processing, the type of object to be processed may be limited in place of, or in addition to, the density. For example, when only the text attribute is designated as the attribute to which the trapping processing is applied, the text attribute may be inhibited from being selected in the detailed settings on the edge smoothing UI.
- The case has been explained wherein the trapping processing execution conditions are set first and the edge smoothing execution conditions are set afterwards. Conversely, when the edge smoothing settings are made first, the items and values selected as the edge smoothing execution conditions may be grayed out so that they cannot be selected when the trapping processing settings are made, thus achieving the same objective.
- In the above example, the user selects conditions by means of buttons, but numerical values may be input directly instead. Note that the aforementioned settings may be made either by an administrator, as default settings of the image forming apparatus at the time of initial setup, or by the user every time image formation is performed.
- When the settings are made every time the user performs image formation, an image to be formed is read out from the aforementioned BOX, and the
UI unit 107 displays a preview image during, or upon completion of, the settings in Figs. 6A to 7B. The settings are then reflected so as to identifiably display the regions to which trapping and edge smoothing are to be applied. In this display, the respective regions to which trapping and edge smoothing are to be applied may be given different patterns so that they are easily distinguished, or trapping and edge smoothing simulation images reflecting the settings may be displayed. - The second embodiment will explain a system which executes exclusive control automatically, so that the user benefits even without configuring exclusive control in the detailed settings. The arrangements and operations of the first embodiment, except for the sequence explained with reference to
Fig. 4, are common to this embodiment; the sequence of Fig. 4 itself may or may not be executed. The difference from the first embodiment lies in the operations of the trapping processor and the edge smoothing processor, whose sequences are shown in Figs. 5A and 5B. - Trapping processing will be described below with reference to
Fig. 5A. This sequence corresponds to the detailed processes, in this embodiment, of step S206 shown in Fig. 2. As in the first embodiment, normal trapping processing is applied to the pixel of interest (S501). After that, a trapping completion flag is generated in association with the pixel of interest, so as to pass the control to the edge smoothing processor (S502). Of course, the generated flag may be temporarily saved in, for example, a memory as partial information of the aforementioned attribute map, instead of passing the control directly.
- On the other hand, edge smoothing will be described below with reference to Fig. 5B. This sequence corresponds to the detailed processes, in this embodiment, of step S304 shown in Fig. 3. The trapping completion flag associated with a portion where trapping was applied, for example a pixel or a region, is referred to (S511). If the flag is set, edge smoothing is skipped; otherwise, the edge smoothing processing is applied (S512).
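- A sketch of this flag-based exclusive control (Figs. 5A and 5B), with the completion flags modeled as a set of pixel positions standing in for the attribute-map information mentioned above; all names are illustrative assumptions:

```python
def trapping_pass(page, trapping_applies, apply_trapping):
    """Fig. 5A: apply trapping and flag each processed pixel position."""
    trapped = set()                          # trapping completion flags
    for pos, px in page.items():
        if trapping_applies(px):
            page[pos] = apply_trapping(px)   # S501
            trapped.add(pos)                 # S502: generate completion flag
    return trapped

def smoothing_pass(page, trapped, smoothing_applies, apply_smoothing):
    """Fig. 5B: skip pixels whose trapping completion flag is set."""
    for pos, px in page.items():
        if pos in trapped:                   # S511: flag is set
            continue                         #   -> edge smoothing is skipped
        if smoothing_applies(px):
            page[pos] = apply_smoothing(px)  # S512
```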
- In the second embodiment, for example, with respect to the image data shown in Fig. 9, the trapping processor 1023 applies trapping to the boundaries 902 between the edges of the character objects "C" and "B" and the background object 904. That is, the trapping processing is applied to a boundary where the density of at least one of the background and the foreground (characters in this case) whose boundaries contact each other is equal to or higher than, for example, a predetermined density. The trapping processing in this case can be implemented by expanding the object with the predetermined density (for example, the background object 904) toward the other object (for example, the character object) at their boundary. Of course, the roles of these objects may be exchanged. In this case, however, the edge of even a single object may include both trapping and non-trapping portions, depending on the density of the neighboring object. On the other hand, trapping is applied neither to a character object portion without any background nor to a character object 903 over a low-density background. That is, when the densities of both objects whose boundaries contact each other are equal to or lower than the predetermined density, trapping is skipped.
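- The boundary rule described above can be illustrated with the following hypothetical check; the threshold value is an assumption chosen for the example:

```python
PREDETERMINED_DENSITY = 50  # %, hypothetical threshold

def trap_at_boundary(background_density: int, foreground_density: int) -> bool:
    # Trap if at least one of the two touching objects is dense enough;
    # skip when both are at or below the predetermined density.
    return (background_density >= PREDETERMINED_DENSITY
            or foreground_density >= PREDETERMINED_DENSITY)

# Boundaries 902 in Fig. 9: dense background behind "C" and "B" -> trapped.
assert trap_at_boundary(background_density=90, foreground_density=100)
# A low-density background and foreground (cf. object 903) -> trapping skipped.
assert not trap_at_boundary(background_density=10, foreground_density=40)
```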
- An edge smoothing processor 1025 does not apply edge smoothing to the portions 902 to which trapping was applied. On the other hand, the edge smoothing processor 1025 applies edge smoothing to the non-trapping portions. - In this manner, pixels or regions which have undergone the trapping processing are controlled so as not to be selected as edge smoothing processing targets, thereby limiting the edge processing applied to any one portion to a single process. As a result, boundary gaps caused by displacement of the color plates are prevented by the trapping processing, while edge smoothing can be applied effectively to portions where screen jaggies stand out.
- Note that this embodiment inhibits trapping and edge smoothing from being applied to one portion at the same time. However, image deterioration at an edge portion can also be prevented by applying edge smoothing weakly instead. Such processing can be implemented with the following settings: for a portion where the trapping completion flag is set, the intensity of rimming an object in the edge smoothing UI settings is set to "1" irrespective of the user settings, while edge smoothing with an intensity equivalent to, for example, "4" is applied to non-trapping portions.
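- A minimal sketch of this relative-intensity control, mirroring the example values in the text; the function name and default are assumptions:

```python
def smoothing_intensity(trapping_flag_set: bool, user_intensity: int = 4) -> int:
    # For a portion where the trapping completion flag is set, force the
    # weakest rimming intensity irrespective of the user setting.
    return 1 if trapping_flag_set else user_intensity

assert smoothing_intensity(True) == 1    # trapped portion: weak smoothing
assert smoothing_intensity(False) == 4   # non-trapping portion: user setting
```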
- In the above two embodiments, the trapping processing is executed in preference to the edge smoothing processing, but the edge smoothing processing may be executed preferentially instead. In that case, in the first embodiment, once the edge smoothing processing execution conditions have been set, the same conditions cannot be set as the trapping processing execution conditions; the corresponding conditions are therefore grayed out on, for example, the UI. In the second embodiment, the order of the trapping and edge smoothing processes is reversed so that the edge smoothing processing is executed first, and a flag is generated in association with each processed pixel. At the time of the trapping processing, that flag is referred to, and when the flag is set, the trapping processing is skipped even when its conditions are satisfied. The exclusive control can also be implemented in this way. Also, the first and second embodiments may be combined. In this example, since the trapping processing is executed before dithering and the edge smoothing processing is executed after dithering, the trapping processing is executed preferentially. However, when the trapping processing is applied to image data after dithering, the edge smoothing processing can be executed preferentially, as described above. The first and second embodiments have exemplified the case in which the edge smoothing processing is applied to an image after dithering. Alternatively, the exclusive control may be executed upon applying trapping and edge smoothing to an image after error diffusion, a halftoning method different from dithering. Also, the preview function described in the last paragraph of the first embodiment may be executed in the second embodiment.
- The image forming apparatus shown in
Fig. 8 adopts a so-called tandem-drum engine. Alternatively, the present invention can also be applied to an image forming apparatus which adopts a single-drum engine, in which image forming units sequentially contact an intermediate transfer member to transfer and overlay toner images of the respective color components. Also, the present invention is not limited to an electrophotography system, and can be applied to image forming apparatuses of other systems as well. - In the above example, the execution conditions of the trapping and edge smoothing processes are determined for respective pixels, but they may instead be determined for each region having a given size. In this case, the execution conditions, or an execution completion flag of, for example, the trapping processing, are set in association with that region, as in the sketch below.
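- A sketch of this region-granularity variant, assuming hypothetical 16x16-pixel regions:

```python
REGION = 16  # region (tile) edge length in pixels, hypothetical

def region_of(x: int, y: int) -> tuple:
    return (x // REGION, y // REGION)

trapped_regions = set()

def mark_trapped(x: int, y: int):
    """Set the trapping execution completion flag for the region of (x, y)."""
    trapped_regions.add(region_of(x, y))

def smoothing_allowed(x: int, y: int) -> bool:
    """Edge smoothing is withheld for any pixel whose region was trapped."""
    return region_of(x, y) not in trapped_regions

mark_trapped(20, 5)                   # flags region (1, 0)
assert not smoothing_allowed(31, 15)  # same 16x16 region -> skipped
assert smoothing_allowed(0, 0)        # different region -> allowed
```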
- Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., computer-readable medium).
- An embodiment of the invention can provide an image processing apparatus for generating image data, wherein an image is formed by overlapping color plates of at least two colors, characterized by comprising: trapping processing means for applying trapping processing to an object which belongs to either of the color plates of the at least two colors; and smoothing processing means for applying edge smoothing processing to an object which belongs to either of the color plates of the at least two colors, wherein the trapping processing by said trapping processing means (1023) and the edge smoothing processing by said smoothing processing means (1025) are exclusively controlled for respective pixels or regions, or a degree of intensity of the trapping processing by said trapping processing means (1023) and a degree of intensity of the edge smoothing processing by said smoothing processing means (1025) are relatively controlled for respective pixels or regions.
- Another embodiment of the invention can provide a method of controlling an image forming apparatus which forms an image by overlapping color plates of at least two colors, characterized by comprising: applying trapping processing to an object which belongs to either of the color plates of the at least two colors; and applying edge smoothing processing to an object which belongs to either of the color plates of the at least two colors, wherein the trapping processing and the edge smoothing processing are exclusively controlled for respective pixels or regions, or a degree of intensity of the trapping processing and a degree of intensity of the edge smoothing processing are relatively controlled for respective pixels or regions.
- While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Claims (12)
- An image processing apparatus (800) for generating image data pixel by pixel, wherein an image is formed by overlapping color plates of at least two colors, comprising:
an image reception unit (101) configured to receive image data comprising objects;
first determination means configured to determine (S202-S205) to perform trapping processing at a position of a pixel of interest of the received image data if: an object type to which it is determined that the pixel of interest belongs (S202) matches an object type for a pixel of interest specified by saved trapping processing execution conditions; an object type to which it is determined that a surrounding pixel belongs (S203) matches an object type for a reference pixel specified by the saved trapping processing execution conditions; a determined density of the pixel of interest falls within a first reference density range specified by the saved trapping processing execution conditions (S204); and a determined density of the surrounding pixel falls within a second reference density range specified by the saved trapping processing execution conditions (S205);
trapping processing means (1023) for applying trapping processing (S206) to an object belonging to either of the color plates of the at least two colors of the received image data in accordance with a determination to perform trapping processing by the first determination means, wherein the trapping processing is for preventing boundary gaps in boundaries between objects formed by different color plates, by expanding an edge of at least one of the objects by a predetermined amount;
means for obtaining screen image data by performing dither processing on the received image data to which the trapping processing has been applied;
second determination means configured to determine (S302-S303) to perform edge smoothing processing at a position of a pixel of interest of the screen image data if: an object type to which it is determined that the pixel of interest belongs (S302) matches an object type specified by saved edge smoothing processing execution conditions; and the determined density of the pixel of interest falls within a third reference density range specified by the edge smoothing processing execution conditions (S303);
smoothing processing means (1025) for applying edge smoothing processing (S304) to an object belonging to either of the color plates of the at least two colors of the screen image data in accordance with a determination to perform edge smoothing processing by the second determination means, wherein the edge smoothing processing is for smoothing jaggies of an edge of an object by interpolating pixels;
wherein the trapping processing by the trapping processing means and the edge smoothing processing by the smoothing processing means are avoided from being doubly applied to one pixel by saving the trapping processing execution conditions and the edge smoothing processing execution conditions, which specify the first reference density range and the third reference density range so as not to overlap each other.
- The apparatus according to claim 1, further comprising:
first condition acceptance means (610) for accepting and saving an execution condition specifying the first density range (613) for determining whether or not to perform the trapping processing on a pixel of interest; and
second condition acceptance means (710) for accepting and saving an execution condition specifying the third density range (712) for determining whether or not to perform the edge smoothing processing on a pixel of interest,
wherein said first and second condition acceptance means (610, 710) are configured so that a setting of the density range in one of the first acceptance means and the second acceptance means is limited by a setting of the density range in the other one, such that an execution condition which overlaps the execution condition saved by one of the condition acceptance means (610, 710) is not accepted as an execution condition by the other one of the condition acceptance means (610, 710), and
said trapping processing means (1023) is configured to execute the trapping processing according to the execution condition saved by said first condition acceptance means (610), and said smoothing processing means is configured to execute the edge smoothing processing according to the execution condition saved by said second condition acceptance means (710).
- The apparatus according to any preceding claim, further comprising selecting means configured to selectively apply one of the trapping processing means (1023) and the smoothing processing means (1025) to a pixel of an object.
- The apparatus according to claim 3, wherein the selecting means is further configured to selectively apply one of the trapping processing means (1023) and the smoothing processing means (1025) to a pixel of the object in dependence upon a density of the pixel.
- The apparatus according to claim 4, wherein the trapping processing means (1023) is configured to apply trapping processing to a boundary where a density of at least one of a first object and a second object whose boundaries contact each other is equal to or higher than a predetermined density.
- The apparatus according to claim 4 or claim 5, wherein the smoothing processing means (1025) is configured to apply edge smoothing processing to a boundary where a density of at least one of a first object and a second object whose boundaries contact each other is equal to or lower than a predetermined density.
- The apparatus according to any preceding claim, wherein the smoothing processing means (1025) is configured, when an object comprises at least one trapping portion to which trapping has been applied and at least one non-trapping portion to which trapping has not been applied, to apply edge smoothing processing to the at least one non-trapping portion.
- The apparatus according to any preceding claim, further comprising display means configured to display a preview of pixels of an object to which trapping processing and edge smoothing processing are to be applied.
- The apparatus according to claim 8, wherein the display means is configured to display pixels of objects to which trapping processing and edge smoothing processing are to be applied using different appearances or using simulation images.
- The apparatus according to claim 8 or claim 9, wherein the display means is configured to inhibit selection of an execution condition of one of said first and second condition acceptance means (610, 710) when the execution condition overlaps with an execution condition saved by the other condition acceptance means.
- A method of controlling an image forming apparatus which generates image data pixel by pixel and which forms an image by overlapping color plates of at least two colors, comprising:
a receiving step of receiving image data comprising objects;
a trapping determination step (S202-S205) of determining to perform trapping processing at a position of a pixel of interest of the received image data if: an object type to which it is determined that the pixel of interest belongs matches an object type for a pixel of interest specified by saved trapping processing execution conditions (S202); an object type to which it is determined that a surrounding pixel belongs matches an object type for a reference pixel specified by the saved trapping processing execution conditions (S203); a determined density of the pixel of interest falls within a first reference density range specified by the saved trapping processing execution conditions (S204); and a determined density of the surrounding pixel falls within a second reference density range specified by the saved trapping processing execution conditions (S205);
a trapping step (S206) of applying trapping processing to an object belonging to either of the color plates of the at least two colors of the received image data in accordance with a determination to perform trapping processing in the trapping determination step, wherein the trapping processing is for preventing boundary gaps in boundaries between objects formed by different color plates, by expanding an edge of at least one of the objects by a predetermined amount;
a step of obtaining screen image data by performing dither processing on the received image data to which the trapping processing has been applied;
an edge smoothing determination step (S302-S303) of determining to perform edge smoothing processing at a position of a pixel of interest of the screen image data if: an object type to which it is determined that the pixel of interest belongs (S302) matches an object type specified by saved edge smoothing processing execution conditions; and the determined density of the pixel of interest falls within a third reference density range specified by the edge smoothing processing execution conditions (S303);
an edge smoothing step (S304) of applying edge smoothing processing to an object belonging to either of the color plates of the at least two colors of the screen image data in accordance with a determination to perform edge smoothing processing in the edge smoothing determination step, wherein the edge smoothing processing is for smoothing jaggies of an edge of an object by interpolating pixels;
wherein the trapping processing and the edge smoothing processing are avoided from being doubly applied to one pixel by saving the trapping processing execution conditions and the edge smoothing processing execution conditions, which specify the first reference density range and the third reference density range so as not to overlap each other.
- A program which, when executed by a computer, causes the computer to carry out the method of claim 11.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009069009A (JP5060502B2) | 2009-03-19 | 2009-03-19 | Image processing apparatus and image processing apparatus control method
Publications (3)
Publication Number | Publication Date |
---|---|
EP2230830A2 (en) | 2010-09-22
EP2230830A3 (en) | 2010-12-22
EP2230830B1 (en) | 2017-05-24
Family
ID=42236651
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP10156958.0A (EP2230830B1, active) | Image processing apparatus and control method of image forming apparatus | 2009-03-19 | 2010-03-18
Country Status (5)
Country | Link |
---|---|
US (1) | US8416460B2 (en) |
EP (1) | EP2230830B1 (en) |
JP (1) | JP5060502B2 (en) |
KR (1) | KR101223865B1 (en) |
CN (1) | CN101841632B (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8417046B1 (en) | 2008-11-10 | 2013-04-09 | Marvell International Ltd. | Shadow and highlight image enhancement |
JP4991893B2 (en) | 2010-03-16 | 2012-08-01 | 常陽機械株式会社 | Method and apparatus for determining pass / fail of minute diameter wire bonding |
US8958636B1 (en) * | 2010-07-28 | 2015-02-17 | Marvell International Ltd. | Configurable color trapping |
JP5282800B2 (en) * | 2011-06-23 | 2013-09-04 | 富士ゼロックス株式会社 | Image processing apparatus and program |
JP6128827B2 (en) * | 2012-12-18 | 2017-05-17 | キヤノン株式会社 | Image processing apparatus, control method therefor, and program |
JP6186828B2 (en) * | 2013-04-17 | 2017-08-30 | コニカミノルタ株式会社 | Image processing apparatus, image processing apparatus control method, and image processing apparatus control program |
US9131182B2 (en) * | 2013-05-08 | 2015-09-08 | Canon Kabushiki Kaisha | Image processing apparatus, method and storage medium |
JP6381311B2 (en) * | 2013-07-04 | 2018-08-29 | キヤノン株式会社 | Image forming apparatus, image forming method, and program |
US9135535B1 (en) * | 2014-06-09 | 2015-09-15 | Xerox Corporation | Method and system for prorating trapping parameters globally with respect to object size |
JP6213517B2 (en) * | 2015-04-21 | 2017-10-18 | コニカミノルタ株式会社 | Image processing apparatus and image processing method |
JP6516224B2 (en) * | 2015-09-02 | 2019-05-22 | シャープ株式会社 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM |
JP2018074497A (en) | 2016-11-02 | 2018-05-10 | キヤノン株式会社 | Image processing device, image processing method and program |
JP6944300B2 (en) | 2017-08-01 | 2021-10-06 | キヤノン株式会社 | Image processing equipment, image processing methods, and programs |
JP7313879B2 (en) | 2019-04-08 | 2023-07-25 | キヤノン株式会社 | Image processing device, image processing method and program |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04137067A (en) | 1990-09-28 | 1992-05-12 | Tokyo Electric Co Ltd | Automatic ordering device |
US5581667A (en) | 1994-12-23 | 1996-12-03 | Xerox Corporation | Electronic trapping system for digitized text and images |
US5771318A (en) * | 1996-06-27 | 1998-06-23 | Siemens Corporate Research, Inc. | Adaptive edge-preserving smoothing filter |
JPH1028225A (en) | 1996-07-12 | 1998-01-27 | Ricoh Co Ltd | Mtf-correcting device |
JP2003087548A (en) * | 2001-09-12 | 2003-03-20 | Fuji Photo Film Co Ltd | Method for applying trapping condition |
JP4324359B2 (en) | 2002-10-04 | 2009-09-02 | 株式会社リコー | Image processing apparatus and image processing method |
JP2004237584A (en) * | 2003-02-06 | 2004-08-26 | Seiko Epson Corp | Image processing apparatus |
US7391536B2 (en) * | 2004-07-09 | 2008-06-24 | Xerox Corporation | Method for smooth trapping suppression of small graphical objects using color interpolation |
JP2006074274A (en) * | 2004-08-31 | 2006-03-16 | Fuji Xerox Co Ltd | Image processor and method for correcting image |
JP4137067B2 (en) * | 2005-03-04 | 2008-08-20 | キヤノン株式会社 | Image processing method and apparatus |
JP5017898B2 (en) | 2006-03-17 | 2012-09-05 | コニカミノルタビジネステクノロジーズ株式会社 | Image processing method, image forming apparatus, and computer program |
JP5020573B2 (en) | 2006-09-01 | 2012-09-05 | キヤノン株式会社 | Image processing apparatus and method |
JP4979357B2 (en) | 2006-12-04 | 2012-07-18 | キヤノン株式会社 | Image forming apparatus and control method thereof |
US7961354B2 (en) | 2006-12-18 | 2011-06-14 | Canon Kabushiki Kaisha | Image-forming apparatus and method for adjusting total amount of toner in trap |
JP5024817B2 (en) | 2007-04-09 | 2012-09-12 | キヤノン株式会社 | Image processing apparatus, image processing method, program thereof, and storage medium |
JP5078480B2 (en) * | 2007-07-23 | 2012-11-21 | キヤノン株式会社 | Image processing apparatus and method, and computer program and recording medium |
- 2009-03-19: Application filed in Japan as JP2009069009A (granted as JP5060502B2, active)
- 2010-03-02: Application filed in the United States as US12/716,143 (granted as US8416460B2, active)
- 2010-03-18: Application filed in China as CN201010139039.3 (granted as CN101841632B, active)
- 2010-03-18: Application filed in Europe as EP10156958.0A (granted as EP2230830B1, active)
- 2010-03-19: Application filed in Korea as KR1020100024586 (granted as KR101223865B1, active IP right grant)
Also Published As
Publication number | Publication date |
---|---|
KR20100105497A (en) | 2010-09-29 |
EP2230830A3 (en) | 2010-12-22 |
US8416460B2 (en) | 2013-04-09 |
CN101841632B (en) | 2014-06-11 |
JP5060502B2 (en) | 2012-10-31 |
JP2010226252A (en) | 2010-10-07 |
EP2230830A2 (en) | 2010-09-22 |
CN101841632A (en) | 2010-09-22 |
KR101223865B1 (en) | 2013-01-17 |
US20100238468A1 (en) | 2010-09-23 |