US9002103B2 - Image processing apparatus and computer readable medium - Google Patents
- Publication number
- US9002103B2 (application US12/725,056 / US72505610A)
- Authority
- US
- United States
- Prior art keywords
- image
- region
- concealing
- color region
- color
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/38—Circuits or arrangements for blanking or otherwise eliminating unwanted parts of pictures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0021—Image watermarking
- G06T1/0028—Adaptive watermarking, e.g. Human Visual System [HVS]-based watermarking
Definitions
- the present invention relates to an image processing apparatus and a computer readable medium.
- Disclosure is often carried out for documents owned by administrative organizations and the like. Nevertheless, in some cases, these documents contain information that is required to be concealed in view of protection of personal information and the like. Thus, when these documents are to be disclosed, parts containing such information that is required to be concealed are blacked out or the like.
- an image processing apparatus includes a definition reducing section, a color region extracting section, a concealing image generating section and an image combining section.
- the definition reducing section, based on a received image, generates an image having a lower definition than the received image.
- the color region extracting section extracts a color region from the low definition image generated by the definition reducing section.
- the concealing image generating section, based on the color region extracted by the color region extracting section, converts the color region into a concealing region for concealing a part of the received image, and generates a concealing image including the concealing region.
- the image combining section combines the received image with the concealing image generated by the concealing image generating section.
- FIG. 1 is a conceptual module configuration diagram for an exemplary configuration according to a first exemplary embodiment
- FIG. 2 is an explanation diagram showing an example of marker masking
- FIGS. 3A, 3B and 3C are explanation diagrams for showing an example of an image having undergone masking
- FIGS. 4A and 4B are explanation diagrams for showing an example of an image having undergone masking
- FIG. 5 is an explanation diagram showing an example of an image having undergone masking
- FIGS. 6A, 6B, 6C and 6D are explanation diagrams for showing an example of marker masking according to a first exemplary embodiment
- FIGS. 7A, 7B and 7C are explanation diagrams for showing an example of an image having undergone masking according to a first exemplary embodiment
- FIG. 8 is a conceptual module configuration diagram for an exemplary configuration according to a second exemplary embodiment
- FIG. 9 is an explanation diagram showing an example of a region that has been determined as a color region but that is not a masking region
- FIG. 10 is an explanation diagram showing an example of pixels to be determined as a masking region
- FIG. 11 is a conceptual module configuration diagram for an exemplary configuration according to a third exemplary embodiment.
- FIG. 12 is a configuration diagram showing an example of a hardware configuration of a computer for implementing first to third exemplary embodiments.
- FIG. 1 is a conceptual module configuration diagram for an exemplary configuration according to a first exemplary embodiment.
- in general, "module" indicates a component of software (a computer program), hardware, or the like that is logically separable from other parts.
- a module in the present exemplary embodiment indicates not only a module in a computer program but also a module in a hardware configuration. Accordingly, the present exemplary embodiment serves also as descriptions of a computer program, a system, and a method.
- the expressions "to store", "to cause something to store", and other equivalent expressions are used. In an exemplary embodiment of a computer program, these expressions indicate "to store into a storage device", "to perform control so as to store into a storage device", and the like.
- each module may be in one-to-one correspondence to a function.
- one module may be constructed from a single program.
- a plurality of modules may be constructed from a single program.
- a single module may be constructed from a plurality of programs.
- a plurality of modules may be executed by a single computer.
- a single module may be executed by a plurality of computers in a distributed or parallel computing environment.
- a module may contain another module.
- connection indicates a physical connection as well as a logical connection (data transfer, instructions, and reference relations between data, and the like).
- "system" and "apparatus" indicate a configuration constructed by connecting a plurality of computers, hardware pieces, apparatuses, and the like through communication means such as a network (including a one-to-one communication connection), as well as a configuration implemented by a single computer, hardware piece, apparatus, or the like.
- "apparatus" and "system" are used as synonyms of each other.
- the expression “defined in advance” indicates that something is defined before a processing of interest. This includes timing before the start of the entire processing according to the present exemplary embodiment, as well as timing even after the start of processing according to the present exemplary embodiment as long as it is before the start of a particular processing piece of interest. That is, the expression indicates that something is defined in accordance with a situation or a state at that time, or alternatively in accordance with a situation and a state until that time.
- the image processing apparatus processes an image having a region marked by a user, and based on the marked region, conceals (referred to as “to mask”, in some cases hereinafter) the partial region of the image.
- the image processing apparatus has an image receiving module 110 , a definition reducing module 120 , a color region extracting module 130 , a mask image generating module 140 , an attribute commonizing module 150 , a graying module 160 , an image combining module 170 , and an image output module 180 .
- the image receiving module 110 is connected to the definition reducing module 120 and the graying module 160 .
- the image receiving module 110 receives an image and then transfers the image to the definition reducing module 120 and the graying module 160 .
- the expression “to receive an image” indicates, for example, “to read an image through a scanner, a camera, or the like”, “to receive an image from an external device through a facsimile or the like via a communication line”, and “to read an image stored in a hard disk (one built in a computer or alternatively one connected via a network) or the like”.
- the image is a multi-valued image serving as a color image. A single image or a plurality of images may be received.
- the contents of the image are a document such as a report that is basically to be disclosed but contains a part to be made confidential. That is, the to-be-processed document is generated by a user marking a document, for example, by painting or encircling a region to be concealed with a color pen. This marking is performed with a color pen or the like of a translucent color other than black. Thus, even when the ink is applied over black characters, the characters remain clearly visible.
- This pen is of a kind called a marker pen, a highlighter pen, or the like.
- a part marked with a marker pen or the like is referred to as a marker region in some cases hereinafter.
- the marking may be performed with a color pen of opaque color such as a red ball-point pen. In this case, characters remain clearly visible as long as the target characters are encircled rather than painted.
- the image receiving module 110 transfers the image to the definition reducing module 120 and the graying module 160 . Then, the image transferred to the definition reducing module 120 is used for generating a mask image, while the image transferred to the graying module 160 is used for generating a clean copy of the original image in the part other than the marker region.
- the definition reducing module 120 is connected to the image receiving module 110 and the color region extracting module 130 . Based on an image received from the image receiving module 110 , the definition reducing module 120 generates an image having a lower definition than the received image. Then, the low definition image is transferred to the color region extracting module 130 .
- Employable methods of generating the image of low definition include: smoothing (such as feathering; more specifically, filtering such as equalization filtering and median filtering); the processing of reducing the definition (reduction processing); and the processing of performing irreversible compression at a high compression ratio and then performing enlargement.
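As an illustration of the above alternatives, the following is a minimal sketch of definition reduction, assuming Pillow is available; the function name `reduce_definition` and the fixed reduction factor and JPEG quality are assumptions for illustration, not taken from the patent.

```python
import io
from PIL import Image, ImageFilter

def reduce_definition(img: Image.Image, method: str = "smooth") -> Image.Image:
    """Return a lower-definition version of the received image."""
    if method == "smooth":
        # Smoothing (feathering); a median filter could be used instead.
        return img.filter(ImageFilter.BoxBlur(3))
    if method == "shrink":
        # Reduction processing: lower the resolution; the attribute commonizing
        # step described later can restore the resolution of the resulting mask.
        return img.resize((max(1, img.width // 4), max(1, img.height // 4)))
    if method == "jpeg":
        # Irreversible compression at a high compression ratio, then enlargement.
        buf = io.BytesIO()
        img.resize((max(1, img.width // 4), max(1, img.height // 4))).save(buf, "JPEG", quality=10)
        buf.seek(0)
        return Image.open(buf).resize(img.size)
    raise ValueError(f"unknown method: {method}")
```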
- the color region extracting module 130 is connected to the definition reducing module 120 and the mask image generating module 140 .
- the color region extracting module 130 extracts a color region from a low definition image generated by the definition reducing module 120 . Then, the extracted color region is transferred to the mask image generating module 140 .
- the color region indicates a region marked as a to-be-masked region by a user.
- the extraction of a color region is performed by extracting pixels having saturation greater than or equal to a value determined in advance.
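A sketch of the saturation-based extraction is shown below, assuming the received (or definition-reduced) image is given as an RGB numpy array; the threshold value is an assumption for illustration.

```python
import numpy as np

def extract_color_region(rgb: np.ndarray, sat_threshold: float = 0.25) -> np.ndarray:
    """Return a boolean mask of pixels whose HSV saturation is >= sat_threshold."""
    rgb = rgb.astype(np.float64) / 255.0
    cmax = rgb.max(axis=-1)
    cmin = rgb.min(axis=-1)
    # HSV saturation: (max - min) / max, taken as 0 for black pixels (max == 0).
    saturation = np.where(cmax > 0, (cmax - cmin) / np.maximum(cmax, 1e-12), 0.0)
    return saturation >= sat_threshold
```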
- the mask image generating module 140 is connected to the color region extracting module 130 and the attribute commonizing module 150. Based on a color region extracted by the color region extracting module 130, the mask image generating module 140 converts a part of the image received by the image receiving module 110 into a mask region, and generates a mask image including the mask region. Then, the mask image is transferred to the attribute commonizing module 150.
- the conversion to the mask region is performed by converting the color region into black or the like. Alternatively, when the color region has a shape surrounding a region, the surrounded region including the color region itself is converted into black.
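The following is one possible way to realize this conversion, assuming scipy is available; filling holes covers the case where the marking encircles a region rather than painting it out.

```python
import numpy as np
from scipy.ndimage import binary_fill_holes

def to_mask_region(color_region: np.ndarray) -> np.ndarray:
    """color_region: boolean array of extracted marker pixels."""
    # If the color region forms a closed loop, the surrounded area (including
    # the loop itself) is also concealed; for a painted region this is a no-op.
    return binary_fill_holes(color_region)

def render_mask_image(mask_region: np.ndarray) -> np.ndarray:
    """Render the mask image: black (0) in the mask region, white (255) elsewhere."""
    return np.where(mask_region, 0, 255).astype(np.uint8)
```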
- the attribute commonizing module 150 is connected to the mask image generating module 140 and the image combining module 170 .
- the attribute commonizing module 150 commonizes the attribute of the image.
- the commonized mask image is transferred to the image combining module 170 .
- the commonization indicates the processing of changing the attribute of one image so as to be equal to the attribute of the other image.
- the attribute of the image to be commonized includes the resolution, the color mode (such as monochrome, gray, and color), and the like. That is, in a case that the definition reducing module 120 has reduced the resolution, the resolution and the like are restored in this processing.
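A minimal sketch of this commonization, assuming Pillow; it simply matches the resolution and color mode of the mask image to those of the image it will be combined with.

```python
from PIL import Image

def commonize_attributes(mask_img: Image.Image, reference: Image.Image) -> Image.Image:
    """Match the resolution and color mode of mask_img to those of reference."""
    out = mask_img
    if out.size != reference.size:
        # NEAREST keeps the mask strictly black/white instead of adding gray edges.
        out = out.resize(reference.size, Image.NEAREST)
    if out.mode != reference.mode:
        out = out.convert(reference.mode)
    return out
```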
- the graying module 160 is connected to the image receiving module 110 and the image combining module 170 , and generates a clean copy of the color image received from the image receiving module 110 . Then, the image is transferred to the image combining module 170 .
- the clean copy generation includes gray image generation.
- the gray image generation indicates conversion of a color image into a monochrome image including a gray image. That is, for example, the image may be converted into a YCrCb image, and then its Y component alone may be extracted so that an image may be generated. Further, the clean copy generation processing may be binarization other than the gray image generation, or alternatively null processing may be performed (that is, the graying module 160 may be omitted).
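A sketch of the gray image generation via the luminance component, assuming Pillow (whose mode name for this color space is "YCbCr"); the binarization alternative is indicated in a comment.

```python
from PIL import Image

def to_gray(img: Image.Image) -> Image.Image:
    """Generate a clean gray copy by keeping only the Y (luminance) component."""
    y, cb, cr = img.convert("YCbCr").split()
    return y  # single-channel image (mode "L")

# Binarization instead of graying could be, for example:
#   binary = to_gray(img).point(lambda v: 255 if v >= 128 else 0).convert("1")
```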
- the image combining module 170 is connected to the attribute commonizing module 150 , the graying module 160 , and the image output module 180 , and combines the mask image generated by the attribute commonizing module 150 with the image received by the image receiving module 110 (including the image obtained by clean copy generation in the graying module 160 ). Then, the obtained image is transferred to the image output module 180 .
- the combination of images may be logical sum on a pixel basis, or alternatively may be conversion of the region of the received image corresponding to the black part of the mask image into white or the like.
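Both combination methods mentioned above can be sketched as follows, assuming the gray copy and the mask image are numpy arrays in which 0 is black and 255 is white.

```python
import numpy as np

def combine(gray: np.ndarray, mask_img: np.ndarray, blank_to_white: bool = False) -> np.ndarray:
    """Combine the clean gray copy with the mask image."""
    concealed = mask_img == 0  # black pixels of the mask image mark concealed areas
    if blank_to_white:
        # Convert the corresponding part of the received image into white.
        return np.where(concealed, 255, gray).astype(np.uint8)
    # Pixel-wise logical sum (treating black as 1): black out the concealed part.
    return np.where(concealed, 0, gray).astype(np.uint8)
```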
- the image output module 180 is connected to the image combining module 170 , and receives an image generated by the image combining module 170 and then outputs the image.
- the expression “to output an image” includes “to print data through a printing apparatus such as a printer”, “to display data onto a display apparatus such as a display device”, “to transmit an image through an image transmitting apparatus such as a facsimile machine”, “to write an image into an image storage device such as an image database”, “to store data into a storage medium such as a memory card”, and “to transfer data to another information processing apparatus”. Further, the image output module 180 may output data obtained by converting an image into document data in a PDF (Portable Document Format) or the like.
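For instance, the PDF output mentioned above could be produced directly with Pillow, which can write single-page PDFs for bilevel, gray, and RGB images; the file name here is an assumption.

```python
from PIL import Image

def output_as_pdf(combined: Image.Image, path: str = "to_be_disclosed.pdf") -> None:
    """Write the combined image out as a single-page PDF document."""
    combined.convert("L").save(path, "PDF")
```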
- FIG. 2 is an explanation diagram showing an example of marker masking.
- This figure shows an example of masking in a practical situation in the present exemplary embodiment. Specifically, this example is a document that is subject to information disclosure but contains personal information and the like. Thus, such information needs to be concealed.
- the document 200 is such an object document.
- Using a marker pen 205, a user paints a marker region 201 to be concealed.
- the original document 200 before the marking is not in color, and is composed of black, white, and gray parts.
- Check and confirmation 210 is performed by a person other than the user who marked the document 200. Since the ink color of the marker pen 205 is translucent, the check can be performed while the characters within the marker region 201 remain visible.
- the document 200 having undergone check and confirmation is read by a scanner.
- This read-out image is used in the present exemplary embodiment.
- the present exemplary embodiment performs marker masking 230 .
- a to-be-disclosed document 240 in which the marker region 201 is concealed is obtained.
- the to-be-disclosed document 240 undergoes printing or PDF-conversion 250 , and then goes into disclosure 260 .
- the concealed region in the to-be-disclosed document 240 is blacked out.
- the region may be converted into white (that is, characters and the like within the region may be deleted). Further, the painting may be performed in another color.
- FIGS. 3A, 3B and 3C are explanation diagrams for showing an example of an image having undergone masking.
- a marker region 310 a is painted with a marker pen by a user.
- the inside of the marker region 310 a is, for example, in fluorescent yellow.
- the marker region 310 a is changed into a black masking region 310 b in the present exemplary embodiment.
- the marker region 310 a is changed into a white masking region 310 c in the present exemplary embodiment.
- FIGS. 4A and 4B are explanation diagrams for showing an example of an image having undergone masking. In this example, a color region is simply extracted and then concealed without using the technique of the present exemplary embodiment.
- the regions 411 a , 412 a , and 413 a are marked parts, and hence are painted with a marker pen in three kinds of blue, respectively.
- the regions 421 a , 422 a , 423 a , 431 a , 432 a , and 433 a are regions of original image, and are in gray of six kinds of densities, respectively.
- a masking image 400 b illustrated in FIG. 4B is obtained as a result.
- the entirety of the regions 411a, 412a, and 413a is extracted as color regions and concealed in white. Nevertheless, although the region 421a and the like are original image parts and not color regions, these regions are whitened non-uniformly like the region 421b illustrated in FIG. 4B. This may be caused by sensitivity non-uniformity in the CCD (Charge Coupled Device) sensor, noise added by irreversible compression, and the like. That is, although a part not painted with a marker pen is intrinsically black, gray, or white, color regions are generated to a small extent. Such regions are then also concealed, so that a patchy image is obtained as illustrated in FIG. 4B.
- FIG. 5 is an explanation diagram showing an example of an image having undergone masking. Similarly to FIGS. 4A and 4B , in this example, a color region alone is extracted and then concealed without using the technique of the present exemplary embodiment.
- the original image 500 a illustrated in Part (a) of FIG. 5 is a document that contains characters and the like. Here, this example shows only a region in which painting with a marker pen is not performed.
- the region 510 b illustrated in Part (b) of FIG. 5 is enlargement of a part of the region 510 a in the original image 500 a of Part (a) of FIG. 5 .
- color other than black is generated for the above-mentioned reasons.
- FIGS. 6A, 6B, 6C and 6D are explanation diagrams for showing an example of marker masking according to the first exemplary embodiment.
- the image receiving module 110 receives the original image 600 a illustrated in FIG. 6A .
- the original image 600a contains a painted marker region 610a and an encircled marker region 620a.
- the marker regions 610 a and 620 a are in translucent blue.
- the graying module 160 converts the original image 600 a into a graying-processed image 600 b illustrated in FIG. 6B .
- the graying-processed image 600 b has a marker region 610 b and a marker region 620 b whose color has been changed into gray.
- From the marker regions 610a and 620a, which are the color regions extracted by the color region extracting module 130, the mask image generating module 140 generates a mask image 600c illustrated in FIG. 6C. That is, from the marker region 610a, a masking region 610c in which the translucent blue has been converted into black is generated. Further, from the marker region 620a, a masking region 620c in which the region surrounded by a translucent blue line has been converted into black is generated. After that, the attribute commonizing module 150 commonizes the attributes (resolution and the like) of the mask image 600c so that combination with the graying-processed image 600b is possible. Then, the image combining module 170 combines the images 600b and 600c into a masking image 600d shown in FIG. 6D.
- the image output module 180 outputs the masking image 600 d illustrated in FIG. 6D .
- the masking regions 610 d and 620 d are masked by the corresponding marker regions 610 a and 620 a.
- FIGS. 7A, 7B and 7C are explanation diagrams for showing an example of an image having undergone masking according to the first exemplary embodiment.
- FIGS. 7A and 7B correspond to FIGS. 4A and 4B , respectively.
- the result of the processing of the present exemplary embodiment performed on the original image 400a illustrated in FIG. 7A is the masking image 700c illustrated in FIG. 7C. That is, if masking were simply performed on a color region, a patchy image would be obtained even in regions not painted with a marker pen, as in the masking image 400b illustrated in FIG. 7B.
- the regions 411 a , 412 a , and 413 a painted with a marker pen are masked like the regions 711 c , 712 c , and 713 c .
- the regions 421 a , 422 a , 423 a , 431 a , 432 a , and 433 a of the original image part remain intact like the regions 721 c , 722 c , 723 c , 731 c , 732 c , and 733 c.
- FIG. 8 is a conceptual module configuration diagram for an exemplary configuration according to a second exemplary embodiment.
- like modules to those in the first exemplary embodiment are designated by like numerals, and hence duplicated description is omitted.
- the modules described below add operation and functions to those in the first exemplary embodiment, or alternatively replace their operation and functions.
- the second exemplary embodiment comprises: an image receiving module 110, a color region extracting module 815, a noise/marker determination module 820, a noise removing module 825, an isolated point removing module 830, a mask image generating module 140, an attribute commonizing module 150, a graying module 160, an image combining module 170, and an image output module 180.
- the image receiving module 110 is connected to the color region extracting module 815 and the graying module 160 , and transfers a received image to the color region extracting module 815 and the graying module 160 .
- the image transferred to the color region extracting module 815 is used for generating a masking image.
- the image transferred to the graying module 160 is used for generating a clean copy of the original image in the part other than the marker region.
- the color region extracting module 815 is connected to the image receiving module 110 and the noise/marker determination module 820 , and extracts a color region from an image received by the image receiving module 110 . Then, the extracted color region is transmitted to the noise/marker determination module 820 .
- the color region indicates a region marked as a to-be-masked region by a user.
- the extraction of a color region is performed by extracting pixels having saturation greater than or equal to a value determined in advance. At this stage, in some cases, regions other than a region marked by a marker pen or the like are extracted also as color regions.
- the noise/marker determination module 820 is connected to the color region extracting module 815 and the noise removing module 825 . Then, based on the lightness of the color region extracted by the color region extracting module 815 , the lightness of the peripheral region of the color region extracted by the color region extracting module 815 , or the hue of the color region extracted by the color region extracting module 815 , the noise/marker determination module 820 determines whether the color region is to be adopted as a mask region for masking a part of the image. Then, the determination result is transferred to the noise removing module 825 .
- the lightness of a region is calculated from the lightness values of the pixels within the region, and is expressed by the average, the mode, or the like of the lightness values of the pixels.
- an employable method of determination is comparison with a value defined in advance. In the comparison between the lightness and the value defined in advance, it suffices to determine that a dark region is not a masking region. In a case that the determination is performed based on the hue of the color region, the determination may be based, for example, on comparison between a deviation (variation) in the hue within the color region and a value defined in advance. This is because an intrinsic marker region is of a single color, whereas a region that is a color region but not a marker region frequently contains plural colors.
- In a case that the noise/marker determination module 820 performs determination based on the lightness of the peripheral region of the color region, the determination may be based on comparison between the lightness and a value corresponding to the resolution of the received image or to a pixel group in the peripheral region. This is because, in a case that a pixel group such as characters is marked with a marker pen or the like, the region might be determined as not a masking region by the above-mentioned determination. The above-mentioned comparison is employed for avoiding such a determination. That is, the peripheral region is generated by expanding the color region by a number of pixels defined in advance.
- the number of pixels for expansion is adjusted in accordance with the resolution. Further, the size and the density of the pixel group are measured so that the influence of characters mixed into the peripheral region is predicted in advance. Then, from the peripheral region, pixels of low lightness such as characters are removed up to a number defined in advance. After that, the lightness may be calculated from the remaining background and marker region. Alternatively, pixels of a lightness lower than or equal to a threshold value defined in advance may be extracted in each peripheral region, and then the overall lightness may be calculated after those pixels are removed.
- the noise/marker determination module 820 may perform determination based on a combination of any two or more of the lightness of the color region, the lightness of the peripheral region of the color region, and the hue of the color region.
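One way to combine these criteria is sketched below, assuming numpy and scipy, with lightness taken as the gray value (0 to 255) and hue scaled to [0, 1]; all threshold values, the ring width, and the AND-combination of the three checks are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def is_marker_region(region: np.ndarray, lightness: np.ndarray, hue: np.ndarray,
                     light_thresh: float = 96.0, hue_dev_thresh: float = 0.08,
                     ring_width: int = 3) -> bool:
    """Decide whether one extracted color region (boolean mask) is a marker region."""
    # (1) Lightness of the region itself: a dark region is likely coloring on
    #     black characters or lines, not a translucent marker over white paper.
    if lightness[region].mean() < light_thresh:
        return False
    # (2) Hue deviation: an intrinsic marker region is close to a single color.
    if hue[region].std() > hue_dev_thresh:
        return False
    # (3) Lightness of the peripheral region, obtained by expanding the color
    #     region by a few pixels; dropping dark pixels (characters) before
    #     averaging, as described above, is omitted here for brevity.
    ring = binary_dilation(region, iterations=ring_width) & ~region
    if ring.any() and lightness[ring].mean() < light_thresh:
        return False
    return True
```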
- FIG. 9 is an explanation diagram showing an example of a region that has been determined as a color region but that is not a masking region.
- Each square shown in FIG. 9 indicates a pixel.
- the region 910 surrounded by black pixels is an example of a region that is not a marker region but has been extracted as a color region. Even in a case that all pixels are intrinsically black, in some cases the pixels become colored like this owing to sensitivity non-uniformity in the sensor or the like.
- the lightness of the pixels in the region 910 is low because these pixels are intrinsically black.
- the noise/marker determination module 820 determines that pixels darker than a value defined in advance do not constitute a masking region.
- a region marked with a marker pen or the like usually lies over a white area, whereas a darker region that has been extracted as a color region is frequently not a marker region.
- FIG. 10 is an explanation diagram showing an example of pixels to be determined as a masking region.
- Each square shown in FIG. 10 indicates a pixel.
- the region 1010 surrounded by white pixels is an example of a region extracted as a color region.
- the region 1010 intrinsically belongs to the surrounding color region, but is separated like this owing to irreversible compression and the like.
- the lightness of the pixels in the region 1010 is high because the ink color of the marker pen or the like is translucent and the intrinsic color of the background is white.
- the noise/marker determination module 820 determines that pixels brighter than a value defined in advance constitute a masking region.
- a region marked with a marker pen or the like usually lies over a white area.
- since the pixels in the peripheral region of the region 1010 are brighter than a value defined in advance, it is determined that the region 1010 is a masking region.
- the noise removing module 825 is connected to the noise/marker determination module 820 and the isolated point removing module 830, and removes the regions determined to be noise by the noise/marker determination module 820 (that is, the color regions determined not to be adopted for the mask image). As a result, only marker regions are maintained. Then, each marker region is transferred to the isolated point removing module 830.
- the isolated point removing module 830 is connected to the noise removing module 825 and the mask image generating module 140, and removes isolated points from the color regions determined to be adopted for the mask image by the noise/marker determination module 820 (that is, the color regions from which noise has been removed by the noise removing module 825). Then, the marker regions from which isolated points have been removed are transferred to the mask image generating module 140.
- as for this processing, when coloring occurs on a black character or a line and the noise/marker determination module 820 performs determination based on the hue, no variation in the hue arises in some cases if the extracted color region is small.
- the processing is employed in order to remove such small regions (isolated points). Accordingly, when the noise/marker determination module 820 performs determination based on the lightness of a color region, the processing performed by the isolated point removing module 830 may be omitted.
- the isolated point indicates a color region having an area smaller than or equal to a value defined in advance.
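Removal of such isolated points can be sketched as a connected-component area filter, assuming scipy; the area threshold is an assumption for illustration.

```python
import numpy as np
from scipy.ndimage import label

def remove_isolated_points(color_region: np.ndarray, max_isolated_area: int = 16) -> np.ndarray:
    """Drop connected components whose area is smaller than or equal to the threshold."""
    labels, n = label(color_region)
    if n == 0:
        return color_region
    areas = np.bincount(labels.ravel())
    keep = np.zeros(n + 1, dtype=bool)
    keep[1:] = areas[1:] > max_isolated_area  # small components are isolated points
    return keep[labels]
```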
- the mask image generating module 140 is connected to the isolated point removing module 830 and the attribute commonizing module 150 .
- the mask image generating module 140 converts the color region determined to be adopted as a mask region by the noise/marker determination module 820 into a mask region, and generates the mask image including the mask region.
- the mask image generating module 140 converts the color region where isolated points have been removed by the isolated point removing module 830 into a mask region, and generates a mask image including the mask region.
- FIG. 11 is a conceptual module configuration diagram for an exemplary configuration according to a third exemplary embodiment.
- the third exemplary embodiment is a combination of the first exemplary embodiment and the second exemplary embodiment.
- like modules to those in the first and the second exemplary embodiments are designated by like numerals, and hence duplicated description is omitted.
- the modules described below add operation and functions to those in the first and the second exemplary embodiments, or alternatively replace their operation and functions.
- the third exemplary embodiment includes: an image receiving module 110 , a definition reducing module 120 , a color region extracting module 130 , a noise/marker determination module 820 , a noise removing module 825 , an isolated point removing module 830 , a mask image generating module 140 , an attribute commonizing module 150 , a graying module 160 , an image combining module 170 , and an image output module 180 .
- the color region extracting module 130 is connected to the definition reducing module 120 and the noise/marker determination module 820 .
- the color region extracting module 130 extracts a color region from a low definition image generated by the definition reducing module 120 . Then, the extracted color region is transmitted to the noise/marker determination module 820 .
- the noise/marker determination module 820 is connected to the color region extracting module 130 and the noise removing module 825 . Then, based on the lightness of the color region extracted by the color region extracting module 130 , the lightness of the peripheral region of the color region extracted by the color region extracting module 130 , or the hue of the color region extracted by the color region extracting module 130 , the noise/marker determination module 820 determines whether the color region is to be adopted as a mask region for masking a part of the image.
- An example of the hardware configuration of the image processing apparatus of the above-mentioned exemplary embodiments is described below with reference to FIG. 12.
- the configuration shown in FIG. 12 is constructed from a personal computer (PC) or the like.
- This hardware configuration has: a data reading section 1217 such as a scanner; and a data output section 1218 such as a printer.
- the CPU (Central Processing Unit) 1201 is a control section for executing processing according to a computer program that describes the execution sequence of the various modules described in the above-mentioned exemplary embodiments, that is, the definition reducing module 120, the color region extracting module 130, the mask image generating module 140, the attribute commonizing module 150, the graying module 160, the image combining module 170, the color region extracting module 815, the noise/marker determination module 820, the noise removing module 825, the isolated point removing module 830, and the like.
- the ROM (Read Only Memory) 1202 stores programs, calculation parameters, and the like used by the CPU 1201 .
- the RAM (Random Access Memory) 1203 stores: programs used in the execution by the CPU 1201 ; parameters that vary in accordance with the execution; and the like. These units are connected to each other through a host bus 1204 constructed from a CPU bus and the like.
- the host bus 1204 is connected through the bridge 1205 to the external bus 1206 such as a PCI (Peripheral Component Interconnect/Interface) bus.
- the keyboard 1208 and the pointing device 1209 are input devices operated by an operator.
- the display 1210 is constructed from a liquid crystal display or a CRT (Cathode Ray Tube), and displays various kinds of information in the form of a text and image information.
- the HDD (Hard Disk Drive) 1211 contains a hard disk, and drives the hard disk so as to record or reproduce programs executed by the CPU 1201 and related information.
- the hard disk stores images, mask images, and the like having been received. Further, various kinds of computer programs, like various kinds of data processing programs other than those described above, are stored.
- the drive 1212 reads out data or a program recorded on the presently-attached removable recording medium 1213 such as a magnetic disk, an optical disk, a magneto-optical disk, and a semiconductor memory. Then, the data or the program is provided to the RAM 1203 connected through the interface 1207 , the external bus 1206 , the bridge 1205 , and the host bus 1204 .
- the removable recording medium 1213 may be used also as a data recording region similarly to the hard disk.
- the connection port 1214 is a port to which an external connection device 1215 is connected, and has connection sections of USB, IEEE 1394, and the like.
- the connection port 1214 is connected to the CPU 1201 and the like through the interface 1207 , the external bus 1206 , the bridge 1205 , the host bus 1204 , and the like.
- the communication section 1216 is connected to the network, and executes data communication with the outside.
- the data reading section 1217 is constructed from a scanner or the like, and executes document reading.
- the data output section 1218 is constructed from a printer or the like, and executes document data output.
- the hardware configuration of the image processing apparatus shown in FIG. 12 is illustrative.
- the above-mentioned exemplary embodiment is not limited to the configuration shown in FIG. 12 , and may be another one as long as the modules described in the above-mentioned exemplary embodiment are implemented.
- a part of the modules may be constructed from dedicated hardware (such as an application specific integrated circuit (ASIC)).
- a mode may be employed that a part of the modules are located in an external system and connected through a communication line.
- a plurality of systems like that shown in FIG. 12 may be connected to each other through a communication line, and operate in a cooperative manner.
- the present configuration may be incorporated in a copying machine, a facsimile machine, a scanner, a printer, a combined machine (an image processing apparatus having any two or more functions of a scanner, a printer, a copying machine, a facsimile machine, and the like), or the like.
- a module in an exemplary embodiment may be incorporated into another exemplary embodiment, or may replace a module in another exemplary embodiment.
- a technique described in the section of background art may be employed as the contents of processing of a module.
- the expressions “greater than or equal to”, “smaller (lower) than or equal to”, “greater than”, and “smaller (lower) than” may be replaced by “greater than”, “smaller (lower) than”, “greater than or equal to”, and “smaller (lower) than or equal to”, as long as conflict does not arise in the combination.
- the value defined in advance in each determination may be independent of each other. Thus, these values may be different from each other, or alternatively may be the same.
- each program described above may be regarded as an invention in a “computer-readable recording medium that carries a program”.
- a “computer-readable recording medium that carries a program” indicates a computer-readable recording medium that carries a program and is used for installation and execution of a program, circulation of a program, or the like.
- employable recording media include: a digital versatile disk (DVD) such as a DVD-R, a DVD-RW, and a DVD-RAM according to the standard set forth by the DVD Forum; a DVD+R, a DVD+RW, and the like according to the standard set forth by the DVD+RW Alliance; compact disks (CDs) such as a CD read-only memory (CD-ROM), a CD recordable (CD-R), and a CD rewritable (CD-RW); a Blu-ray Disc (registered trademark); a magneto-optical disk (MO); a flexible disk (FD); a magnetic tape; a hard disk; a read-only memory (ROM); an electrically erasable and programmable read-only memory (EEPROM); a flash memory; and a random access memory (RAM).
- the programs described above or a part of them may be saved or circulated in the form of being recorded on the recording medium.
- the programs may be transmitted by communication through a transmission medium like a wired network, a wireless communication network, or a combination of these which is used in a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), the Internet, an intranet, an extranet, or the like.
- the programs may be transmitted on carrier waves.
- each program described above may be a part of another program, or alternatively may be recorded on a recording medium together with other programs. Further, each program may be divided and recorded on a plurality of recording media. Furthermore, any recording mode such as compression and encryption may be employed as long as reproduction is available.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Editing Of Facsimile Originals (AREA)
- Image Processing (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009188321A JP5304529B2 (en) | 2009-08-17 | 2009-08-17 | Image processing apparatus and image processing program |
JP2009-188321 | 2009-08-17 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20110038533A1 US20110038533A1 (en) | 2011-02-17 |
US9002103B2 true US9002103B2 (en) | 2015-04-07 |
Family
ID=43588629
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/725,056 Active 2032-04-23 US9002103B2 (en) | 2009-08-17 | 2010-03-16 | Image processing apparatus and computer readable medium |
Country Status (2)
Country | Link |
---|---|
US (1) | US9002103B2 (en) |
JP (1) | JP5304529B2 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5738259B2 (en) * | 2012-10-29 | 2015-06-17 | 株式会社沖データ | Image reading apparatus and image forming apparatus |
JP2017016527A (en) * | 2015-07-03 | 2017-01-19 | 富士ゼロックス株式会社 | Information processing apparatus and program |
JP7099272B2 (en) | 2018-11-19 | 2022-07-12 | 富士通株式会社 | Information processing equipment, network system and teaming program |
JP7293908B2 (en) | 2019-06-25 | 2023-06-20 | 株式会社リコー | Image processing device, program and latent image embedding method |
JP2020025336A (en) * | 2019-11-11 | 2020-02-13 | 富士ゼロックス株式会社 | Information processing system and program |
Citations (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5113251A (en) * | 1989-02-23 | 1992-05-12 | Fuji Xerox Co. | Editing control system and area editing system for image processing equipment |
US5132786A (en) * | 1989-02-27 | 1992-07-21 | Fuji Xerox Co., Ltd. | Color converting system for image processing equipment |
JPH04335481A (en) * | 1991-05-10 | 1992-11-24 | Fujitsu Ltd | Method and device for extracting area for color image |
JPH10228536A (en) | 1997-02-17 | 1998-08-25 | Canon Inc | Image processing method |
US6125213A (en) | 1997-02-17 | 2000-09-26 | Canon Kabushiki Kaisha | Image processing method, an image processing apparatus, and a storage medium readable by a computer |
JP2000350021A (en) | 1999-06-04 | 2000-12-15 | Ricoh Co Ltd | Digital image processor |
US20020141640A1 (en) * | 2001-02-09 | 2002-10-03 | Walter Kraft | Local digital image property control with masks |
US6532020B1 (en) * | 1992-12-23 | 2003-03-11 | Microsoft Corporation | Luminance sensitive palette |
US6757428B1 (en) * | 1999-08-17 | 2004-06-29 | National Instruments Corporation | System and method for color characterization with applications in color measurement and color matching |
US6760125B1 (en) * | 1999-04-06 | 2004-07-06 | Seiko Epson Corporation | Image processing method and device |
US20050041036A1 (en) * | 2003-08-18 | 2005-02-24 | International Business Machines Corporation | Blending multiple images for high resolution display |
US20050135676A1 (en) * | 2003-11-11 | 2005-06-23 | Fuji Photo Film Co., Ltd. | Document processor |
JP2005217599A (en) | 2004-01-28 | 2005-08-11 | Oki Electric Ind Co Ltd | Image display apparatus and image display method |
US20050180645A1 (en) * | 2004-01-19 | 2005-08-18 | Fumihiro Hasegawa | Image processing apparatus, image processing program, and storage medium |
JP2005316581A (en) | 2004-04-27 | 2005-11-10 | Konica Minolta Photo Imaging Inc | Image processing method, image processor and image processing program |
JP2006262050A (en) | 2005-03-17 | 2006-09-28 | Konica Minolta Business Technologies Inc | Image processor, color image forming apparatus, and image processing method |
US20060290957A1 (en) * | 2005-02-18 | 2006-12-28 | Samsung Electronics Co., Ltd. | Apparatus, medium, and method with automatic white balance control |
US20070183679A1 (en) * | 2004-02-05 | 2007-08-09 | Vodafone K.K. | Image processing method, image processing device and mobile communication terminal |
US7266250B2 (en) * | 2000-01-19 | 2007-09-04 | Xerox Corporation | Methods for generating anti-aliased text and line graphics in compressed document images |
US20080137907A1 (en) * | 2005-04-14 | 2008-06-12 | Bernhard Berlin | Method and arrangement for recognizing objects in mail item images, their position and reading their postal information |
US20100008585A1 (en) * | 2008-07-10 | 2010-01-14 | Fuji Xerox Co., Ltd. | Image processing apparatus, image processing method, computer-readable medium and computer data signal |
US20100008533A1 (en) * | 2008-07-10 | 2010-01-14 | Fuji Xerox Co., Ltd. | Image processing system and computer readable medium |
US20100134410A1 (en) * | 2006-10-02 | 2010-06-03 | Isao Tomisawa | Image display device |
- 2009-08-17: JP application JP2009188321A, granted as patent JP5304529B2, status not active (Expired - Fee Related)
- 2010-03-16: US application US12/725,056, granted as patent US9002103B2, status active (Active)
Patent Citations (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5113251A (en) * | 1989-02-23 | 1992-05-12 | Fuji Xerox Co. | Editing control system and area editing system for image processing equipment |
US5132786A (en) * | 1989-02-27 | 1992-07-21 | Fuji Xerox Co., Ltd. | Color converting system for image processing equipment |
JPH04335481A (en) * | 1991-05-10 | 1992-11-24 | Fujitsu Ltd | Method and device for extracting area for color image |
US6532020B1 (en) * | 1992-12-23 | 2003-03-11 | Microsoft Corporation | Luminance sensitive palette |
JPH10228536A (en) | 1997-02-17 | 1998-08-25 | Canon Inc | Image processing method |
US6125213A (en) | 1997-02-17 | 2000-09-26 | Canon Kabushiki Kaisha | Image processing method, an image processing apparatus, and a storage medium readable by a computer |
US6760125B1 (en) * | 1999-04-06 | 2004-07-06 | Seiko Epson Corporation | Image processing method and device |
US6804395B1 (en) | 1999-06-04 | 2004-10-12 | Ricoh Company, Ltd. | Image separating apparatus with black isolation point removal of a character area |
JP2000350021A (en) | 1999-06-04 | 2000-12-15 | Ricoh Co Ltd | Digital image processor |
US6757428B1 (en) * | 1999-08-17 | 2004-06-29 | National Instruments Corporation | System and method for color characterization with applications in color measurement and color matching |
US7266250B2 (en) * | 2000-01-19 | 2007-09-04 | Xerox Corporation | Methods for generating anti-aliased text and line graphics in compressed document images |
US20020141640A1 (en) * | 2001-02-09 | 2002-10-03 | Walter Kraft | Local digital image property control with masks |
US6954549B2 (en) * | 2001-02-09 | 2005-10-11 | Gretag Imaging Trading Ag | Local digital image property control with masks |
US20050041036A1 (en) * | 2003-08-18 | 2005-02-24 | International Business Machines Corporation | Blending multiple images for high resolution display |
US20050135676A1 (en) * | 2003-11-11 | 2005-06-23 | Fuji Photo Film Co., Ltd. | Document processor |
US20050180645A1 (en) * | 2004-01-19 | 2005-08-18 | Fumihiro Hasegawa | Image processing apparatus, image processing program, and storage medium |
JP2005217599A (en) | 2004-01-28 | 2005-08-11 | Oki Electric Ind Co Ltd | Image display apparatus and image display method |
US20070183679A1 (en) * | 2004-02-05 | 2007-08-09 | Vodafone K.K. | Image processing method, image processing device and mobile communication terminal |
US7864198B2 (en) * | 2004-02-05 | 2011-01-04 | Vodafone Group Plc. | Image processing method, image processing device and mobile communication terminal |
JP2005316581A (en) | 2004-04-27 | 2005-11-10 | Konica Minolta Photo Imaging Inc | Image processing method, image processor and image processing program |
US20060290957A1 (en) * | 2005-02-18 | 2006-12-28 | Samsung Electronics Co., Ltd. | Apparatus, medium, and method with automatic white balance control |
JP2006262050A (en) | 2005-03-17 | 2006-09-28 | Konica Minolta Business Technologies Inc | Image processor, color image forming apparatus, and image processing method |
US20080137907A1 (en) * | 2005-04-14 | 2008-06-12 | Bernhard Berlin | Method and arrangement for recognizing objects in mail item images, their position and reading their postal information |
US7480394B2 (en) * | 2005-04-14 | 2009-01-20 | Siemens Aktiengesellschaft | Method and arrangement for recognizing objects in mail item images, their position and reading their postal information |
US20100134410A1 (en) * | 2006-10-02 | 2010-06-03 | Isao Tomisawa | Image display device |
US20100008585A1 (en) * | 2008-07-10 | 2010-01-14 | Fuji Xerox Co., Ltd. | Image processing apparatus, image processing method, computer-readable medium and computer data signal |
US20100008533A1 (en) * | 2008-07-10 | 2010-01-14 | Fuji Xerox Co., Ltd. | Image processing system and computer readable medium |
Non-Patent Citations (1)
Title |
---|
Notification of Reasons for Refusal dated Mar. 12, 2013 from Japanese Patent Application No. 2009-188321 (with English-language translation). |
Also Published As
Publication number | Publication date |
---|---|
US20110038533A1 (en) | 2011-02-17 |
JP5304529B2 (en) | 2013-10-02 |
JP2011041119A (en) | 2011-02-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8213748B2 (en) | Generating an electronic document with reference to allocated font corresponding to character identifier from an image | |
JP4577421B2 (en) | Image processing apparatus and image processing program | |
US8320673B2 (en) | Image processing apparatus, image processing method and computer-readable medium | |
US9002103B2 (en) | Image processing apparatus and computer readable medium | |
US11973903B2 (en) | Image processing system and image processing method with determination, for each of divided areas, as to which of read image data or original image data is used in correcting original image data | |
US11818316B2 (en) | Image processing apparatus and method for embedding specific information based on type of designated printing apparatus | |
US8310692B2 (en) | Image processing apparatus, image processing method, computer-readable medium and computer data signal | |
US9277074B2 (en) | Image processing apparatus, method, and medium determining whether image data of a page to be processed is blank and contains a foreground object and transmitting the foreground object obtained by removing a background object | |
US8503774B2 (en) | Apparatus, method and computer readable medium for performing solid-line conversion from lines having breaks | |
JP7048275B2 (en) | Image processing equipment | |
US11706365B2 (en) | Image processing apparatus that adds information indicating a copy is of an authentic document | |
JP2007068127A (en) | Image reproduction device, image reproduction method, program, and recording medium | |
JP5262778B2 (en) | Image processing apparatus and image processing program | |
US8390907B2 (en) | Image-processing device, image-forming device, image-processing method, and computer readable medium | |
JP2009060216A (en) | Image processor, and image processing program | |
JP2002252773A (en) | Color image processing method and device, and recording medium | |
JP6135349B2 (en) | Image processing apparatus and image processing program | |
JP2005094563A (en) | Image processing system | |
JP2010278710A (en) | Image processing apparatus and image processing program | |
JP2005072858A (en) | Image processing apparatus | |
JP2005311940A (en) | Apparatus, method and program for image processing | |
JPH10210305A (en) | Digital color-copying machine |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FUJI XEROX CO., LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SAITO, TERUKA;REEL/FRAME:024090/0940 Effective date: 20100312 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: FUJIFILM BUSINESS INNOVATION CORP., JAPAN Free format text: CHANGE OF NAME;ASSIGNOR:FUJI XEROX CO., LTD.;REEL/FRAME:058287/0056 Effective date: 20210401 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |