WO2005071939A1 - Unauthorized copy preventing device and method thereof, and program - Google Patents
Unauthorized copy preventing device and method thereof, and program
- Publication number
- WO2005071939A1 PCT/JP2005/001176 JP2005001176W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pattern
- photographic paper
- lumps
- image
- unit
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03G—ELECTROGRAPHY; ELECTROPHOTOGRAPHY; MAGNETOGRAPHY
- G03G21/00—Arrangements not provided for by groups G03G13/00 - G03G19/00, e.g. cleaning, elimination of residual charge
- G03G21/04—Preventing copies being made of an original
- G03G21/046—Preventing copies being made of an original by discriminating a special original, e.g. a bank note
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00838—Preventing unauthorised reproduction
- H04N1/0084—Determining the necessity for prevention
- H04N1/00843—Determining the necessity for prevention based on recognising a copy prohibited original, e.g. a banknote
- H04N1/00846—Determining the necessity for prevention based on recognising a copy prohibited original, e.g. a banknote based on detection of a dedicated indication, e.g. marks or the like
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00838—Preventing unauthorised reproduction
- H04N1/00856—Preventive measures
- H04N1/00875—Inhibiting reproduction, e.g. by disabling reading or reproduction apparatus
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10T—TECHNICAL SUBJECTS COVERED BY FORMER US CLASSIFICATION
- Y10T428/00—Stock material or miscellaneous articles
- Y10T428/24—Structurally defined web or sheet [e.g., overall dimension, etc.]
- Y10T428/24802—Discontinuous or differential coating, impregnation or bond [e.g., artwork, printing, retouched photograph, etc.]
- Y10T428/24893—Discontinuous or differential coating, impregnation or bond [e.g., artwork, printing, retouched photograph, etc.] including particulate material
- Y10T428/24901—Discontinuous or differential coating, impregnation or bond [e.g., artwork, printing, retouched photograph, etc.] including particulate material including coloring matter
Definitions
- the present invention relates to an apparatus for preventing unauthorized duplication, a method thereof, and a program, and is suitably applied to a case where unauthorized duplication of the content printed on paper is prevented.
- paper is used as a printing target for various contents
- printing paper on which content (hereinafter referred to as print content) is printed often has high value because it functions as various media: a commodity-exchange medium such as money, a content-certifying medium such as a certificate, or an information-storage medium such as a personal work.
- an object of the present invention is to propose an apparatus for preventing unauthorized duplication, a method and a program capable of appropriately protecting printed contents.
- an unauthorized duplication preventing apparatus for preventing improper duplication of print content printed on photographic paper comprises: acquiring means for acquiring pattern information based on a pattern included in the photographic paper; storage means for storing the pattern information acquired by the acquiring means on the photographic paper; and verification means for verifying the validity of the photographic paper based on the pattern information stored by the storage means.
- an unauthorized duplication preventing method for preventing improper duplication of print content printed on photographic paper comprises: a first step of acquiring pattern information based on a pattern included in the photographic paper; a second step of storing the acquired pattern information on the photographic paper; and a third step of verifying the validity of the photographic paper based on the stored pattern information.
- an unauthorized duplication preventing apparatus for preventing improper duplication of print content printed on photographic paper comprises: image pickup means for imaging a pattern included in the photographic paper; extracting means for dividing the pattern in the pattern image obtained by the image pickup means into a plurality of areas and extracting each divided area as pattern information representing a predetermined shape; storage means for storing the pattern information extracted by the extracting means on the photographic paper; and verification means for verifying the validity of the photographic paper based on the pattern information stored by the storage means.
- the corresponding method comprises: a first step of imaging a pattern included in the photographic paper; a second step of dividing the pattern in the pattern image obtained as a result of the imaging into a plurality of regions and extracting each divided region as pattern information representing a predetermined shape; a third step of storing the extracted pattern information on the photographic paper; and a fourth step of verifying the validity of the photographic paper based on the stored pattern information.
- in the corresponding program, a second process of extracting each of the divided areas as pattern information, a third process of storing the extracted pattern information on the photographic paper, and a fourth process of verifying the validity of the photographic paper based on the pattern information stored on the paper are established.
- an unauthorized duplication preventing apparatus comprises: image pickup means for imaging a pattern included in the photographic paper; extraction means for extracting the features of the pattern image obtained by the image pickup means; storage means for storing the features extracted by the extraction means on the photographic paper; and verification means for reconstructing the pattern image based on the features stored by the storage means and verifying the validity of the photographic paper based on the reconstructed pattern image.
- the corresponding method for preventing unauthorized duplication of print content printed on photographic paper comprises: a first step of capturing an image of the pattern on the photographic paper; a second step of extracting the features of the pattern image obtained as a result of the imaging; a third step of storing the extracted features on the photographic paper; and a fourth step of reconstructing the pattern image based on the stored features and verifying the validity of the photographic paper based on the reconstructed pattern image.
- in the corresponding program, a fourth process of verifying the validity of the photographic paper based on the reconstructed pattern image is executed.
- an unauthorized duplication preventing apparatus for preventing improper duplication of print content printed on photographic paper comprises: dividing means for dividing a pattern included in the photographic paper into regions of a predetermined unit; extracting means for determining, for each region divided by the dividing means, a plurality of points that generate a curve approximating the contour based on points on the contour of the region, and extracting these points as pattern information; storage means for storing the pattern information extracted by the extracting means on the photographic paper; and verification means for reconstructing each region from the pattern information stored by the storage means and verifying the validity of the photographic paper using the reconstructed regions.
- the corresponding method comprises: a first step of dividing the pattern of the photographic paper into regions of a predetermined unit; a second step of determining, for each region, a plurality of points that generate a curve approximating the contour based on points on the contour of the region, and extracting these points as pattern information; a third step of storing the extracted pattern information on the photographic paper; and a fourth step of reconstructing each region from the stored pattern information and verifying the validity of the photographic paper.
- a program according to the present invention causes a control device to execute processes from a first process of dividing the pattern in a pattern image, obtained as an imaging result of the pattern contained in printing paper on which predetermined print content is printed, into regions of a predetermined unit, through a fourth process of verifying the validity of the paper.
- in the present invention, the pattern information obtained based on the pattern of the paper is stored on the paper, and the validity of the paper is verified based on that pattern information.
- the pattern of the pattern image is divided into a plurality of areas, and these areas are respectively extracted as pattern information representing a predetermined shape and stored on the photographic paper.
- features of the pattern image obtained as a result of capturing the pattern of photographic paper on which predetermined print content is printed are extracted, and the extracted features are stored on the photographic paper.
- a pattern included in photographic paper is divided into regions of a predetermined unit; for each divided region, a plurality of points that generate a curve approximating the contour are determined based on points on the contour of the region; these points are stored on the photographic paper as pattern information; and the validity of the photographic paper is verified based on the stored pattern information. Since the pattern of the photographic paper can be accurately reproduced from these points, unauthorized duplication can be prevented easily and with high certainty without using special paper or the like, and the print content can thus be appropriately protected.
- FIG. 1 is a schematic diagram showing a paper pattern
- FIG. 2 is a schematic diagram used to explain an illegal duplication prevention method.
- FIG. 3 is a schematic diagram for explaining the reproduction from the original photographic paper.
- FIG. 4 is a block diagram showing a configuration of an unauthorized duplication prevention device according to the present embodiment.
- FIG. 5 is a block diagram for explaining a first processing mode of the control unit according to the first embodiment.
- FIG. 6 is a schematic diagram for explaining extraction of low frequency components.
- FIG. 7 is a schematic diagram used for explanation of image separation.
- FIG. 8 is a schematic diagram for explaining image separation based on a luminance histogram.
- Fig. 9 is a schematic diagram used to explain the division of white lumps (black lumps).
- FIG. 10 is a schematic diagram for explaining the removal of small lumps.
- FIG. 11 is a schematic diagram used for explaining the calculation of the feature amount.
- FIG. 12 is a schematic diagram showing the experimental result (1).
- FIG. 13 is a schematic diagram showing types of two-dimensional barcodes.
- FIG. 14 is a schematic diagram used to explain the collation of lumps (1).
- FIG. 15 is a schematic diagram used to explain the collation of lumps (2).
- FIG. 16 is a schematic diagram illustrating the joining or separation of lumps.
- FIG. 17 is a schematic diagram for explaining the connection of lumps.
- FIG. 18 is a schematic diagram for explaining separation of lumps.
- FIG. 19 is a schematic diagram showing the experimental result (2).
- FIG. 20 is a block diagram for describing a second processing mode of the control unit according to the first embodiment.
- FIG. 21 is a schematic diagram for explaining the calculation of a feature amount by elliptic approximation.
- FIG. 22 is a schematic diagram used for explaining the collation of lumps by elliptic approximation (1).
- FIG. 23 is a schematic diagram for explaining the collation of lumps by elliptic approximation (2).
- FIG. 24 is a schematic diagram illustrating the joining or separation of lumps.
- FIG. 25 is a schematic diagram for explaining the connection of lumps by elliptic approximation.
- FIG. 26 is a schematic diagram used to explain the separation of lumps by elliptic approximation.
- FIG. 27 is a block diagram for explaining a third processing mode of the control unit according to the first embodiment.
- FIG. 28 is a schematic diagram used to explain the calculation of the feature amount by the circle approximation.
- FIG. 29 is a schematic diagram showing the data size of the feature amount by the circle approximation.
- FIG. 30 is a schematic diagram for explaining the collation of lumps by circular approximation (1).
- FIG. 31 is a schematic diagram for explaining the connection of lumps by circular approximation.
- FIG. 32 is a schematic diagram illustrating the separation of lumps by circular approximation.
- FIG. 33 is a block diagram for explaining a fourth processing mode of the control unit according to the first embodiment.
- FIG. 34 is a schematic diagram for explaining the determination of the control point sequence.
- FIG. 35 is a schematic diagram used to explain the data size.
- FIG. 36 is a schematic diagram illustrating the generation of a Bezier curve.
- FIG. 37 is a schematic diagram explaining the generation of the reconstructed lumps.
- FIG. 38 is a schematic diagram showing a phase-only correlation result.
- FIG. 39 is a schematic diagram illustrating the generation of a Bezier curve according to another embodiment.
- FIG. 40 is a flowchart showing the authentication processing procedure.
- FIG. 41 is a block diagram for explaining a processing mode of the control unit according to the second embodiment.
- FIG. 42 is a schematic diagram for describing the extraction of a pattern pattern.
- FIG. 43 is a schematic diagram used for describing the detection of the minimum point and the maximum point.
- FIG. 44 is a schematic diagram showing the data size of the minimum point or the maximum point.
- FIG. 45 is a schematic diagram used to explain image reconstruction.
- FIG. 46 is a schematic diagram used for describing the Voronoi division.
- FIG. 47 is a schematic diagram for explaining the determination of the luminance value in the small area.
- FIG. 48 is a flowchart showing an image reconstruction processing procedure.
- FIG. 49 is a schematic diagram showing a low-frequency pattern image and a reconstructed low-frequency pattern image.
- FIG. 50 is a schematic diagram for describing image separation based on a luminance histogram according to another embodiment.
- paper has a unique pattern (hereinafter referred to as a pattern) formed by the complex entanglement of fibers in its interior, not merely on its surface. As can be seen from the fact that it is visible when light passes through the paper, this pattern can be captured as an image (hereinafter referred to as a pattern image) by, for example, a transmission scanner.
- in the present invention, the pattern in the pattern image (hereinafter referred to as the pattern pattern) is extracted and printed on the photographic paper, and this pattern pattern is used to prevent unauthorized duplication of the print content.
- the unauthorized duplication prevention device uses a pre-designated area (hereinafter designated as the designated area) in the original photographic paper (hereinafter referred to as the original photographic paper) OP pattern image.
- the pattern pattern possessed in the AR is extracted as authentication information of the original photographic paper OP.
- the unauthorized duplication prevention device uses the pattern pattern (hereinafter, referred to as an authentication pattern pattern) as a two-dimensional barcode (hereinafter, referred to as an authentication pattern code) BC, and the photographic paper surface of the original photographic paper ⁇ P. And print the authentication pattern on the original photographic paper OP.
- the unauthorized duplication prevention device duplicates the printing content of the printing paper on which the authentication pattern code BC is printed (hereinafter referred to as a coded printing paper) XP c.
- the pattern pattern included in the designated area AR in the pattern image of the coded paper XPc is extracted as comparison information with the authentication pattern pattern.
- the unauthorized duplication prevention apparatus uses the pattern pattern for comparison (hereinafter referred to as a comparison pattern pattern) and the authentication pattern pattern based on the authentication pattern code BC. And verify the validity of the coded print and paper XPc (the presence or absence of the original photographic paper OP).
- if the two patterns match, the unauthorized copy prevention device determines that the coded paper XPc is the valid original photographic paper OP and permits duplication of the print content printed on the coded paper XPc.
- the unauthorized duplication prevention device determines that the coded paper XP c is not the original photographic paper OP but the copied photographic paper, and Reproduction of the print content printed on the photographic paper with code XP c is prohibited.
- the print content of the original photographic paper OP can therefore be copied without limitation, but, as shown in Fig. 3, a duplicate onto which the print content has been copied does not reproduce the pattern pattern contained in the designated area AR, so the print content (original print content) cannot be duplicated from the copy at all.
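The enroll-then-verify flow described above can be sketched in miniature. Everything below is an illustrative assumption: the patent stores the extracted pattern as a two-dimensional barcode on the paper itself, whereas this sketch simply keeps the enrolled pattern in memory and compares bit patterns by match rate; all function names are hypothetical.

```python
# Illustrative sketch of the enroll/verify flow (names are hypothetical).
def extract_pattern(paper, area):
    """Extract the fibre-pattern bits inside the designated area AR."""
    r0, r1, c0, c1 = area
    return [row[c0:c1] for row in paper[r0:r1]]

def enroll(paper, area):
    """Code printing mode: derive authentication info from the original."""
    return extract_pattern(paper, area)  # the patent prints this as code BC

def match_rate(a, b):
    """Fraction of agreeing cells between two extracted patterns."""
    total = sum(len(row) for row in a)
    same = sum(x == y for ra, rb in zip(a, b) for x, y in zip(ra, rb))
    return same / total

def verify(paper, area, auth_pattern, threshold=0.9):
    """Verification mode: permit copying only if the patterns agree."""
    return match_rate(extract_pattern(paper, area), auth_pattern) >= threshold

original = [[0, 1, 1, 0], [1, 0, 0, 1], [0, 0, 1, 1], [1, 1, 0, 0]]
copy_ = [[0, 0, 0, 0]] * 4          # a photocopy has a different fibre pattern
area = (1, 3, 1, 3)                 # (row0, row1, col0, col1) of area AR

auth = enroll(original, area)
print(verify(original, area, auth))  # the original passes
print(verify(copy_, area, auth))     # the copy fails
```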
- reference numeral 1 denotes an overall configuration of an unauthorized duplication prevention device according to the present embodiment.
- the unauthorized duplication prevention device 1 is configured by connecting a scanner unit 4 and a printer unit 5, via a bus 3, to a control unit 2 that controls the entire device 1.
- the control unit 2 includes a central processing unit, a work memory, and an information storage memory.
- the information storage memory stores various information such as position information of the designated area AR (FIG. 2) for each standard paper size (hereinafter referred to as area position information) and character string information for the two-dimensional barcode (hereinafter referred to as code character string information).
- the control unit 2 executes various processes according to the program loaded in the work memory, using various information stored in the information storage memory as appropriate.
- the control unit 2 sends a pattern image reading command to the scanner unit 4 when a predetermined command for printing the authentication pattern code BC (FIG. 2(A)) is given from an operation unit (not shown).
- when the control unit 2 receives pattern image data (hereinafter referred to as original pattern image data) D1 of the original photographic paper OP (FIG. 2(A)) from the scanner unit 4, it transitions to the first mode (hereinafter referred to as the code printing mode).
- in this mode, the control unit 2 extracts the authentication pattern pattern from the pattern image of the original pattern image data D1, generates it as character string data for a two-dimensional barcode (hereinafter referred to as the authentication pattern code data) D2, and sends it to the printer unit 5.
- the authentication pattern code data D2 is printed on the original photographic paper OP (FIG. 2 (A)) as the authentication pattern code BC (FIG. 2 (A)) in the printer unit 5.
- when a predetermined duplication command is given from the operation unit, the control unit 2 sends a pattern image reading command and a code reading command to the scanner unit 4.
- when the control unit 2 receives, in response to these commands, the pattern image data (hereinafter referred to as coded pattern image data) D3 of the coded paper XPc (FIG. 2(B)) from the scanner unit 4, together with the authentication pattern code data D2 obtained by reading the authentication pattern code BC (FIG. 2(A)) printed on the coded paper XPc, it transitions to the second mode (hereinafter referred to as the verification mode).
- in this mode, the control unit 2 extracts the comparison pattern pattern from the pattern image of the coded pattern image data D3, and collates this comparison pattern pattern with the authentication pattern pattern recovered from the authentication pattern code data D2.
- the control unit 2 generates a copy permission command only when a match rate equal to or higher than a predetermined match rate is obtained, and sends it to the scanner unit 4.
- as a result, the print content of the coded paper XPc (FIG. 2(B)) is read by the scanner unit 4, and the print content is duplicated by the printer unit 5.
- in this way, the control unit 2 prints the authentication pattern pattern extracted from the original photographic paper OP on the original photographic paper OP as the authentication pattern code BC, and permits duplication of the print content only for coded paper XPc having a comparison pattern pattern that matches the authentication pattern pattern of the printed authentication pattern code BC.
- the scanner unit 4 has a transmission mode, a reflection mode, and a code reading mode; the transmission mode is set when a pattern image reading command is given, and the code reading mode is executed when a code reading command is given.
- in the transmission mode, the scanner unit 4 irradiates the original photographic paper OP or the coded paper XPc placed on the platen with light, and forms the pattern projection light transmitted through the photographic paper OP or XPc into an image on a solid-state imaging device via an optical system.
- the scanner unit 4 then performs A/D (Analog/Digital) conversion processing and the like on the pattern image signal obtained from the solid-state imaging device, and sends the resulting original pattern image data D1 or coded pattern image data D3 to the control unit 2.
- in the reflection mode, the scanner unit 4 irradiates the original photographic paper OP placed on the document table with light, forms the print-content reflected light from the photographic paper OP into an image on the solid-state imaging device via an optical system, performs A/D conversion processing and the like on the resulting print content image signal, and sends the resulting print content image data D4 to the printer unit 5. In the code reading mode, the scanner unit 4 activates its two-dimensional code reader 4a and sends the authentication pattern code data D2 read by the two-dimensional code reader 4a to the control unit 2.
- in this way, the scanner unit 4 can read the pattern image, the authentication pattern code BC (FIG. 2), or the print content by executing the mode corresponding to the various commands given from the control unit 2.
- the printer unit 5 stores in an internal memory font information for the two-dimensional code (hereinafter referred to as code font information), position information of the authentication pattern code BC (FIG. 2) for each standard paper size (hereinafter referred to as code position information), and the like, and executes printing processing using this information as appropriate.
- as a result, the authentication pattern code BC (FIG. 2(A)) is printed at a predetermined position on the original photographic paper OP set on the paper base at this time.
- the printer unit 5 also performs pulse width modulation processing and the like on the print content image data D4 and sends the resulting print data to the print head unit; as a result, the print content of the original photographic paper OP is duplicated on the paper set on the paper base at this time.
- in this way, the printer unit 5 can print the authentication pattern code BC (FIG. 2) based on the authentication pattern code data D2 supplied from the control unit 2, and can duplicate the print content based on the print content image data D4.
- Control unit processing
- the processing of the control unit will now be described specifically, separately for the first embodiment and the second embodiment.
- Control unit processing according to the first embodiment
- the processing of the control unit 2 according to the first embodiment includes first to fourth processing modes, which will be described in turn.
- the first processing mode comprises a low-frequency component extraction unit 11 that extracts a pattern image of low-frequency components (hereinafter referred to as the low-frequency pattern image) from the pattern image; an image separation unit 12 that separates the low-frequency pattern image into a low-luminance component image (hereinafter referred to as the white component pattern image) and a high-luminance component image (hereinafter referred to as the black component pattern image); and an area division unit 13 that divides the patterns contained in the white component pattern image and the black component pattern image into a plurality of areas.
- in the code printing mode, the control unit 2 sequentially applies various processes to the original pattern image data D1 provided from the scanner unit 4 via the low-frequency component extraction unit 11, the image separation unit 12, the area division unit 13, the pattern pattern extraction unit 14, and the two-dimensional code conversion unit 15, and sends the resulting authentication pattern code data D2 to the printer unit 5.
- in the verification mode, the control unit 2 sequentially applies various processes to the coded pattern image data D3 provided from the scanner unit 4 via the low-frequency component extraction unit 11, the image separation unit 12, the area division unit 13, and the pattern pattern extraction unit 14, and then performs, in the collation unit 16, collation processing based on the processing result and the authentication pattern code data D2 given from the scanner unit 4.
- hereinafter, the low-frequency component extraction processing by the low-frequency component extraction unit 11, the image separation processing by the image separation unit 12, the area division processing by the area division unit 13, the pattern pattern extraction processing by the pattern pattern extraction unit 14, the two-dimensional code conversion processing by the two-dimensional code conversion unit 15, and the collation processing by the collation unit 16 will be described in detail.
- the low-frequency component extraction unit 11 acquires the data of the pattern image of the designated area AR (FIG. 2) (hereinafter referred to as the area pattern image) IM1 (FIG. 6) from the pattern image of the original photographic paper OP (FIG. 2(A)) or the coded paper XPc (FIG. 2(B)); that is, based on the area position information stored in the internal memory, it acquires the area pattern image IM1 data from the original pattern image data D1 or the coded pattern image data D3 given from the scanner unit 4, and performs Fourier transform processing on the acquired area pattern image IM1 data to generate frequency component data.
- the low-frequency component extraction unit 11 then sets the data values of high-frequency components above a predetermined threshold to "0" and performs inverse Fourier transform processing on the frequency component data, thereby generating data of the low-frequency component pattern image IM2 (hereinafter referred to as low-frequency pattern image data) D11, which it sends to the image separation unit 12. By extracting the low-frequency component pattern image IM2 in this way, the low-frequency component extraction unit 11 can remove the various noise components generally contained in the high-frequency components of an image, such as noise from the solid-state imaging device in the scanner unit 4.
- the low-frequency component extraction unit 11 can thus avoid a decrease in the accuracy of pattern pattern (feature) extraction by the pattern pattern extraction unit 14 due to such noise components, and as a result the reliability of the collation result of the collation processing in the collation unit 16 can be improved.
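The extraction step amounts to a Fourier-domain low-pass filter: forward transform, zero the coefficients above a threshold frequency, inverse transform. A minimal NumPy sketch; the circular cutoff radius and the image size are arbitrary illustrative choices, not values from the patent.

```python
import numpy as np

def extract_low_frequency(area_pattern_image, cutoff):
    """Zero spectral components farther than `cutoff` from DC, then invert."""
    f = np.fft.fftshift(np.fft.fft2(area_pattern_image))
    h, w = f.shape
    yy, xx = np.ogrid[:h, :w]
    dist = np.hypot(yy - h // 2, xx - w // 2)  # distance from the DC component
    f[dist > cutoff] = 0                       # suppress high-frequency noise
    return np.real(np.fft.ifft2(np.fft.ifftshift(f)))

rng = np.random.default_rng(0)
im1 = rng.random((64, 64))      # stand-in for the area pattern image IM1
im2 = extract_low_frequency(im1, cutoff=8)
print(im2.shape)                # same size; high-frequency noise removed
```

The mean luminance is preserved (the DC component is kept), while pixel-to-pixel variation drops sharply, which is exactly the noise-suppression property the text relies on.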
- Image separation processing
- the image separation unit 12 converts the low-frequency component pattern image IM2 (FIG. 7 (A)) extracted by the low-frequency component extraction unit 11 into a white component pattern image WI M (Fig. 7 (B)) and a black component pattern image BIM (Fig. 7 (C)).
- specifically, the image separation unit 12 sequentially detects the luminance value of each pixel of the low-frequency component pattern image IM2 in the low-frequency pattern image data D11 supplied from the low-frequency component extraction unit 11, extracts the white component pattern image WIM by converting all pixels other than those whose luminance value is equal to or less than a predetermined low-luminance threshold (hereinafter referred to as the white threshold; such pixels are hereinafter referred to as white pixels) to the highest luminance level, and sends the resulting data (hereinafter referred to as white component pattern image data) D12 to the area dividing unit 13.
- similarly, the image separation unit 12 extracts the black component pattern image BIM (Fig. 7(C)) by converting all pixels other than those whose luminance value is equal to or greater than a predetermined high-luminance threshold (hereinafter referred to as the black threshold; such pixels are hereinafter referred to as black pixels) to the lowest luminance level, and sends the resulting data (hereinafter referred to as black component pattern image data) D13 to the area dividing unit 13.
- by separating the image into the white component pattern image WIM (Fig. 7(B)) and the black component pattern image BIM (Fig. 7(C)), the image separation unit 12 reduces the complexity of the pattern.
- the image separation unit 12 can thus prevent the pattern pattern extraction unit 14 from losing accuracy of pattern pattern (feature) extraction due to a high degree of complexity, and in turn the reliability of the matching result of the matching process in the matching unit 16 can be improved.
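The two thresholding passes can be sketched as follows; the threshold values and the 0–255 luminance range are illustrative assumptions, not values from the patent.

```python
import numpy as np

WHITE_LEVEL, BLACK_LEVEL = 255, 0   # highest / lowest luminance levels

def separate(im2, white_threshold, black_threshold):
    """Split a low-frequency pattern image into white/black component images."""
    # keep low-luminance (white) pixels; push everything else to the top level
    wim = np.where(im2 <= white_threshold, im2, WHITE_LEVEL)
    # keep high-luminance (black) pixels; push everything else to the bottom level
    bim = np.where(im2 >= black_threshold, im2, BLACK_LEVEL)
    return wim, bim

im2 = np.array([[10, 200, 90], [240, 40, 130], [60, 170, 220]])
wim, bim = separate(im2, white_threshold=80, black_threshold=160)
print(wim)  # only pixels <= 80 survive
print(bim)  # only pixels >= 160 survive
```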
- the image separation unit 12 further adjusts the white threshold and the black threshold so that the area ratio of the white component pattern image WIM (FIG. 7(B)) and the black component pattern image BIM (FIG. 7(C)) to the low-frequency component pattern image IM2 (FIG. 7(A)) becomes a predetermined ratio.
- specifically, when the image separation unit 12 sequentially detects the luminance value of each pixel of the low-frequency component pattern image IM2 and extracts the white component pattern image WIM (FIG. 7(B)) and the black component pattern image BIM (FIG. 7(C)), it generates, based on the detection results, the distribution of luminance values over the pixels of the low-frequency component pattern image IM2 as a luminance histogram, as shown in FIG. 8.
- the image separation unit 12 then determines whether or not the number of white pixels (black pixels) in the white component pattern image WIM (black component pattern image BIM) extracted at this time is 20% of all pixels in the low-frequency component pattern image IM2 (Fig. 7 (A)) (the broken line in Fig. 8).
- if it is not, the image separation unit 12 changes the white threshold (black threshold) and extracts the white component pattern image WIM (black component pattern image BIM) again based on the changed white threshold (black threshold).
- in this way, the image separation unit 12 extracts the white component pattern image WIM (black component pattern image BIM) such that the number of white pixels (black pixels) is 20% of all pixels in the low-frequency component pattern image IM2 (Fig. 7 (A)), and sends it to the area dividing unit 13 as white component pattern image data D12 (black component pattern image data D13).
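The 20% rule above can be realised, for example, by reading the luminance histogram as a sorted list of pixel values (a hypothetical realisation; the patent describes an iterative change of the threshold, and the exact update rule is not specified):

```python
def adjust_white_threshold(im2, target_ratio=0.20):
    """Choose the white threshold so that the white pixels (luminance
    at or above the threshold) make up roughly target_ratio of all
    pixels in IM2 - the 20% rule in the text. Sorting the luminance
    values stands in for the luminance histogram of Fig. 8; treat the
    details as assumptions."""
    values = sorted(v for row in im2 for v in row)
    n = len(values)
    k = max(0, n - int(round(n * target_ratio)))  # index of the first white pixel
    return values[min(k, n - 1)]
```

The black threshold can be chosen symmetrically from the low end of the same sorted list.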
- since the image separation unit 12 separates the white component pattern image WIM (Fig. 7 (B)) and the black component pattern image BIM (Fig. 7 (C)) relative to the total number of pixels of the low-frequency component pattern image IM2 (Fig. 7 (A)), even if the color tone of the photographic paper (low-frequency component pattern image IM2) changes due to, for example, aging, the influence of that change in color tone can be removed.
- the image separation unit 12 can thus prevent the pattern extraction unit 14 from losing pattern (feature amount) extraction accuracy due to such a change in color tone, so that the reliability of the matching result of the matching process in the matching unit 16 can be improved.
- the area dividing unit 13 divides the pattern in the white component pattern image WIM (Fig. 7 (B)) into areas in units of sets of adjacent white pixels (hereinafter referred to as white lumps), and divides the pattern in the black component pattern image BIM (Fig. 7 (C)) into areas in units of sets of adjacent black pixels (hereinafter referred to as black lumps).
- specifically, the area dividing unit 13 detects all white pixels from the white component pattern image WIM (Fig. 7 (B)) of the white component pattern image data D12 supplied from the image separation unit 12 and, as shown in Fig. 9 (A), sequentially connects the white pixels among the eight pixels adjacent to an arbitrary target pixel AP (four in the vertical and horizontal directions and four in the diagonal directions; hereinafter referred to as the 8-neighborhood).
- the area dividing unit 13 then associates identification information with each group of white pixels connected in this way to form white lumps WD1, WD2, ..., WDn.
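The 8-neighborhood connection step can be sketched as standard 8-connected component labelling (a minimal sketch; the function name and the dict-of-pixel-lists representation are illustrative):

```python
def label_lumps(mask):
    """8-neighbourhood connected-component labelling of a binary mask
    (list of rows of 0/1). Returns {label: [(row, col), ...]}, one
    entry per lump, mirroring the identification information the area
    dividing unit attaches to each connected pixel group."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    lumps, next_label = {}, 1
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not labels[r][c]:
                stack, pixels = [(r, c)], []
                labels[r][c] = next_label
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    # visit the 8 neighbours: vertical, horizontal, diagonal
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and mask[ny][nx] and not labels[ny][nx]):
                                labels[ny][nx] = next_label
                                stack.append((ny, nx))
                lumps[next_label] = pixels
                next_label += 1
    return lumps
```

Diagonally adjacent pixels end up in the same lump, exactly because the diagonal directions belong to the 8-neighborhood.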
- similarly, the area dividing unit 13 forms black lumps BD1, BD2, ..., BDn for the black component pattern image BIM (Fig. 7 (C)) of the black component pattern image data D13 supplied from the image separation unit 12.
- in this way, the area dividing unit 13 divides the pattern in the white component pattern image WIM (Fig. 7 (B)) into a plurality of white lumps WD (WD1 to WDn) and divides the pattern in the black component pattern image BIM (Fig. 7 (C)) into a plurality of black lumps BD (BD1 to BDn), so that the pattern can be subdivided.
- the area dividing unit 13 can therefore analyze the patterns in the white component pattern image WIM (Fig. 7 (B)) and the black component pattern image BIM (Fig. 7 (C)) in detail, which improves the pattern (feature amount) extraction accuracy in the pattern extraction unit 14 and thus the reliability of the matching result of the matching process in the matching unit 16.
- further, as shown in Fig. 10 (A), when the pattern of the white component pattern image WIM (Fig. 7 (B)) has been divided into a plurality of white lumps WD (WD1 to WDn), the area dividing unit 13 removes from the white lumps WD those lumps whose number of connected pixels is less than a predetermined number (hereinafter referred to as small lumps), as shown in Fig. 10 (B), and sends the white lumps WD (WD1 to WDn) obtained as a result of the removal to the pattern extraction unit 14 as data (hereinafter referred to as white lump data) D14.
- the area dividing unit 13 likewise removes small lumps from the black lumps BD (BD1 to BDn) in the same manner as for the white lumps WD (WD1 to WDn), and sends the black lumps BD (BD1 to BDn) obtained as a result of the removal to the pattern extraction unit 14 as data (hereinafter referred to as black lump data) D15.
- since the area dividing unit 13 can thus extract only the characteristic portions of the patterns of the white component pattern image WIM (Fig. 7 (B)) and the black component pattern image BIM (Fig. 7 (C)) as the white lumps WD and the black lumps BD, the pattern (feature amount) extraction accuracy in the pattern extraction unit 14 can be further improved.
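Removing the small lumps described above is then a simple filter on the connected-pixel count (`min_pixels` stands in for the unspecified "predetermined number"):

```python
def remove_small_lumps(lumps, min_pixels):
    """Drop lumps whose number of connected pixels is below
    min_pixels, keeping only the characteristic portions of the
    pattern. lumps is {label: [(row, col), ...]} as produced by a
    connected-component labelling step."""
    return {k: v for k, v in lumps.items() if len(v) >= min_pixels}
```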
- the pattern extraction unit 14 calculates feature amounts of the shape of each of the white lumps WD (WD1 to WDn) and the black lumps BD (BD1 to BDn) to extract the pattern.
- in practice, the pattern extraction unit 14 calculates, for each of the lumps (white lumps WD and black lumps BD), the center coordinates (xc, yc), the long side l, the short side w, and the angle θ between the long side l and the horizontal axis (hereinafter collectively referred to as the rectangle information values) as feature amounts.
- specifically, the pattern extraction unit 14 calculates the individual feature amounts for each white lump WD (WD1 to WDn) of the white lump data D14 supplied from the area dividing unit 13. Let I(x, y) be the luminance value of the pixels constituting a white lump WD; using the image moments M00, M10, M01, M20, M02 and M11, the center coordinates (xc, yc), the long side l, the short side w and the angle θ are calculated according to equations (1) to (4).
- in this way, the pattern extraction unit 14 calculates the feature amounts (rectangle information values) for each white lump WD (WD1 to WDn).
- the pattern extraction unit 14 also calculates the feature amounts (rectangle information values) for each of the black lumps BD (BD1 to BDn) of the black lump data D15 supplied from the area dividing unit 13, in the same manner as for the white lumps WD (WD1 to WDn), using equations (1) to (4).
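Under the assumption that the lump pixels are binary (I(x, y) = 1 inside the lump), equations (1) to (4) correspond to standard image-moment analysis; the following sketch reconstructs the rectangle information values that way (the side-length scaling, here that of a uniform rectangle with matching variance, is an assumption):

```python
import math

def rectangle_info(pixels):
    """Rectangle information values of one lump: centre (xc, yc),
    long side l, short side w, and angle theta between the long side
    and the horizontal axis, from first- and second-order moments of
    the (assumed binary) lump pixels given as (row, col) pairs."""
    n = len(pixels)
    xc = sum(x for _, x in pixels) / n              # first moments -> centre
    yc = sum(y for y, _ in pixels) / n
    mu20 = sum((x - xc) ** 2 for _, x in pixels) / n   # second central moments
    mu02 = sum((y - yc) ** 2 for y, _ in pixels) / n
    mu11 = sum((x - xc) * (y - yc) for y, x in pixels) / n
    theta = 0.5 * math.atan2(2 * mu11, mu20 - mu02)    # orientation
    common = math.sqrt((mu20 - mu02) ** 2 + 4 * mu11 ** 2)
    lam1 = (mu20 + mu02 + common) / 2   # variances along the principal axes
    lam2 = (mu20 + mu02 - common) / 2
    l = 2 * math.sqrt(3 * max(lam1, 0.0))  # uniform rectangle: var = side**2 / 12
    w = 2 * math.sqrt(3 * max(lam2, 0.0))
    return xc, yc, l, w, theta
```

A thin horizontal run of pixels, for instance, yields θ ≈ 0 and l much larger than w.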
- the feature amounts of the white lumps WD (WD1 to WDn) and the black lumps BD (BD1 to BDn) calculated in this way (hereinafter referred to as pattern feature amounts) are values representing the characteristic shapes of the pattern included in the area pattern image IM1 (Fig. 6 (A)), and therefore constitute the pattern extraction result itself for the area pattern image IM1.
- in the case of the code printing mode, the pattern extraction unit 14 sends the pattern feature amounts to the two-dimensional code conversion unit 15 as data of the authentication pattern (hereinafter referred to as authentication pattern data) D16 (Fig. 5), and in the case of the verification mode, sends the pattern feature amounts to the matching unit 16 as data of the comparison pattern (hereinafter referred to as comparison pattern data) D26 (Fig. 5).
- in this way, the pattern extraction unit 14 calculates the pattern feature amounts composed of the rectangle information values of the white lumps WD and the black lumps BD, and can thereby extract the pattern (authentication pattern or comparison pattern) included in the designated area AR (Fig. 2 (A)).
- the two-dimensional code conversion unit 15 records the authentication pattern (pattern feature amounts) on the original photographic paper OP as the authentication pattern code BC (Fig. 2 (A)).
- in practice, the two-dimensional code conversion unit 15 rounds the pattern feature amounts of the supplied authentication pattern data D16 (the rectangle information values of each white lump WD and each black lump BD) to integers, subjects the resulting pattern feature amounts to a two-dimensional barcode conversion process based on code character string information stored in an internal memory to generate authentication pattern code data D2, and sends this to the printer unit 5 at a predetermined timing.
- as a result, in the printer unit 5, the authentication pattern code data D2 is printed at a predetermined position on the photographic paper (original photographic paper OP) set on the platen, as shown in Fig. 2 (A), so that the authentication pattern code BC is recorded on the original photographic paper OP (Fig. 2 (A)).
- assuming the data range shown in Fig. 12 (A) as the range that the rectangle information values (center coordinates (xc, yc), long side l, short side w and angle θ) of a single white lump WD or black lump BD can take, experimental results showed that the data size of the rectangle information values for one white lump WD or black lump BD is about 9 [bytes] (72 [bits]).
- further, the data size of the pattern feature amounts (the rectangle information values of each white lump WD and each black lump BD) obtained after redundant data had been reduced by the above-described low-frequency component extraction processing, image separation processing, area division processing and pattern extraction processing was 435 [bytes] on average and 504 [bytes] at maximum.
- existing two-dimensional barcodes can encode 2 to 3 [Kbytes] of binary data. Therefore, even when an existing two-dimensional code is applied, the two-dimensional code conversion unit 15 can appropriately convert the authentication pattern data D16 into the authentication pattern code data D2, since redundant data has been reduced by the above-described low-frequency component extraction processing, image separation processing, area division processing and pattern extraction processing.
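The size figures quoted above are mutually consistent: at about 9 bytes (72 bits) of rectangle information per lump, the 504-byte maximum corresponds to 56 lumps, comfortably below the 2 to 3 Kbytes an existing two-dimensional barcode can carry (the 56-lump count is our arithmetic, not stated in the source):

```python
def pattern_code_size_bytes(num_lumps, bits_per_lump=72):
    """Data size of the pattern feature amounts when each lump
    contributes bits_per_lump bits of rectangle information
    (about 9 bytes per lump, per the experimental figure above)."""
    return num_lumps * bits_per_lump // 8
```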
- the matching unit 16 compares the comparison pattern extracted from the coded photographic paper XPc (Fig. 2 (B)) with the authentication pattern of the original photographic paper OP stored in the authentication pattern code BC (Fig. 2 (B)).
- specifically, the matching unit 16 sequentially matches each of the white lumps WD and black lumps BD represented by the pattern feature amounts (rectangle information values) of the comparison pattern data D26 supplied from the pattern extraction unit 14 (hereinafter referred to as comparison lumps) against the white lumps WD and black lumps BD represented by the pattern feature amounts (rectangle information values) of the authentication pattern code data D2 given from the scanner unit 4 (hereinafter referred to as authentication lumps).
- a specific matching process by the matching unit 16 will now be described with reference to Fig. 14; for convenience of explanation, the matching process between one authentication lump and one comparison lump is described here.
- Fig. 14 shows the positional relationship of the rectangles represented by the rectangle information values (center coordinates (xc, yc), long side l, short side w, and angle θ between the long side l and the horizontal axis): Rr is the rectangle of the authentication lump (shown by a broken line), Sr is the area represented by the long side lr and short side wr of the authentication lump, gr is the center represented by the center coordinates (xc, yc) of the authentication lump, R is the rectangle of the comparison lump (shown by a solid line), S is the area represented by the long side l and short side w of the comparison lump, g is the center represented by the center coordinates (xc, yc) of the comparison lump, θ' is the difference in inclination between the rectangles Rr and R, and d is the distance between the centers gr and g.
- in this case, based on the rectangle information values of both the authentication lump and the comparison lump, the matching unit 16 determines whether the center gr of the authentication lump lies within the rectangle R of the comparison lump and whether the center g of the comparison lump lies within the rectangle Rr of the authentication lump. Then, if the centers gr and g each lie within the other's rectangle R and Rr, the matching unit 16 sequentially determines whether the distance d between the centers, the inter-rectangle inclination difference θ', and the difference between the area Sr of the authentication lump and the area S of the comparison lump (hereinafter referred to as the lump area difference) are each equal to or less than a predetermined threshold.
- if all of these are equal to or less than their thresholds, the matching unit 16 determines that the authentication lump and the comparison lump are the same lump; otherwise, it determines that they are not the same lump.
- incidentally, when the ratio of the long side lr to the short side wr of the authentication lump and the ratio of the long side l to the short side w of the comparison lump are both close to "1", the rectangles are nearly square and their inclinations become unreliable; as a countermeasure against erroneous determination in this case, the matching unit 16 determines that the authentication lump and the comparison lump are the same lump even if the inter-rectangle inclination difference θr − θ (that is, θ' in Fig. 14) is equal to or greater than the threshold, provided that the lump area difference is equal to or less than the threshold.
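The matching test described above, including the near-square exception, can be sketched as follows (the threshold parameter names and the `square_ratio` cut-off are assumptions; the patent only states that both side ratios are "close to 1"):

```python
import math

def point_in_rect(px, py, rect):
    """True if (px, py) lies inside the oriented rectangle
    rect = (xc, yc, l, w, theta)."""
    xc, yc, l, w, theta = rect
    dx, dy = px - xc, py - yc
    u = dx * math.cos(theta) + dy * math.sin(theta)    # along the long side
    v = -dx * math.sin(theta) + dy * math.cos(theta)   # along the short side
    return abs(u) <= l / 2 and abs(v) <= w / 2

def lumps_match(auth, comp, d_max, angle_max, area_max, square_ratio=0.8):
    """Match one authentication lump against one comparison lump,
    each given as rectangle information values (xc, yc, l, w, theta)."""
    ax, ay, al, aw, at = auth
    cx, cy, cl, cw, ct = comp
    # mutual centre-in-rectangle check
    if not (point_in_rect(ax, ay, comp) and point_in_rect(cx, cy, auth)):
        return False
    if math.hypot(ax - cx, ay - cy) > d_max:      # centre distance d
        return False
    if abs(al * aw - cl * cw) > area_max:         # lump area difference
        return False
    if abs(at - ct) <= angle_max:                 # inter-rectangle inclination
        return True
    # near-square exception: inclination is unreliable when both
    # side ratios are close to 1
    return aw / al >= square_ratio and cw / cl >= square_ratio
```

A slightly rotated square lump still matches, while a rotated elongated lump does not.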
- in this way, the matching unit 16 matches the comparison pattern extracted from the coded photographic paper XPc (Fig. 2 (B)) (each comparison lump represented by the pattern feature amounts (rectangle information values)) against the authentication pattern of the original photographic paper OP stored in the authentication pattern code BC (Fig. 2 (B)) (each authentication lump represented by the pattern feature amounts (rectangle information values)).
- the matching unit 16 determines that the coded photographic paper XPc corresponding to the comparison pattern image is the valid original photographic paper OP; at this time, it generates a copy permission command COM (Fig. 5) and sends it to the scanner unit 4 (Fig. 4).
- the reflection mode is executed in the scanner unit 4, and the print content of the original photographic paper OP (FIG. 2 (A)) placed on the platen at this time is sent to the printer unit 5 as print content image data D4.
- the print content of the original photographic paper OP (Fig. 2 (A)) is thus copied onto photographic paper in the printer unit 5.
- incidentally, depending on the imaging state of the coded photographic paper XPc (Fig. 2 (B)), it can be assumed that the matching rate of the comparison lumps decreases.
- that is, a lump divided as one authentication lump in the code printing mode may be divided as two different comparison lumps in the verification mode, or conversely, two lumps separated as two authentication lumps in the code printing mode may be divided as one comparison lump in the verification mode. In these cases, the shape (rectangle information values) of the comparison lump that should correspond to the authentication lump differs, and as a result a situation arises in which the matching rate of the comparison lumps decreases.
- therefore, in order to avoid such a situation, the matching unit 16 sequentially performs a join matching process and a separation matching process for each comparison lump that did not match.
- in the join matching process, adjacent comparison lumps are joined together, and the joined lumps (hereinafter referred to as comparison joined lumps) are matched against the corresponding authentication lump; in the separation matching process, a comparison lump is separated, and the separated lumps (hereinafter referred to as comparison separated lumps) are matched against the corresponding authentication lumps.
- the join matching process will first be described specifically with reference to Fig. 17; for convenience of explanation, the process of matching a comparison joined lump obtained by joining two adjacent comparison lumps against an authentication lump is described here.
- Fig. 17 shows, as in Fig. 14, the positional relationship of the rectangles represented by the rectangle information values (center coordinates (xc, yc), long side l, short side w, and angle θ between the long side l and the horizontal axis): R1 and R2 are the rectangles of the comparison lumps (shown by broken lines), g1 and g2 are the centers represented by the center coordinates (xc, yc) of the comparison lumps, RM (Rr) is the rectangle of the comparison joined lump (authentication lump) (shown by a solid line), and gr is the center represented by the center coordinates (xc, yc) of the authentication lump.
- in this case, the matching unit 16 determines whether the centers g1 and g2 of the comparison lumps to be joined lie within the rectangle Rr of the authentication lump (that is, within the rectangle RM of the comparison joined lump), and if the center g1 or g2 lies within the rectangle Rr, obtains the center of gravity G (xG, yG) of the comparison joined lump according to equation (6) and then the distance d between the center of gravity G and the center gr of the authentication lump according to equation (5).
- then, if the distance d between the centers is equal to or less than a predetermined threshold, the matching unit 16 determines that the comparison joined lump, which is the result of joining the separate comparison lumps, and the authentication lump are the same lump.
- in this way, the matching unit 16 joins comparison lumps that did not match in the matching process and matches the comparison joined lump against the authentication lump again.
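The join matching decision can be sketched as follows (equations (5) and (6) are not reproduced in this text, so the centre of gravity G is taken here as the simple midpoint of the two comparison-lump centres, which is an assumption):

```python
import math

def join_match(g1, g2, g_auth, d_max):
    """Join two adjacent comparison lumps with centres g1, g2 and
    accept the comparison joined lump when the distance d between its
    centre of gravity G and the authentication-lump centre g_auth is
    within d_max. G as the midpoint is a simplification; eq. (6) may
    weight by lump area in the original."""
    gx = (g1[0] + g2[0]) / 2.0
    gy = (g1[1] + g2[1]) / 2.0
    d = math.hypot(gx - g_auth[0], gy - g_auth[1])
    return d <= d_max
```

The separation matching decision described next is the mirror image: the centre of gravity of the comparison separated lumps is compared with the centre of the comparison lump before separation.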
- next, the separation matching process will be described with reference to Fig. 18. Fig. 18 shows, as in Fig. 14, the positional relationship of the rectangles represented by the rectangle information values (center coordinates (xc, yc), long side l, short side w, and angle θ between the long side l and the horizontal axis): R is the rectangle of the comparison lump before separation (shown by a broken line), g is the center represented by the center coordinates (xc, yc) of the comparison lump, Rs1 and Rs2 (Rr1 and Rr2) are the rectangles of the comparison separated lumps (authentication lumps) (shown by solid lines), gr1 and gr2 are the centers represented by the center coordinates (xc, yc) of the authentication lumps, G is the center of gravity (xG, yG) of the comparison separated lumps calculated according to equation (6), and d is the distance between the center of gravity G and the center g calculated according to equation (5). The ellipses in the figure indicate the comparison joined lump and the mutually separated comparison separated lumps.
- in this case, the matching unit 16 determines whether the center gr1 or gr2 of an authentication lump lies within the rectangle R of the comparison lump before separation, and if the centers gr1 and gr2 lie within the rectangle R, obtains the center of gravity G (xG, yG) of the comparison separated lumps, which are the separation result of the comparison lump, and then the distance d between the center of gravity G and the center g of the comparison lump.
- then, if the distance d between the centers is equal to or less than a predetermined threshold, the matching unit 16 determines that the two comparison separated lumps obtained by separating the comparison lump and the authentication lumps are the same lumps.
- in this way, the matching unit 16 separates comparison lumps that did not match in the matching process and matches the comparison separated lumps against the authentication lumps again.
- as described above, by correcting the lumps so as to join or separate mutually adjacent lumps and then matching again, the matching unit 16 can eliminate the influence of changes in the imaging state, and the reliability of the matching result can thereby be significantly improved.
- Fig. 19 shows experimental results for the case where neither the join matching process nor the separation matching process is executed ("process 1"), the case where only the join matching process is executed ("process 2"), and the case where both the join matching process and the separation matching process are executed ("process 3"); each experiment was divided into white lumps and black lumps and was performed 10 times.
- when the processing contents of the second processing mode in the control unit 2 are functionally classified, as shown in Fig. 20, in which the same reference numerals are assigned to parts corresponding to those in Fig. 5, the processing contents of the low-frequency component extraction unit 11, the image separation unit 12, the area dividing unit 13 and the two-dimensional code conversion unit 15 are the same as in the first processing mode described above, but the processing contents of the pattern extraction unit 114 and the matching unit 116 differ from those in the first processing mode.
- the pattern extraction unit 114 differs from the pattern extraction unit 14, which approximates each white lump WD and each black lump BD by a rectangle, in that it approximates each white lump WD (WD1 to WDn) and each black lump BD (BD1 to BDn) by an ellipse.
- in practice, the pattern extraction unit 114 calculates, for each of the lumps (white lumps WD and black lumps BD), the center coordinates (xc, yc), the major axis rda, the minor axis rdb, and the angle Φ between the major axis rda and the horizontal axis (hereinafter collectively referred to as the ellipse information values) as feature amounts.
- specifically, the pattern extraction unit 114 calculates the individual feature amounts for each white lump WD (WD1 to WDn) of the white lump data D14 supplied from the area dividing unit 13. Let I(x, y) be the luminance value of the pixels constituting a white lump WD; the image moments MA00, MA10 and MA01 are calculated according to equation (7), the center coordinates (xc, yc) according to equation (8), and the major axis rda, the minor axis rdb and the angle Φ according to equations (9) and (10).
- in this way, the pattern extraction unit 114 calculates the feature amounts (ellipse information values) for each white lump WD (WD1 to WDn).
- the pattern extraction unit 114 also calculates the feature amounts (ellipse information values) for each of the black lumps BD (BD1 to BDn) of the black lump data D15 supplied from the area dividing unit 13, in the same manner as for the white lumps WD (WD1 to WDn), using the above-described equations (7) to (10).
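Equations (7) to (10) are not reproduced in this text; the following sketch reconstructs the ellipse information values from standard image moments under the assumption of binary lump pixels (the diameter scaling, that of a uniform ellipse with matching variance, is also an assumption):

```python
import math

def ellipse_info(pixels):
    """Ellipse information values of one lump: centre (xc, yc), major
    axis rd_a, minor axis rd_b and angle phi between the major axis
    and the horizontal axis, from first- and second-order moments of
    the (assumed binary) lump pixels given as (row, col) pairs."""
    n = len(pixels)
    xc = sum(x for _, x in pixels) / n
    yc = sum(y for y, _ in pixels) / n
    mu20 = sum((x - xc) ** 2 for _, x in pixels) / n
    mu02 = sum((y - yc) ** 2 for y, _ in pixels) / n
    mu11 = sum((x - xc) * (y - yc) for y, x in pixels) / n
    phi = 0.5 * math.atan2(2 * mu11, mu20 - mu02)
    common = math.sqrt((mu20 - mu02) ** 2 + 4 * mu11 ** 2)
    # uniform ellipse: variance along a semi-axis a is a**2 / 4
    rd_a = 4 * math.sqrt(max((mu20 + mu02 + common) / 2, 0.0))
    rd_b = 4 * math.sqrt(max((mu20 + mu02 - common) / 2, 0.0))
    return xc, yc, rd_a, rd_b, phi
```

The only difference from the rectangle case is the axis scaling, which is why the same moments serve both processing modes.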
- the feature amounts of the white lumps WD (WD1 to WDn) and the black lumps BD (BD1 to BDn) calculated in this way (hereinafter referred to as pattern feature amounts) are values representing the characteristic shapes of the pattern included in the area pattern image IM1 (Fig. 6 (A)), and therefore constitute the pattern extraction result itself for the area pattern image IM1.
- in the case of the code printing mode, the pattern extraction unit 114 sends the pattern feature amounts to the two-dimensional code conversion unit 15 as data of the authentication pattern (hereinafter referred to as authentication pattern data) D16 (Fig. 20), and in the case of the verification mode, sends the pattern feature amounts to the matching unit 116 as data of the comparison pattern (hereinafter referred to as comparison pattern data) D26 (Fig. 20).
- in this way, the pattern extraction unit 114 calculates the pattern feature amounts composed of the ellipse information values of the white lumps WD and the black lumps BD, and can thereby extract the pattern (authentication pattern or comparison pattern) included in the designated area AR (Fig. 2 (A)).
- assuming the data range shown in Fig. 12 (B) as the range that the ellipse information values (center coordinates (xc, yc), major axis rda, minor axis rdb, and angle Φ between the major axis rda and the horizontal axis) of a single white lump WD or black lump BD can take, the data size of the ellipse information values for one white lump WD or black lump BD was likewise obtained from experimental results.
- the matching unit 116 sequentially matches each of the comparison lumps (white lumps WD and black lumps BD) represented by the pattern feature amounts (ellipse information values) of the comparison pattern data D26 supplied from the pattern extraction unit 114 against the authentication lumps (white lumps WD and black lumps BD) represented by the pattern feature amounts (ellipse information values) of the authentication pattern code data D2 given from the scanner unit 4.
- Fig. 22 shows the positional relationship of the ellipses represented by the ellipse information values (center coordinates (xc, yc), major axis rda, minor axis rdb, and angle Φ between the major axis rda and the horizontal axis): Erd is the ellipse of the authentication lump (shown by a broken line), Srd is the area represented by the major axis rda and minor axis rdb of the authentication lump, grd is the center represented by the center coordinates (xc, yc) of the authentication lump, E is the ellipse of the comparison lump (shown by a solid line), S is the area represented by the major axis rda and minor axis rdb of the comparison lump, and g is the center represented by the center coordinates (xc, yc) of the comparison lump. Further, d1 is the distance between the centers given by equation (11), and Φ' is the difference between the angle Φ of the major axis rda of the authentication lump to the horizontal axis and the angle Φ of the major axis rda of the comparison lump to the horizontal axis, that is, the difference in inclination between the ellipse Erd and the ellipse E (hereinafter referred to as the inter-ellipse inclination difference); the triangles in the figure indicate the comparison lumps.
- in this case, based on the ellipse information values of both the authentication lump and the comparison lump, the matching unit 116 determines whether the center grd of the authentication lump lies within the ellipse E of the comparison lump and whether the center g of the comparison lump lies within the ellipse Erd of the authentication lump.
- then, if the centers each lie within the other's ellipse, the matching unit 116 sequentially determines whether the distance d1 between the centers, the inter-ellipse inclination difference Φ', and the difference between the area Srd of the authentication lump and the area S of the comparison lump (hereinafter referred to as the lump area difference) are each equal to or less than a predetermined threshold.
- if all of these are equal to or less than their thresholds, the matching unit 116 determines that the authentication lump and the comparison lump are the same lump; otherwise, it determines that they are not the same lump.
- incidentally, as a countermeasure against erroneous determination, when the ratio of the major axis rda to the minor axis rdb of the authentication lump and the ratio of the major axis rda to the minor axis rdb of the comparison lump are both close to "1", the matching unit 116 determines that the authentication lump and the comparison lump are the same lump even if the inter-ellipse inclination difference Φrd − Φ (that is, Φ' in Fig. 23) is equal to or greater than the threshold, provided that the lump area difference is equal to or less than the threshold.
- in this way, the matching unit 116 matches the comparison pattern extracted from the coded photographic paper XPc (Fig. 2 (B)) (each comparison lump represented by the pattern feature amounts (ellipse information values)) against the authentication pattern of the original photographic paper OP stored in the authentication pattern code BC (Fig. 2 (B)) (each authentication lump represented by the pattern feature amounts (ellipse information values)).
- the matching unit 116 then determines that the coded photographic paper XPc corresponding to the comparison pattern image is the valid original photographic paper OP; at this time, it generates a copy permission command COM (Fig. 5) and sends it to the scanner unit 4 (Fig. 4).
- as a result, the reflection mode is executed in the scanner unit 4, and the print content of the original photographic paper OP (Fig. 2 (A)) placed on the platen at this time is sent to the printer unit 5 as print content image data D4.
- the print content of the original photographic paper OP (Fig. 2 (A)) is thus copied onto photographic paper in the printer unit 5.
- further, the matching unit 116 sequentially performs the join matching process and the separation matching process for each comparison lump that did not match.
- in the join matching process, adjacent comparison lumps are joined together, and the joined lumps (hereinafter referred to as comparison joined lumps) are matched against the corresponding authentication lump; in the separation matching process, a comparison lump is separated, and the separated lumps (hereinafter referred to as comparison separated lumps) are matched against the corresponding authentication lumps.
- the join matching process will first be described specifically with reference to Fig. 25; for convenience of explanation, the process of matching a comparison joined lump obtained by joining two adjacent comparison lumps against an authentication lump is described here.
- in Fig. 25, EMA (Erd) is the ellipse of the comparison joined lump (authentication lump) (shown by a solid line), and grd is the center represented by the center coordinates (xc, yc) of the authentication lump.
- in this case, the matching unit 116 determines whether the centers g1 and g2 of the comparison lumps to be joined lie within the ellipse Erd of the authentication lump (that is, within the ellipse EMA of the comparison joined lump), and if the center g1 or g2 lies within the ellipse Erd, obtains the center of gravity G (xG, yG) of the comparison joined lump and then the distance d between the center of gravity G and the center grd of the authentication lump.
- then, if the distance d between the centers is equal to or less than a predetermined threshold, the matching unit 116 determines that the comparison joined lump and the authentication lump are the same lump.
- in this way, the matching unit 116 joins comparison lumps that did not match in the matching process and matches the comparison joined lump against the authentication lump again.
- next, the separation matching process will be described with reference to Fig. 26. Fig. 26 shows, as in Fig. 22, the positional relationship of the ellipses represented by the ellipse information values (center coordinates (xc, yc), major axis rda, minor axis rdb, and angle Φ between the major axis rda and the horizontal axis): E is the ellipse of the comparison lump before separation (shown by a broken line), g is the center represented by the center coordinates (xc, yc) of the comparison lump, Es1 and Es2 (Erd1 and Erd2) are the ellipses of the comparison separated lumps (authentication lumps) (shown by solid lines), and grd1 and grd2 are the centers represented by the center coordinates (xc, yc) of the authentication lumps.
- in this case, the matching unit 116 determines whether the center grd1 or grd2 of an authentication lump lies within the ellipse E of the comparison lump before separation, and if the centers grd1 and grd2 lie within the ellipse E, obtains the center of gravity G (xG, yG) of the comparison separated lumps, which are the separation result of the comparison lump, and then the distance d between the center of gravity G and the center g of the comparison lump.
- then, if the distance d between the centers is equal to or less than a predetermined threshold, the matching unit 116 determines that the comparison separated lumps and the authentication lumps are the same lumps.
- in this way, the matching unit 116 separates comparison lumps that did not match in the matching process and matches the comparison separated lumps against the authentication lumps again.
- as described above, by correcting the lumps by joining and/or separating mutually adjacent lumps and then matching again, the matching unit 116 can eliminate the influence of changes in the imaging state, and the reliability of the matching results can thereby be significantly improved.
- when the processing contents of the third processing mode in the control unit 2 are functionally classified, as shown in Fig. 27, in which the same reference numerals are assigned to parts corresponding to those in Fig. 5, the processing contents of the low-frequency component extraction unit 11, the image separation unit 12, the area dividing unit 13 and the two-dimensional code conversion unit 15 are the same as in the first processing mode described above, but the processing contents of the pattern extraction unit 214 and the matching unit 216 differ from those in the first processing mode.
- The pattern feature extraction unit 214 differs from the pattern feature extraction unit 14, which approximates the white lumps WD and black lumps BD to rectangular shapes, in that it approximates each white lump WD (WD1 to WDn) and each black lump BD (BD1 to BDn) to a circular shape.
- In practice, the pattern feature extraction unit 214 calculates the first- and second-order image moments MA_00, MA_10 and MA_01 according to equation (7) for each white lump WD (WD1 to WDn) of the white lump data D14 supplied from the area division unit 13.
- The pattern feature extraction unit 214 then calculates the center coordinates (x_c, y_c) of each white lump WD from the corresponding image moments MA_00, MA_10 and MA_01 according to equation (9), and calculates the radius rd according to equation (13).
- Similarly, for each black lump BD (BD1 to BDn) of the black lump data D15 supplied from the area division unit 13, the pattern feature extraction unit 214 calculates the circle information values (center coordinates (x_c, y_c) and radius rd) according to equations (11) to (13), just as for the white lumps WD (WD1 to WDn).
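- As an illustrative sketch (not the patent's literal equations (7), (9) and (13)), the circle information value of a lump can be computed from the standard image moments, with the radius chosen so that a circle of radius rd has the same area as the lump:

```python
import math

def circle_info(pixels):
    """Compute the circle information value (center (xc, yc), radius rd)
    of a lump given as a list of (x, y) pixel coordinates.

    Assumes the standard image moments: MA00 = number of pixels,
    MA10 = sum of x, MA01 = sum of y; the equal-area radius is an
    assumption standing in for the patent's equation (13)."""
    ma00 = len(pixels)                  # area moment
    ma10 = sum(x for x, _ in pixels)    # first moment in x
    ma01 = sum(y for _, y in pixels)    # first moment in y
    xc = ma10 / ma00                    # center x
    yc = ma01 / ma00                    # center y
    rd = math.sqrt(ma00 / math.pi)      # radius of the equal-area circle
    return xc, yc, rd
```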
- In the verification mode, this pattern feature data is sent to the collation unit 216 as comparison pattern feature data D26 (FIG. 27).
- In this way, the pattern feature extraction unit 214 can significantly reduce the computational load compared with the pattern feature extraction unit 14 (FIG. 5), which calculates rectangle information values, and the pattern feature extraction unit 114 (FIG. 20), which calculates ellipse information values.
- Moreover, since one circle information value (center coordinates (x_c, y_c) and radius rd) suffices for one white lump WD or black lump BD, and assuming the same data range as in FIG. 12(A), the data size of the circle information value for one white lump WD or black lump BD could, as is clear from the experimental results shown in FIG. 29, be reduced by about 24 [bit] compared with the data size of the rectangle information value (ellipse information value) (FIG. 12(A)).
- Accordingly, as can be seen from the numbers of lumps (white lumps and black lumps), the pattern feature extraction unit 214 can significantly reduce the data amount of the pattern feature data D16 and D26.
- As a result, the pattern feature extraction unit 214 makes it possible to print the authentication pattern code BC based on the pattern feature data D16 onto the original photographic paper OP more quickly, further reducing the waiting time.
- In addition, since the pattern feature extraction unit 214 can generate the pattern feature data D16 and D26 more quickly, the waiting time at the time of pattern collation can be shortened and the printed content of the original photographic paper can be reproduced earlier.
- On the other hand, the collation unit 216 collates the white lumps WD and black lumps BD represented by the pattern feature amounts (circle information values) of the comparison pattern feature data D26 supplied from the pattern feature extraction unit 214 with the corresponding authentication lumps.
- FIG. 30 shows the positional relationship of circles represented by circle information values (center coordinates (x_c, y_c) and radius rd). C_rd is the circle of the authentication lump (indicated by the broken line), S_rd is the area represented on the basis of the radius rd of the authentication lump, and g_rd is the center represented by the center coordinates (x_c, y_c) of the authentication lump; C is the circle of the comparison lump (indicated by the solid line), S is the area represented on the basis of the radius rd of the comparison lump, and g is the center represented by the center coordinates (x_c, y_c) of the comparison lump.
- d2 indicates the distance between the centers of the authentication lump and the comparison lump, calculated in the same manner as in equation (11), and the triangle in the figure indicates the comparison lump.
- In FIG. 30, the collation unit 216 determines, on the basis of the circle information values of both the authentication lump and the comparison lump, whether or not the center g_rd of the authentication lump exists within the circle C of the comparison lump and whether or not the center g of the comparison lump exists within the circle C_rd of the authentication lump.
- If both conditions hold, the collation unit 216 then sequentially determines whether or not the center distance d2 and the difference between the area S_rd of the authentication lump and the area S of the comparison lump are smaller than predetermined threshold values.
- If all of these are equal to or less than the threshold values, the collation unit 216 determines that the authentication lump and the comparison lump are the same lump; otherwise, it determines that they are not the same lump.
- In this way, since the collation unit 216 performs the collation processing on the basis of only the center coordinates (x_c, y_c) and the radius rd, the identity of lumps (authentication lumps and comparison lumps) can be determined without the processing for preventing the erroneous determination described above with reference to FIG. 23, thereby reducing both the load of the collation processing and the erroneous determination rate.
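- The circle-based identity decision described above can be sketched as follows; the threshold parameters and the use of π·rd² as the lump area are illustrative assumptions:

```python
import math

def same_lump(auth, comp, d_thresh, s_thresh):
    """Decide whether an authentication lump and a comparison lump are
    the same, each given as (xc, yc, rd). The thresholds d_thresh and
    s_thresh are illustrative parameters, not values from the patent."""
    (xa, ya, ra), (xb, yb, rb) = auth, comp
    d2 = math.hypot(xa - xb, ya - yb)       # center distance
    # Each center must fall inside the other lump's circle.
    if d2 > ra or d2 > rb:
        return False
    # Center distance and area difference must both be within threshold.
    area_diff = abs(math.pi * ra ** 2 - math.pi * rb ** 2)
    return d2 <= d_thresh and area_diff <= s_thresh
```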
- In practice, the collation unit 216 collates the comparison pattern (each comparison lump represented by the pattern feature amounts (circle information values)) extracted from the coded photographic paper XPc (FIG. 2(B)) with the authentication pattern (each authentication lump represented by the pattern feature amounts (circle information values)) of the original photographic paper OP stored in the authentication pattern code BC (FIG. 2(B)).
- When they match, the collation unit 216 determines that the coded photographic paper XPc corresponding to the comparison pattern image is the valid original photographic paper OP, generates the copy permission command COM at this time, and sends it to the scanner unit 4 (FIG. 4).
- As a result, the reflection mode is executed in the scanner unit 4, and the print content of the original photographic paper OP (FIG. 2(A)) placed on the platen at this time is sent to the printer unit 5 as print content image data D4.
- Thus, the print content of the original photographic paper OP (FIG. 2(A)) is duplicated onto photographic paper in the printer unit 5.
- On the other hand, as in the first or second processing mode, the collation unit 216 sequentially executes the combination collation processing and the separation collation processing for each of the comparison lumps that did not match.
- The combination collation processing will be described first.
- The collation processing between a comparison combined lump, obtained by combining two adjacent comparison lumps, and the authentication lump is described with reference to FIG. 31.
- FIG. 31 shows the positional relationship of circles represented by circle information values (center coordinates (x_c, y_c) and radius rd), as in the case of FIG. 30. C1 and C2 are the circles of the comparison lumps (indicated by the broken lines), g1 and g2 are the centers represented by the center coordinates (x_c, y_c) of the comparison lumps, C_MA (C_rd) is the circle of the comparison combined lump (authentication lump) (indicated by the solid line), and g_rd is the center represented by the center coordinates (x_c, y_c) of the authentication lump.
- G indicates the centroid G (x_G, y_G) of the comparison combined lump, calculated in the same manner as in equation (12), and d2 indicates the center distance between the centroid G of the comparison combined lump and the center g_rd of the authentication lump, calculated in the same manner as in equation (11). The rectangle in the figure indicates the separated comparison lumps and the comparison combined lump obtained by combining them.
- In FIG. 31, the collation unit 216 determines whether or not the centers g1 and g2 of the comparison lumps to be combined exist within the circle C_rd of the authentication lump (that is, the circle C_MA of the comparison combined lump), and if the centers g1 and g2 exist within the circle C_rd, obtains the centroid G (x_G, y_G) of the comparison combined lump and the center distance d2 between the centroid G and the center g_rd of the authentication lump.
- If the center distance d2 is smaller than a predetermined threshold value, the collation unit 216 determines that the comparison combined lump and the authentication lump are the same lump.
- In this way, the collation unit 216 combines the comparison lumps that did not match in the collation processing and collates the combined comparison lump again with the authentication lump.
- FIG. 32 shows the positional relationship of circles represented by circle information values (center coordinates (x_c, y_c) and radius rd). C is the circle of the comparison lump before separation (indicated by the broken line), g is the center represented by the center coordinates (x_c, y_c) of the comparison lump, C_S1 and C_S2 (C_rd1 and C_rd2) are the circles of the separated comparison lumps (authentication lumps) (indicated by the solid lines), and g_rd1 and g_rd2 are the centers represented by the respective center coordinates (x_c, y_c) of the authentication lumps.
- G indicates the centroid G (x_G, y_G) of the separated comparison lump, calculated in the same manner as in equation (12), and d2 indicates the center distance between the centroid G of the separated comparison lump and the center g of the comparison lump, calculated in the same manner as in equation (11).
- The rectangle in the figure indicates the comparison lump before separation and the separated comparison lumps obtained from it.
- In FIG. 32, the collation unit 216 determines whether or not the center g_rd1 or g_rd2 of the authentication lumps exists within the circle C of the comparison lump before separation. If the centers g_rd1 and g_rd2 exist within the circle C, the collation unit 216 obtains the centroid G (x_G, y_G) of each separated comparison lump resulting from the separation of the comparison lump, and the center distance d between the centroid G and the center g_rd of the corresponding authentication lump.
- If the center distance d is smaller than a predetermined threshold value, the collation unit 216 determines that the separated comparison lump and the authentication lump are the same lump.
- In this way, the collation unit 216 separates the comparison lumps that did not match in the collation processing and collates the separated comparison lumps again with the authentication lumps.
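- The combination collation can be sketched as follows; weighting the centroid by circle area is an assumption standing in for the pixel-based centroid of equation (12), and the threshold is illustrative:

```python
import math

def join_collate(comp_lumps, auth, d_thresh):
    """Combination collation sketch: comp_lumps is a list of comparison
    lumps (xc, yc, rd) that individually failed to match; auth is an
    authentication lump (xc, yc, rd)."""
    xa, ya, ra = auth
    # Every center of the lumps to be combined must lie inside C_rd.
    if any(math.hypot(x - xa, y - ya) > ra for x, y, _ in comp_lumps):
        return False
    # Area-weighted centroid G of the comparison combined lump.
    areas = [math.pi * r ** 2 for _, _, r in comp_lumps]
    total = sum(areas)
    gx = sum(a * x for a, (x, _, _) in zip(areas, comp_lumps)) / total
    gy = sum(a * y for a, (_, y, _) in zip(areas, comp_lumps)) / total
    # Same lump if the centroid is close enough to the authentication center.
    return math.hypot(gx - xa, gy - ya) <= d_thresh
```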
- Thus, by correcting the lumps through combination and/or separation of adjacent lumps and then collating again, the collation unit 216 eliminates the influence of changes in the imaging state and significantly improves the reliability of the collation results.
- The area division unit 313 is the same as the area division unit 13 in that, in both the code printing mode and the verification mode, it generates the white component pattern image data D12 supplied from the image separation unit 12 as white lump data D14 and generates the black component pattern image data D13 supplied from the image separation unit 12 as black lump data D15.
- However, the destination of the white lump data D14 and the black lump data D15 differs between the area division unit 313 and the area division unit 13.
- That is, whereas the area division unit 13 sends the white lump data D14 and the black lump data D15 directly to the pattern feature extraction unit 14 in both the code printing mode and the verification mode, the area division unit 313 sends them to the pattern feature extraction unit 314 in the code printing mode and to the collation unit 316 in the verification mode.
- The pattern feature extraction unit 314 differs from the pattern feature extraction unit 14, which approximates the white lumps WD and black lumps BD to rectangular shapes, in that it approximates the shapes of the white lumps WD (WD1 to WDn) and black lumps BD (BD1 to BDn) themselves. In practice, for each white lump WD (WD1 to WDn) and each black lump BD (BD1 to BDn), the pattern feature extraction unit 314 determines control point sequences for generating Bezier curves on the basis of points on the outer periphery of the lump (hereinafter referred to as lump outer peripheral points), and extracts these control point sequences as the pattern feature.
- Specifically, the pattern feature extraction unit 314 calculates, on the basis of the number of pixels, the total area (hereinafter referred to as the total lump area) of the white lumps WD (WD1 to WDn) of the white lump data D14 and the black lumps BD of the black lump data D15 supplied from the area division unit 313, and, from the correspondences between total lump area (number of pixels), grid size of the square lattice and order of the Bezier curve stored in advance in the internal memory, switches to the grid size and the order of the Bezier curve corresponding to the detected number of pixels.
- The pattern feature extraction unit 314 then applies the square lattice of the grid size switched at this time to the white lump data D14 and the black lump data D15, and determines, for each white lump WD and black lump BD in the designated area AR, a control point sequence consisting of n + 1 control points for generating the n-th order Bezier curve switched at this time.
- In practice, the pattern feature extraction unit 314 recognizes the intersections P1 to P12 of the square lattice with the outer periphery of the white lump WD1 as control points.
- With the control point P1 as the first starting point, four adjacent control points at a time are sequentially selected as the control point sequences P1 to P4, P4 to P7, P7 to P10 and P10 to P12.
- At this time, the pattern feature extraction unit 314 selects the end point of each of the control point sequences P1 to P4, P4 to P7 and P7 to P10 (the control points P4, P7 and P10) as the starting point of the next control point sequence, and selects the last control point sequence P10 to P12, with its three control points, as it is.
- If the control point sequences P1 to P4, P4 to P7, P7 to P10 and P10 to P12 were determined as they are as the control point sequences for the white lump WD1, the Bezier curves generated from the control point sequences P1 to P12 would run inside or outside the outer periphery of the white lump, so that a lump extremely different from the actual white lump would be obtained.
- Therefore, the pattern feature extraction unit 314 corrects the control points between the start point and the end point of each of the control point sequences P1 to P4, P4 to P7 and P7 to P10 (hereinafter referred to as intermediate control points), namely P2 and P3, P5 and P6, and P8 and P9, as well as the intermediate control points P11 and P12 between the last control point sequence and the starting point P1, all of which are shifted inward or outward from the outer periphery of the white lump WD1.
- Specifically, the pattern feature extraction unit 314 drops perpendiculars from the intermediate control points P2 and P3 onto the line segment P1-P4, detects the points C2 and C3 corresponding to the intersection points Q2 and Q3 with the line segment P1-P4, and determines the detected points C2 and C3, together with the control points P1 and P4, as the control point sequence P1-C2-C3-P4.
- The pattern feature extraction unit 314 likewise obtains the points C5 and C6, C8 and C9, and C11 and C12 for the other intermediate control points P5 and P6, P8 and P9, and P11 and P12 in the same manner as for the intermediate control points P2 and P3, and from the obtained points and the corresponding control points P4 and P7, P7 and P10, and P10 and P1 determines the control point sequences P4-C5-C6-P7, P7-C8-C9-P10 and P10-C11-C12-P1, respectively.
- In this way, the pattern feature extraction unit 314 determines the control point sequences P1-C2-C3-P4, P4-C5-C6-P7, P7-C8-C9-P10 and P10-C11-C12-P1.
- Control point sequences for generating the third-order Bezier curves are determined for the other lumps in the same manner as in the case of the white lump WD1.
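- The grouping of lump outer peripheral points into overlapping control point sequences (P1 to P4, P4 to P7, P7 to P10, P10 to P12) can be sketched as follows; the function name and the handling of the final shorter run are assumptions:

```python
def control_point_runs(points, order=3):
    """Split outer peripheral points P1..Pm into overlapping
    control-point runs for Bezier segments of the given order: each run
    has order + 1 points and the end point of one run is the start point
    of the next, as in the sequences P1..P4, P4..P7, P7..P10, P10..P12.
    The final run simply keeps whatever points remain."""
    runs, i = [], 0
    while i < len(points) - 1:
        runs.append(points[i:i + order + 1])
        i += order                  # reuse the end point as the next start
    return runs
```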
- The pattern feature extraction unit 314 generates the control point sequences of the white lumps WD (WD1 to WDn) and black lumps BD (BD1 to BDn) determined in this manner as authentication pattern feature data D16 (FIG. 5), and sends it to the two-dimensional code conversion unit 15.
- In this way, by correcting the intermediate control points (P2 and P3, P5 and P6, P8 and P9, and P11 and P12) of the selected control point sequences (P1 to P4, P4 to P7, P7 to P10, P10 to P12), which are shifted inward or outward from the outer periphery of the white lump, the pattern feature extraction unit 314 can determine control point sequences (P1-C2-C3-P4, P4-C5-C6-P7, P7-C8-C9-P10, P10-C11-C12-P1) that generate a lump approximating the actual white lump WD1 more closely.
- According to the experimental results, the data size of the authentication pattern feature data D16 is 32k(n + 1) [bit].
- Therefore, even when an existing two-dimensional code is applied, the two-dimensional code conversion unit 15 can appropriately convert the control point sequences of the authentication pattern feature data D16 supplied from the pattern feature extraction unit 314 described above into the authentication pattern code data D2.
- In the verification mode, the collation unit 316 performs the collation processing on the basis of the authentication pattern code data D2 obtained by reading, with the scanner unit 4, the authentication pattern code BC printed in the designated area AR of the coded photographic paper XPc (FIG. 2(B)), and on the basis of the processing results (white lump data D14 and black lump data D15) of the low-frequency component extraction processing, the image separation processing and the area division processing performed on the coded pattern image data D3 read at this time.
- The collation unit 316 restores the authentication pattern feature data D16 by performing inverse two-dimensional code conversion processing on the supplied authentication pattern code data D2, and, on the basis of the control point sequences of this data D16, generates white lumps corresponding to the original white lumps WD (hereinafter referred to as reconstructed white lumps) and black lumps corresponding to the original black lumps BD (hereinafter referred to as reconstructed black lumps).
- Here, the method of reconstructing the reconstructed white lumps and the reconstructed black lumps will be described specifically, taking as an example the case where a reconstructed white lump corresponding to the white lump described above with reference to FIG. 34 is reconstructed.
- That is, for the white lump (the hatched area surrounded by broken lines in FIG. 36) extracted by the pattern feature extraction unit 314, the collation unit 316 generates the Bezier curves Bc1, Bc2, Bc3 and Bc4 corresponding to the control point sequences P1-C2-C3-P4, P4-C5-C6-P7, P7-C8-C9-P10 and P10-C11-C12-P1, respectively.
- The collation unit 316 then generates the reconstructed white lump by filling the area surrounded by the Bezier curves Bc1, Bc2, Bc3 and Bc4 with a predetermined single luminance value.
- In this manner, the collation unit 316 generates the reconstructed white lump corresponding to the white lump.
- Incidentally, the n-th order Bezier curve is defined by the control points CP and the Bernstein basis functions.
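- The Bezier-curve generation and region filling described above can be sketched as follows. This is an illustrative reconstruction (the function names, the even-odd fill and the sampling density are assumptions, not the patent's implementation); each segment is evaluated with the Bernstein basis and the enclosed area is filled with a single value:

```python
from math import comb

def bezier(ctrl, t):
    """Evaluate an n-th order Bezier curve with control points ctrl
    (list of (x, y)) at parameter t, using the Bernstein basis."""
    n = len(ctrl) - 1
    bx = sum(comb(n, i) * (1 - t) ** (n - i) * t ** i * x
             for i, (x, _) in enumerate(ctrl))
    by = sum(comb(n, i) * (1 - t) ** (n - i) * t ** i * y
             for i, (_, y) in enumerate(ctrl))
    return bx, by

def fill_outline(runs, w, h, samples=64):
    """Reconstruct a lump mask: sample the Bezier segments defined by
    the control-point runs into a closed polygon, then mark the pixels
    inside it (the 'single luminance value') via an even-odd test."""
    poly = [bezier(run, i / samples)
            for run in runs for i in range(samples)]
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            inside, j = False, len(poly) - 1
            for i in range(len(poly)):        # even-odd crossing rule
                xi, yi = poly[i]
                xj, yj = poly[j]
                if (yi > y + 0.5) != (yj > y + 0.5) and \
                   x + 0.5 < (xj - xi) * (y + 0.5 - yi) / (yj - yi) + xi:
                    inside = not inside
                j = i
            mask[y][x] = inside
    return mask
```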
- In the same way as for the white lump WD1, the collation unit 316 generates reconstructed white lumps corresponding to the white lumps WD2 to WDn on the basis of the control point sequences for the white lumps WD2 to WDn in the authentication pattern code data D2, and also generates reconstructed black lumps corresponding to the black lumps BD on the basis of the control point sequences for the black lumps BD (BD1 to BDn) in the authentication pattern code data D2.
- The collation unit 316 then calculates the phase-only correlation value Cp between the reconstructed white lumps generated in this manner and the white lumps WD (WD1 to WDn) of the white lump data D24 supplied from the area division unit 313 at this time, and between the reconstructed black lumps and the black lumps BD (BD1 to BDn) of the black lump data D25 supplied from the area division unit 313 at this time, according to the following equation (16).
- Here, the pixels of the reconstructed white lumps and reconstructed black lumps are denoted RD(x, y), the pixels of the corresponding white lumps WD and black lumps BD are denoted D(x, y), the two-dimensional Fourier transform is denoted F, and the two-dimensional inverse Fourier transform is denoted F^-1.
- When a phase-only correlation value Cp equal to or less than a predetermined threshold value is obtained, the collation unit 316 determines that the coded photographic paper XPc (FIG. 2) placed on the mounting table of the scanner unit 4 at this time is a copy, and notifies at this time that copying is prohibited via the display unit (not shown) of the scanner unit 4.
- In contrast, when a phase-only correlation value Cp higher than the predetermined threshold value is obtained, the collation unit 316 determines that the coded photographic paper XPc (FIG. 2) placed on the mounting table of the scanner unit 4 at this time is the valid original photographic paper OP, generates the copy permission command COM at this time, and sends it to the scanner unit 4 (FIG. 4).
- As a result, the reflection mode is executed in the scanner unit 4, and the print content of the coded photographic paper XPc (original photographic paper OP) placed on the platen at this time is sent to the printer unit 5 as print content image data D4.
- Thus, the print content of the original photographic paper OP (FIG. 2(A)) is duplicated onto photographic paper in the printer unit 5.
- Incidentally, the phase-only correlation has the feature of appearing as a sharp peak when correlation is present (FIG. 38(A)), and the collation unit 316 notifies the calculated phase-only correlation result via the display unit (not shown).
- This allows the collation unit 316 to present the degree of the phase-only correlation result (the degree of validity) visually and in an easily understandable manner.
- In the above configuration of the fourth processing mode, a plurality of control points for generating Bezier curves approximating the outer peripheries of the white lumps WD and the black lumps BD are extracted as the pattern feature, and this pattern feature is stored on the original photographic paper OP as authentication information.
- Then, on the basis of the pattern feature stored on the coded photographic paper XPc, the reconstructed white lumps and reconstructed black lumps corresponding to the white lumps WD and the black lumps BD are generated, and the validity of the original photographic paper OP is verified using the reconstructed white lumps and the reconstructed black lumps.
- That is, in the fourth processing mode, a plurality of control points close to the outer peripheries of the white lumps WD and the black lumps BD constituting the pattern contained in the low-frequency component pattern image IM2 are extracted as the pattern feature.
- Accordingly, the pattern consisting of the reconstructed white lumps and reconstructed black lumps generated from the control points can be reproduced remarkably accurately as a pattern substantially identical to the pattern contained in the original low-frequency component pattern image IM2 (FIG. 6(B)), and as a result the accuracy of validity verification can be improved.
- Furthermore, the pattern consisting of the reconstructed white lumps and reconstructed black lumps generated from these control points can be reproduced with even higher accuracy as a pattern substantially identical to the pattern contained in the original low-frequency component pattern image IM2 (FIG. 6(B)), and as a result the accuracy of validity verification can be further improved.
- In addition, in the fourth processing mode, a number of control points corresponding to the total lump area of the white lumps WD and the black lumps BD is extracted as the pattern feature.
- Accordingly, since the pattern feature to be stored on the original photographic paper OP can be obtained with a substantially constant data size, the pattern feature can be appropriately stored on the original photographic paper OP regardless of the area division result (the total area of the white lumps WD and black lumps BD).
- According to the above configuration, a plurality of points for generating a curve approximating the contour of an area are determined on the basis of points on the contour of that area, and these points are extracted as pattern information by the extraction means, with the grid size and the order of the Bezier curve switched according to the total area of the white lumps WD and the black lumps BD.
- However, the switching may instead be performed according to the largest lump area among the white lumps WD and the black lumps BD, and the grid size and/or the order of the Bezier curve may be set as fixed values.
- In the above embodiment, the points at which the square lattice intersects the outer peripheries of the white lumps WD and the black lumps BD are taken as the lump outer peripheral points.
- However, a reference point on the outer periphery of the white lump WD or black lump BD may instead be determined, the points at which a circle centered on this reference point intersects the outer periphery may be taken as the lump outer peripheral points, and the intersections with a circle centered on each such point may in turn be taken as the next outer peripheral points of the white lump WD or black lump BD.
- In this case, the diameter or radius of the circle may be switched according to the total area of the white lumps WD and the black lumps BD.
- Further, in the above embodiment, the control point sequences for generating the Bezier curves are extracted on the basis of points on the outer peripheries of the white lumps WD and the black lumps BD, but the control point sequences may also be extracted on the basis of points on the contours of the white lumps WD and the black lumps BD.
- In this way, as shown in FIG. 39, even when a doughnut-shaped area is obtained as a white lump WD (or black lump BD) by the area division, a control point sequence that can faithfully reproduce the shape of the lump can be extracted by the same method as described above with reference to FIG. 34.
- The points can be extracted not only by the method described above with reference to FIG. 34 but also by various other methods.
- Further, in the above embodiment, control point sequences for generating Bezier curves are extracted as pattern information, but control point sequences for generating various other curves such as rational Bezier curves, B-spline curves or rational B-spline curves may be extracted as pattern information. In this case, the same effect as in the above-described fourth processing mode can be obtained.
- Further, in the above embodiment, as the verification means for verifying the validity of the photographic paper on the basis of the pattern information stored by the storage means, Bezier curves are generated on the basis of the control point sequences, the areas surrounded by the Bezier curves are filled with a predetermined single luminance value to generate the reconstructed white lumps and reconstructed black lumps, and verification is performed using the reconstructed white lumps and reconstructed black lumps by the phase-only correlation.
- However, various other curves such as rational Bezier curves, B-spline curves or rational B-spline curves may be generated on the basis of the control point sequences, or the verification may be performed by a method other than the phase-only correlation. In this case, the same effect as in the above-described fourth processing mode can be obtained.
- Next, the case where the control unit 2 executes the above-described first to fourth processing modes in accordance with the authentication processing procedure RT will be described.
- That is, the control unit 2 starts this authentication processing procedure RT in step SP0, and in the following step SP1 waits for a print command to print the authentication pattern code BC (FIG. 2(A)) or for a copy command.
- When receiving a print command from an operation unit (not shown) in step SP1, the control unit 2 controls the scanner unit 4 to acquire the original pattern image data D1 in the following step SP2. Then, in the next step SP3, the control unit 2 performs the low-frequency component extraction processing on the original pattern image data D1 to generate the low-frequency pattern image data D11 representing the low-frequency component pattern image IM2 (FIG. 7(A)), and in the following step SP4 performs the image separation processing on the low-frequency pattern image data D11 to generate white component pattern image data D12 representing the white component pattern image WIM (FIG. 7(B)) and black component pattern image data D13 representing the black component pattern image BIM (FIG. 7(C)).
- In the following step SP5, the control unit 2 performs the area division processing on the white component pattern image data D12 and the black component pattern image data D13, thereby generating white lump data D14 representing a plurality of white lumps WD (WD1 to WDn) and black lump data D15 representing a plurality of black lumps BD (BD1 to BDn).
- In the following step SP7, on the basis of threshold values set in advance as switching criteria for switching the processing content of the pattern feature extraction processing, the control unit 2 generates the authentication pattern feature data D16 by selectively performing one of the pattern feature extraction processing in the first (or second) processing mode, the pattern feature extraction processing in the third processing mode, and the pattern feature extraction processing in the fourth processing mode.
- As the threshold values, a first threshold value (hereinafter referred to as the low threshold value), a second threshold value larger than the first (hereinafter referred to as the medium threshold value), and a third threshold value larger than the second (hereinafter referred to as the high threshold value) are set. If the area of the lump to be processed is within a first range, equal to or larger than the low threshold value and smaller than the medium threshold value, the control unit 2 performs the pattern feature extraction processing in the third processing mode to generate data representing the circle information values as the pattern feature.
- If the area of the lump is within a second range, equal to or larger than the medium threshold value and smaller than the high threshold value, the control unit 2 performs the pattern feature extraction processing in the first or second processing mode to generate data representing the rectangle information values (ellipse information values) as the pattern feature.
- If the area of the lump is equal to or larger than the high threshold value, the control unit 2 performs the pattern feature extraction processing in the fourth processing mode to generate data representing the control point sequences for generating Bezier curves as the pattern feature.
- The data generated in this manner for each of the white lumps WD and the black lumps BD is obtained as the authentication pattern feature data D16.
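- The threshold-based switching described above can be sketched as follows; the mode names, the handling of lumps below the low threshold value, and the half-open ranges are assumptions for illustration:

```python
def select_processing_mode(lump_area, low_t, mid_t, high_t):
    """Choose the pattern-feature extraction mode from the lump area,
    following the three thresholds of the procedure: circle information
    values (third mode) for small lumps, rectangle/ellipse information
    values (first or second mode) for medium lumps, and Bezier control
    point sequences (fourth mode) for large lumps. Lumps below the low
    threshold are treated as 'skip' here, which is an assumption."""
    if lump_area < low_t:
        return "skip"
    if lump_area < mid_t:
        return "circle"           # third processing mode
    if lump_area < high_t:
        return "rect_or_ellipse"  # first or second processing mode
    return "bezier"               # fourth processing mode
```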
- In the following step SP8, the control unit 2 generates the authentication pattern code data D2 by performing the two-dimensional code conversion processing on the authentication pattern feature data D16, and then, in step SP9, controls the printer unit 5 to print the authentication pattern code BC based on the authentication pattern code data D2 on the photographic paper, and returns to step SP1.
- On the other hand, when receiving a copy command from an operation unit (not shown) in step SP1, the control unit 2 controls the scanner unit 4 in the following step SP10 to acquire the coded pattern image data D3, and then, in the next step SP11, performs on the coded pattern image data D3 the same image processing as on the original pattern image data D1.
- that is, the control unit 2 sequentially performs the low-frequency component extraction processing, the image separation processing, and the area division processing of steps SP3 to SP5 on the code-added pattern image data D3. Thereafter, as in step SP6, it individually calculates the areas of the white lumps WD and the black lumps BD based on the white lump data D14 and the black lump data D15 obtained as a result of the area division processing. At this time, only when the area of a lump is within the first or second range, the control unit 2 generates comparative pattern data D26 by executing the pattern extraction processing in the preset first or second processing mode, in the same manner as in step SP7 described above.
- the comparative pattern data D26 generated for the lumps in the first and second ranges, and the white lump data D14 and the black lump data D15 generated for the lumps in the third range, are obtained as objects of comparison with the authentication pattern code BC printed on the photographic paper.
- in step SP12, the control unit 2 controls the scanner unit 4 to read the authentication pattern code data D2 based on the authentication pattern code BC printed on the photographic paper.
- in step SP13, the authentication pattern code data D2 is collated with the corresponding white lump data D14, black lump data D15 and comparative pattern data D26, and in the following step SP14 the control unit 2 controls the printer unit 5 in accordance with the collation result so as to duplicate the print content of the photographic paper (or not), and returns to step SP1. In this way, the control unit 2 is made able to execute the above-described first to fourth processing modes.
- since the control unit 2 extracts a large lump, which is an important feature of the pattern, as a detailed pattern, it can emphasize and extract a particularly characteristic portion of the pattern while shortening the extraction time for the other lumps. Alternatively, a lump classified as medium-sized may be extracted as a detailed pattern; in this case, the features of the pattern are extracted evenly on average while the extraction time is further reduced.
- in the first embodiment described above, the pattern image is subdivided into lumps, and the feature amount of each lump is extracted as the pattern.
- the second embodiment differs in that the pattern image is not subdivided; instead, the pattern image is captured as a whole and the pattern is extracted from it.
- the control unit 2 can be divided functionally into a low-frequency component extraction unit 411 for extracting a pattern image of low-frequency components (hereinafter referred to as a low-frequency pattern image) from the pattern image, a pattern extraction unit 412 for extracting the pattern in the low-frequency pattern image, a two-dimensional code conversion unit 413 for converting the pattern into a two-dimensional barcode, an image reconstruction unit 414 for reconstructing a low-frequency pattern image from the pattern (hereinafter referred to as a reconstructed low-frequency pattern image), and a collating unit 415 which verifies the validity of the code-added photographic paper XPc (FIG. 2(B)).
- the control unit 2 sequentially applies various processes to the original pattern image data D1 given from the scanner unit 4 via the low-frequency component extraction unit 411, the pattern extraction unit 412 and the two-dimensional code conversion unit 413, and sends the resulting pattern code data D2 to the printer unit 5.
- on the other hand, a matching process is performed in the matching unit 415 based on the low-frequency component extraction result of the low-frequency component extraction unit 411 for the code-added pattern image data D3 given from the scanner unit 4, and on the image reconstruction result of the image reconstruction unit 414 for the pattern code data D2 given from the scanner unit 4.
- hereinafter, the low-frequency component extraction processing by the low-frequency component extraction unit 411, the pattern extraction processing by the pattern extraction unit 412, the two-dimensional code conversion processing by the two-dimensional code conversion unit 413, the image reconstruction processing by the image reconstruction unit 414, and the matching processing by the matching unit 415 will be described in detail.
- the low-frequency component extraction unit 411 acquires the region pattern image IM1 (FIG. 6(A)) of the designated area AR (FIG. 2) from the pattern image of the original photographic paper OP (FIG. 2(A)) or the code-added photographic paper XPc (FIG. 2(B)), and extracts the low-frequency component pattern image IM2 (FIG. 6(B)) from this region pattern image IM1.
- the low-frequency component extraction unit 411 sends the low-frequency pattern image data D411 generated at this time to the pattern extraction unit 412 and, at the time of collation, to the collating unit 415.
- since the low-frequency component extraction unit 411 removes the high-frequency components of the image, it can remove various noise components such as the noise of the solid-state image sensor in the scanner unit 4.
- the low-frequency component extraction unit 411 can thereby avoid a decrease in the pattern extraction accuracy of the pattern extraction unit 412 caused by such noise components.
- as a result, the reliability of the collation result in the collating unit 415 can be improved.
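The patent does not specify which low-pass filter the low-frequency component extraction unit 411 uses. As a stand-in assumption, a simple box blur illustrates the idea of suppressing high-frequency sensor noise while keeping the coarse pattern; the function name and window size are illustrative only.

```python
def box_blur(image, radius=1):
    """Crude low-pass filter: mean over a (2*radius+1)^2 window,
    clamped at the image borders. `image` is a list of rows of
    luminance values; returns a new blurred image."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        total += image[yy][xx]
                        count += 1
            out[y][x] = total / count  # local mean = low-frequency part
    return out
```

A single-pixel noise spike is spread out and attenuated, which is the effect the text attributes to the low-frequency extraction.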
- the pattern extraction unit 412 detects the pixels having locally lowest luminance values on the curved surface formed by the luminance values of the low-frequency component pattern image IM2 (hereinafter referred to as minimum points) PS (PS1 to PSn) and the pixels having locally highest luminance values on that surface (hereinafter referred to as maximum points) PL (PL1 to PLn).
- it also calculates the average of the luminance values in the low-frequency component pattern image IM2 (hereinafter referred to as the luminance average).
- each minimum point PS (black circle in FIG. 42) is a pixel located at the approximate center of a set of mutually adjacent pixels having a luminance equal to or lower than a predetermined low value (hereinafter referred to as white pixels).
- each maximum point PL (black triangle in FIG. 42) is a pixel located at the approximate center of a set of mutually adjacent pixels having a luminance equal to or higher than a predetermined high value (hereinafter referred to as black pixels). These minimum points PS and maximum points PL thus represent the characteristic points of the pattern included in the low-frequency component pattern image IM2.
- the detection results of the minimum points PS and the maximum points PL therefore represent the pattern in the region pattern image IM1 (FIG. 6(A)), that is, the characteristic pattern included in the region pattern image IM1.
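A minimal sketch of extremum detection on the luminance surface. Note the simplification: the patent locates each point at the approximate center of a connected set of white or black pixels, whereas this sketch uses a plain strict 8-neighbour test; the function name is an assumption.

```python
def find_extrema(image):
    """Return (minima, maxima): pixel coordinates whose luminance is
    strictly below / above all of their 8-neighbours on the luminance
    surface, as rough analogues of the minimum points PS and maximum
    points PL."""
    h, w = len(image), len(image[0])
    minima, maxima = [], []
    for y in range(h):
        for x in range(w):
            v = image[y][x]
            neigh = [image[yy][xx]
                     for yy in range(max(0, y - 1), min(h, y + 2))
                     for xx in range(max(0, x - 1), min(w, x + 2))
                     if (yy, xx) != (y, x)]
            if all(v < n for n in neigh):
                minima.append((x, y))      # candidate minimum point PS
            elif all(v > n for n in neigh):
                maxima.append((x, y))      # candidate maximum point PL
    return minima, maxima
```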
- specifically, based on the low-frequency pattern image data D411 supplied from the low-frequency component extraction unit 411, the pattern extraction unit 412 recognizes the horizontal direction of the low-frequency component pattern image IM2 as the x axis, the vertical direction as the y axis, and the luminance value as the z axis.
- FIG. 43 shows the spatial state formed by the luminance values of the low-frequency component pattern image IM2: FIG. 43(A) shows it from the front, FIG. 43(B) from the side, and FIG. 43(C) from an oblique view.
- the pattern extraction unit 412 detects the minimum points PS and the maximum points PL in the low-frequency component pattern image IM2, calculates the luminance average, and transmits the positions and luminance values of the minimum points PS and the maximum points PL, together with the luminance average, as data (hereinafter referred to as pattern data) D412 to the two-dimensional code conversion unit 413.
- in this manner, the pattern extraction unit 412 can extract the features of the pattern as a pattern by simple calculation.
- according to experimental results, the data size of one minimum point PS or maximum point PL was approximately 40 [bit]. The pattern extraction unit 412 is therefore capable of generating the features of the pattern as pattern data D412 having a small data size.
- the two-dimensional code conversion unit 413 stores the pattern on the original photographic paper OP as the pattern code BC (FIG. 2(A)).
- specifically, the two-dimensional code conversion unit 413 cuts off the decimal part of the supplied pattern data D412, generates the pattern code data D2 by performing two-dimensional barcode conversion processing on the truncated pattern data D412 based on code character string information stored in advance in the information storage memory, and transmits it to the printer unit 5 at a predetermined timing.
- as a result, in the printer unit 5, the pattern code BC based on the pattern code data D2 is printed at a predetermined position on the photographic paper (original photographic paper OP) set on the photographic paper table.
- the existing two-dimensional barcode has a size of about 1 to
- the image reconstruction unit 414 generates the reconstructed low-frequency pattern image RIM from the minimum points PS (PS1 to PSn) and the maximum points PL (PL1 to PLn) in the designated area AR shown in FIG. 45.
- specifically, based on area position information stored in advance in the information storage memory, the image reconstruction unit 414 recognizes, as shown in FIG. 45(A), the positional relationship between the designated area AR (FIG. 2), the minimum points PS (PS1 to PSn) and the maximum points PL (PL1 to PLn) of the supplied pattern code data D2.
- first, the image reconstruction unit 414 executes Voronoi division processing using the recognized minimum points PS and maximum points PL as reference points. That is, as shown in FIG. 46, the image reconstruction unit 414 divides the designated area AR (FIG. 2) into a plurality of small areas by assigning every pixel other than the reference points (the black circles and black triangles in FIG. 45) to the reference point closest to that pixel.
- the image reconstruction unit 414 then generates the reconstructed low-frequency pattern image RIM (FIG. 45(B)) by determining the luminance state in each of the small areas of the designated area AR (FIG. 2) generated by the Voronoi division processing, based on the luminance information of the pattern code data D2 (the luminance values of the minimum points PS and the maximum points PL, and the luminance average), and sends it as reconstructed low-frequency pattern image data D414 to the matching unit 415.
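The Voronoi division described above amounts to labelling every pixel with the index of its nearest reference point. A minimal brute-force sketch (the function name and the use of squared Euclidean distance are assumptions; the patent only states that each pixel belongs to the closest reference point):

```python
def voronoi_labels(width, height, points):
    """Assign every pixel of a width x height region to the index of
    the nearest reference point (minimum point PS or maximum point PL),
    producing the small areas of the Voronoi division."""
    labels = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            best, best_d = 0, float("inf")
            for i, (px, py) in enumerate(points):
                d = (x - px) ** 2 + (y - py) ** 2  # squared distance
                if d < best_d:
                    best, best_d = i, d
            labels[y][x] = best
    return labels
```

For large areas a library implementation (e.g. a k-d tree nearest-neighbour query) would replace the inner loop, but the output is the same partition into small areas.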
- specifically, for a small area to be processed (hereinafter referred to as the target small area) SAR, the image reconstruction unit 414 sequentially determines the luminance state of each triangle (indicated by hatching in FIG. 47) formed by a boundary line BDL with a small area adjacent to the target small area SAR (hereinafter referred to as an adjacent small area) NAR and the reference point P of the target small area SAR (that is, one of the minimum points PS1 to PSn or the maximum points PL1 to PLn (FIG. 42)), using the luminance values of both the boundary line BDL and the reference point P.
- that is, when the reference point P of the target small area SAR and the reference point P′ of the adjacent small area NAR forming the boundary line BDL1 are both minimum points PS or both maximum points PL, the image reconstruction unit 414 calculates the average of the luminance values of the reference points P and P′ and the luminance average of the pattern code data D2 as the luminance value m of the boundary line BDL1.
- otherwise, the image reconstruction unit 414 calculates the average of the luminance values of the reference points P and P′ as the luminance value m of the boundary line BDL1.
- then, using the calculated luminance value m of the boundary line BDL1, the image reconstruction unit 414 denotes a pixel to be determined (hereinafter referred to as a target pixel) by x, lets d be the distance between the target pixel x and the reference point P, and lets Vpeak be the luminance value of the reference point P. With the distance PQ between the reference point P and the intersection Q of the extension of the line segment Px with the boundary line BDL1 normalized to "1", the luminance value V(x) of the target pixel x is determined by the following equation (17):

  V(x) = Vpeak - (Vpeak - m) · d²  ...... (17)
- the image reconstruction unit 414 determines each pixel in the triangle indicated by hatching in accordance with equation (17). As a result, as is clear from FIGS. 47(B) and (C), the image reconstruction unit 414 can determine the luminance values so that the luminance changes gently from the reference point P to the boundary line BDL1, and can thus accurately reproduce the pattern image.
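The original equation (17) is garbled in this copy; the form below is reconstructed from the surrounding constraints (luminance Vpeak at the reference point, d = 0, falling to the boundary luminance m at the normalized distance d = 1), so treat it as a sketch rather than the verbatim formula:

```python
def interp_luminance(v_peak, m, d):
    """Equation (17), as reconstructed: luminance of a target pixel at
    normalized distance d from the reference point P (d = 0 at P,
    d = 1 on the boundary line BDL1), where v_peak is the luminance of
    P and m is the boundary-line luminance.

        V(x) = Vpeak - (Vpeak - m) * d**2
    """
    return v_peak - (v_peak - m) * d ** 2
```

The quadratic falloff makes the luminance flat near the reference point and steeper toward the boundary, matching the "gentle" transition described for FIGS. 47(B) and (C).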
- FIGS. 47(B) and 47(C) show the case where the luminance value m of the boundary line BDL1 is "110", the reference point P of the target small area SAR is one of the minimum points PS1, PS2, ..., PSn, and the reference point P′ of the adjacent small area NAR is one of the maximum points PL1, PL2, ..., PLn: FIG. 47(B) shows the luminance state based on equation (17) in a plan view, and FIG. 47(C) shows it three-dimensionally.
- the image reconstruction unit 414 also determines the luminance states of the triangles formed by the boundary lines BDL2 to BDL4 and the reference point P in the same manner as the triangle formed by the boundary line BDL1 and the reference point P.
- in practice, the image reconstruction unit 414 starts this image processing procedure RT in step SP20, proceeds to step SP21 to recognize the positional relationship between the minimum points PS (PS1 to PSn), the maximum points PL (PL1 to PLn) and the designated area AR (FIG. 2) of the pattern code data D2, and proceeds to the next step SP22, where it executes the Voronoi division processing and divides the designated area AR to generate a plurality of small areas.
- in step SP23, it calculates the luminance values m of all the generated boundary lines BDL (FIG. 47) of the small areas, and then proceeds to step SP24, where the luminance state (the luminance value of each pixel) in the target small area SAR is sequentially determined according to equation (17); the process then proceeds to step SP25, where it is judged whether the luminance states of all the small areas in the designated area AR have been determined.
- if a negative result is obtained, the image reconstruction unit 414 returns to step SP24 and repeats the above processing with one of the remaining small areas as the target small area SAR. If, on the other hand, a positive result is obtained, the image reconstruction unit 414 transmits the reconstructed low-frequency pattern image RIM obtained at this time as the reconstructed low-frequency pattern image data D414 to the matching unit 415, and then proceeds to step SP26 to end the image processing procedure RT.
- the image reconstruction unit 414 executes the image reconstruction processing in this manner. FIG. 49, which compares the low-frequency component pattern image IM2 (FIG. 42) with the corresponding reconstructed low-frequency pattern image RIM (FIG. 45(B)), shows that the reconstructed low-frequency pattern image RIM can be accurately reproduced from only the minimum points PS and the maximum points PL.
- the matching unit 415 compares the reconstructed low-frequency pattern image RIM with the low-frequency component pattern image IM2 extracted at this time from the code-added photographic paper XPc (FIG. 2(B)).
- specifically, the matching unit 415 applies a predetermined cross-correlation process to the reconstructed low-frequency pattern image data D414 supplied from the image reconstruction unit 414 and the low-frequency component pattern image data D411 supplied from the scanner unit 4 via the low-frequency component extraction unit 411, and calculates the match rate between them.
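The patent only says a "predetermined cross-correlation process" yields a match rate; one plausible concrete form, shown here purely as an assumption, is the zero-mean normalized cross-correlation over the flattened luminance values, which gives 1.0 for a perfect match and lower values as the images diverge.

```python
import math

def match_rate(a, b):
    """Zero-mean normalized cross-correlation of two equal-length
    luminance sequences (e.g. flattened images); 1.0 = perfect match,
    -1.0 = perfect inversion. An illustrative stand-in for the
    patent's unspecified cross-correlation process."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db)
```

Copying would then be permitted only when `match_rate(...)` exceeds the predetermined threshold described below.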
- when the match rate is lower than a predetermined threshold, the collating unit 415 determines that copying of the code-added photographic paper XPc (FIG. 2) placed on the mounting table of the scanner unit 4 at this time is prohibited, and notifies the user that copying is prohibited via the display unit (not shown) of the scanner unit 4.
- otherwise, the collating unit 415 determines that the code-added photographic paper XPc (FIG. 2) placed on the mounting table of the scanner unit 4 at this time is the valid original photographic paper OP, generates a copy permission command COM (FIG. 5), and sends it to the scanner unit 4 (FIG. 4).
- in this case, the scanner unit 4 executes the reflection mode, and the print content of the code-added photographic paper XPc (original photographic paper OP) placed on the platen at this time is sent as the print content image data D4 to the printer unit 5; as a result, the print content of the original photographic paper OP (FIG. 2(A)) is copied onto paper in the printer unit 5.
- in this way, the matching unit 415 executes the matching processing and permits duplication of the original photographic paper OP (FIG. 2(A)) only when the match rate obtained as the matching result is higher than the predetermined threshold.
- since the control unit 2 captures the pattern image as a whole and extracts the pattern without subdividing the pattern image, the processing load can be remarkably reduced as compared with the first embodiment.
- in the above configuration, the unauthorized copy preventing device 1 extracts a pattern (the features of the pattern) from the pattern image included in the original photographic paper OP (FIG. 2(A)), and stores this pattern on the original photographic paper OP as authentication target information.
- when copying the print content of the code-added photographic paper XPc, the unauthorized copy preventing device 1 verifies, based on the pattern stored on the code-added photographic paper XPc, whether or not it is the valid original photographic paper OP.
- accordingly, since whether or not the paper is the original can be identified by the pattern of the photographic paper itself, the unauthorized copy preventing device 1 can easily prevent unauthorized duplication without using special paper or the like.
- also, the owner of the original photographic paper OP can have the original photographic paper OP copied without worrying that the resulting duplicate photographic paper will itself be illegally duplicated.
- according to the above configuration, a pattern (the feature amount of the pattern) extracted from the pattern image included in the original photographic paper OP is stored on the original photographic paper OP, and when the print content of the code-added photographic paper XPc is copied, the validity of the original photographic paper OP is verified based on the pattern stored on the code-added photographic paper XPc. Since whether or not the paper is the original can thus be identified by the pattern of the photographic paper itself, unauthorized duplication can be easily prevented without using special paper or the like, and the print content can therefore be easily protected.
- in the above-described embodiments, a pattern is imaged by the imaging unit (scanner unit 4) serving as acquiring means for acquiring pattern information based on the pattern of the photographic paper, and the pattern is obtained from the captured pattern image; however, the present invention is not limited to this.
- for example, information about the pattern may be obtained by electrophoresis after the pattern is made visible by a chemical.
- in the above-described embodiments, the scanner unit 4 that executes the transmission mode, the reflection mode, and the code reading mode is applied as the imaging unit that captures an image of the pattern included in the photographic paper; however, the present invention is not limited to this. The present invention can also be applied to various other imaging units that irradiate photographic paper with light and generate a pattern image signal from the transmitted light via a solid-state imaging device.
- alternatively, the area division processing and the pattern extraction processing may be executed without executing the low-frequency component processing or the image separation processing; conversely, only the low-frequency component processing and the image separation processing may be performed and the resulting white component pattern image WIM (FIG. 7(B)) and black component pattern image BIM (FIG. 7(C)) used as the pattern information, or only the low-frequency component processing may be performed and the resulting low-frequency component pattern image IM2 (FIG. 7(A)) extracted as the pattern information.
- further, the pattern image picked up by the imaging means may be divided into, for example, 5 × 5 image areas, and the pattern information extracted from one of the divided areas; in this case, the feature amount may be extracted after the pattern included in the extracted area is divided in the same manner as the area division processing in the area division unit 13.
- in the above-described embodiments, the content of the pattern extraction processing for the lumps is switched according to the area of the lumps; however, it may be switched according to various other criteria, such as the amount of edge pixels.
- in that case, the low-frequency component processing can be omitted.
- in the above-described embodiments, a value obtained by approximating the shape of a lump to a rectangle, an ellipse, or a circle is extracted; however, a value approximated to a shape other than these may be extracted.
- in the above-described embodiments, the features of the pattern image are extracted from the designated area AR (FIG. 2); however, they may be extracted from a plurality of designated areas, or from the entire pattern image.
- in the above-described embodiments, the minimum points PS (FIG. 42), the maximum points PL (FIG. 42) and the luminance average are extracted; however, only the minimum points PS or only the maximum points PL may be extracted, or only the minimum points PS and the maximum points PL, or predetermined pixels having various other luminance values may be extracted as the minimum points PS and the maximum points PL.
- in the above-described embodiments, white pixels and black pixels are set so that the number of each is 20% of all the pixels in the low-frequency component pattern image IM2 (FIG. 7(A)) and separated as the white component pattern image WIM and the black component pattern image BIM; however, the present invention is not limited to this.
- for example, a luminance value at the center of the luminance range in the luminance histogram may be determined, and pixels having luminance values equal to or less than (or equal to or greater than) that luminance value may be separated as the white component pattern image WIM (black component pattern image BIM).
- the central luminance value is, for example, the luminance value with the largest number of pixels, or the luminance value at the center between the two luminance values at which the histogram curve crosses an arbitrary frequency (number of pixels), as shown in FIG. 50(B).
- alternatively, as this image separation processing, the average luminance value of all the pixels in the low-frequency component pattern image IM2 (FIG. 7(A)) may be calculated, and pixels having luminance values equal to or less than (or equal to or greater than) the calculated average luminance value may be separated as the white component pattern image WIM (black component pattern image BIM).
- in short, the same effects as in the above-described embodiments can be obtained as long as the image before separation (the low-frequency component pattern image IM2) is separated into images of the low-luminance component and the high-luminance component (the white component pattern image WIM and the black component pattern image BIM) at a relative area ratio.
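The fixed area-ratio separation described above can be sketched as a percentile threshold: the darkest 20% of pixels form one component and the brightest 20% the other. The function name and return format are illustrative assumptions; which component the text calls "white" versus "black" is left to the patent's own convention.

```python
def separate_by_ratio(image, ratio=0.2):
    """Split a luminance image so that the darkest `ratio` of all
    pixels form one component image and the brightest `ratio` form the
    other, mirroring the relative area-ratio separation described
    above. Returns two lists of (x, y) pixel coordinates."""
    flat = sorted(v for row in image for v in row)
    n = len(flat)
    k = max(1, int(n * ratio))
    low_t, high_t = flat[k - 1], flat[n - k]   # percentile thresholds
    low = [(x, y) for y, row in enumerate(image)
           for x, v in enumerate(row) if v <= low_t]
    high = [(x, y) for y, row in enumerate(image)
            for x, v in enumerate(row) if v >= high_t]
    return low, high
```

Because the thresholds are taken from the sorted luminance values rather than fixed numbers, the two components always cover the same relative area regardless of overall image brightness, which is the property the passage emphasizes.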
- in the above-described embodiments, a plurality of small areas are generated by dividing the designated area AR by Voronoi division using the minimum points PS (FIG. 42) and the maximum points PL (FIG. 42) of the pattern as reference points (FIG. 46), and the luminance state between the reference points P and P′ (FIG. 47(C)) of adjacent areas is determined so as to change smoothly using the luminance values of the reference points; however, the present invention is not limited to this. A plurality of small areas may be generated by various other division methods, and the luminance state of the small areas may be determined by various other methods.
- for example, the designated area AR can be divided using a table in which the positional states of the minimum points PS (FIG. 42) and the maximum points PL (FIG. 42) are associated with division results.
- similarly, instead of determining the luminance state of each small area according to equation (17) so that the luminance between the reference points P and P′ (FIG. 47(C)) of adjacent areas changes gently, it can be determined according to a linear function, or according to a table in which the distance between the reference points P and P′ is associated with the luminance state between them.
- further, in the above-described embodiments, the luminance average of the low-frequency component pattern image IM2 is used as necessary when calculating the luminance value of the boundary line BDL (FIG. 47) between adjacent small areas; however, almost the same effects as in the above embodiments can be obtained without using the luminance average.
- in the above-described embodiments, as the storage means for storing the pattern information on the photographic paper, the pattern (the feature amount of the pattern) is printed on the photographic paper (original photographic paper OP) as the authentication code (two-dimensional barcode) BC (FIG. 2(A)); however, the present invention is not limited to this. Holes or Braille corresponding to the pattern may be provided on the photographic paper, or the pattern (the feature of the pattern) may be directly described on the photographic paper; various other ways of storing the pattern information can be employed.
- in the above-described embodiments, as the collating means for verifying the validity of the photographic paper based on the pattern information stored by the storage means, the pattern is collated using the methods described above with reference to FIGS. 14 to 18, FIG. 26 and FIGS. 30 to 32; however, the present invention is not limited to this, and a collation method suited to the pattern information obtained by the above-described acquiring means can be adopted.
- a program that causes the control unit to execute all or part of the various processes shown in FIG. 5 may be installed in an existing apparatus that handles paper, such as a copying machine, or in a newly manufactured apparatus.
- the present invention can be used when paper serves as various media, such as a commodity exchange medium such as money, a content certification medium such as a certificate, or an information storage medium such as a personal work.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/586,359 US7856143B2 (en) | 2004-01-22 | 2005-01-21 | Unauthorized copy preventing device and method thereof, and program |
JP2005517331A JP4461389B2 (en) | 2004-01-22 | 2005-01-21 | Unauthorized copy prevention apparatus, method and program |
CN200580002880.5A CN1910900B (en) | 2004-01-22 | 2005-01-21 | Unauthorized copy preventing device and method thereof, and program |
EP05704229A EP1708477A4 (en) | 2004-01-22 | 2005-01-21 | Unauthorized copy preventing device and method thereof, and program |
Applications Claiming Priority (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-14494 | 2004-01-22 | ||
JP2004014494 | 2004-01-22 | ||
JP2004-41992 | 2004-02-18 | ||
JP2004041992 | 2004-02-18 | ||
JP2004-55498 | 2004-02-27 | ||
JP2004055498 | 2004-02-27 | ||
JP2004067856 | 2004-03-10 | ||
JP2004-67856 | 2004-03-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005071939A1 true WO2005071939A1 (en) | 2005-08-04 |
Family
ID=34812176
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/001176 WO2005071939A1 (en) | 2004-01-22 | 2005-01-21 | Unauthorized copy preventing device and method thereof, and program |
Country Status (6)
Country | Link |
---|---|
US (1) | US7856143B2 (en) |
EP (1) | EP1708477A4 (en) |
JP (1) | JP4461389B2 (en) |
CN (1) | CN1910900B (en) |
TW (1) | TWI292136B (en) |
WO (1) | WO2005071939A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009508378A (en) * | 2005-09-08 | 2009-02-26 | インジェニア・ホールディングス・(ユー・ケイ)・リミテッド | copy |
WO2013018616A1 (en) * | 2011-07-29 | 2013-02-07 | 日本電気株式会社 | Verification method, tag creating apparatus, verification apparatus, tag, and program |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7162035B1 (en) | 2000-05-24 | 2007-01-09 | Tracer Detection Technology Corp. | Authentication method and system |
JP4530763B2 (en) * | 2004-08-23 | 2010-08-25 | 富士機械製造株式会社 | Part data creation method and part data creation apparatus |
JP4212586B2 (en) * | 2005-11-25 | 2009-01-21 | シャープ株式会社 | Image reading apparatus, image forming apparatus, image processing system, and image reading method |
JP4720815B2 (en) * | 2007-11-07 | 2011-07-13 | 富士ゼロックス株式会社 | Image forming system, image forming apparatus, image forming program, and pattern information collating apparatus |
US7995196B1 (en) | 2008-04-23 | 2011-08-09 | Tracer Detection Technology Corp. | Authentication method and system |
EP2433244A4 (en) * | 2009-05-21 | 2012-11-14 | Hewlett Packard Development Co | Imaging a print aberration |
US9749607B2 (en) | 2009-07-16 | 2017-08-29 | Digimarc Corporation | Coordinated illumination and image signal capture for enhanced signal detection |
JP5854802B2 (en) * | 2011-12-01 | 2016-02-09 | キヤノン株式会社 | Image processing apparatus, image processing method, and computer program |
CN105069893B (en) * | 2015-08-17 | 2017-09-29 | 深圳怡化电脑股份有限公司 | A kind of method and device for detecting bank note |
TWI601406B (en) * | 2016-07-06 | 2017-10-01 | 虹光精密工業股份有限公司 | Image processing device, copy apparatus and method generating generation-count information |
US11062108B2 (en) | 2017-11-07 | 2021-07-13 | Digimarc Corporation | Generating and reading optical codes with variable density to adapt for visual quality and reliability |
US10896307B2 (en) | 2017-11-07 | 2021-01-19 | Digimarc Corporation | Generating and reading optical codes with variable density to adapt for visual quality and reliability |
US10872392B2 (en) | 2017-11-07 | 2020-12-22 | Digimarc Corporation | Generating artistic designs encoded with robust, machine-readable data |
US20190213705A1 (en) | 2017-12-08 | 2019-07-11 | Digimarc Corporation | Artwork generated to convey digital messages, and methods/apparatuses for generating such artwork |
US10748232B2 (en) | 2018-06-08 | 2020-08-18 | Digimarc Corporation | Generating signal bearing art using stipple, voronoi and delaunay methods and reading same |
CN109151423B (en) * | 2018-10-31 | 2021-03-30 | 歌尔光学科技有限公司 | Projector, projector discrimination method, projector discrimination device, information adding method, and storage medium |
CN110802957B (en) * | 2019-10-11 | 2021-11-02 | 杭州珐珞斯科技有限公司 | Printing quantity control method and system for printing equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001309157A (en) * | 2000-04-26 | 2001-11-02 | Ntt Data Corp | Document authentication method, system, document generator, document authentication device and recording medium |
JP2001319257A (en) * | 2000-05-12 | 2001-11-16 | Printing Bureau Ministry Of Finance | Printed matter authenticating device |
JP2003044257A (en) * | 2001-08-02 | 2003-02-14 | Dainippon Printing Co Ltd | Printed matter, printing system and reader |
JP2003319170A (en) * | 2002-02-01 | 2003-11-07 | Markany Inc | Apparatus and method for producing document to prevent its forgery or alteration, and apparatus and method for authenticating document |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020170966A1 (en) * | 1995-07-27 | 2002-11-21 | Hannigan Brett T. | Identification document including embedded data |
US6650761B1 (en) * | 1999-05-19 | 2003-11-18 | Digimarc Corporation | Watermarked business cards and methods |
US5974150A (en) * | 1997-09-30 | 1999-10-26 | Tracer Detection Technology Corp. | System and method for authentication of goods |
AU2001240968A1 (en) * | 2000-03-14 | 2001-09-24 | Dexrad (Pty) Ltd | Generating a non-reproducible printed image |
US7028188B1 (en) | 2000-10-30 | 2006-04-11 | Hewlett-Packard Development Company, L.P. | Document authentication using the physical characteristics of underlying physical media |
JP2003168084A (en) * | 2001-11-30 | 2003-06-13 | Sanyo Electric Co Ltd | Personal identification system and method himself/ herself |
JP2003168071A (en) * | 2001-11-30 | 2003-06-13 | Sanyo Electric Co Ltd | Method for reading two-dimensional bar code |
US7054461B2 (en) * | 2002-02-15 | 2006-05-30 | Pitney Bowes Inc. | Authenticating printed objects using digital watermarks associated with multidimensional quality metrics |
US20030210803A1 (en) | 2002-03-29 | 2003-11-13 | Canon Kabushiki Kaisha | Image processing apparatus and method |
JP4265180B2 (en) | 2002-09-09 | 2009-05-20 | 富士ゼロックス株式会社 | Paper identification verification device |
- 2005-01-21 EP EP05704229A patent/EP1708477A4/en not_active Withdrawn
- 2005-01-21 US US10/586,359 patent/US7856143B2/en not_active Expired - Fee Related
- 2005-01-21 TW TW094101847A patent/TWI292136B/en not_active IP Right Cessation
- 2005-01-21 JP JP2005517331A patent/JP4461389B2/en not_active Expired - Fee Related
- 2005-01-21 WO PCT/JP2005/001176 patent/WO2005071939A1/en active Application Filing
- 2005-01-21 CN CN200580002880.5A patent/CN1910900B/en not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
See also references of EP1708477A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009508378A (en) * | 2005-09-08 | 2009-02-26 | インジェニア・ホールディングス・(ユー・ケイ)・リミテッド | copy |
WO2013018616A1 (en) * | 2011-07-29 | 2013-02-07 | 日本電気株式会社 | Verification method, tag creating apparatus, verification apparatus, tag, and program |
Also Published As
Publication number | Publication date |
---|---|
EP1708477A1 (en) | 2006-10-04 |
EP1708477A4 (en) | 2008-04-09 |
CN1910900B (en) | 2011-05-25 |
CN1910900A (en) | 2007-02-07 |
JPWO2005071939A1 (en) | 2007-09-06 |
US20070160401A1 (en) | 2007-07-12 |
TW200539069A (en) | 2005-12-01 |
TWI292136B (en) | 2008-01-01 |
US7856143B2 (en) | 2010-12-21 |
JP4461389B2 (en) | 2010-05-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005071939A1 (en) | Unauthorized copy preventing device and method thereof, and program | |
KR20200032206A (en) | Face recognition unlocking method and device, device, medium | |
JP2000333005A (en) | Pattern detection method, pattern detection device and recording medium | |
KR20200118842A (en) | Identity authentication method and device, electronic device and storage medium | |
JP2011507101A (en) | Identification and verification of unknown documents by eigenimage processing | |
US20070041628A1 (en) | Detection of document security marks using run profiles | |
US20100239128A1 (en) | Registering device, checking device, program, and data structure | |
CN109840875A (en) | A kind of anti-counterfei waterprint generation method, device, electronic equipment and storage medium | |
CN106169064A (en) | The image-recognizing method of a kind of reality enhancing system and system | |
JP2004112223A (en) | Id card, id card generating apparatus, and id card reader | |
JP7101258B2 (en) | 2D barcode generation method, authentication method, server, and 2D barcode | |
Ferrara et al. | On the impact of alterations on face photo recognition accuracy | |
JP2000163595A (en) | Mark detecting method and device | |
JP2006338330A (en) | Device and method for identifying slip of paper | |
US7961941B2 (en) | Color form dropout using dynamic geometric solid thresholding | |
US20190102617A1 (en) | System and method of training a classifier for determining the category of a document | |
CN111611994B (en) | Image extraction method, device, electronic equipment and storage medium | |
KR20210024877A (en) | Method and apparatus for determining liveness | |
CN115391751A (en) | Infringement determination method | |
JPH08287259A (en) | Fingerprint identifying method | |
JP2006338548A (en) | Printing paper sheet management system, printing paper sheet registration device, method, and program, printing paper sheet discrimination device, method and program | |
CN112597810A (en) | Identity document authentication method and system | |
JP2020184735A (en) | Verification method | |
US11872832B2 (en) | Texture-based authentication of digital identity documents | |
JP4446185B2 (en) | Information generating apparatus and method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005517331 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005704229 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007160401 Country of ref document: US Ref document number: 10586359 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200580002880.5 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 2005704229 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 10586359 Country of ref document: US |