JP5381069B2 - Information processing apparatus and program - Google Patents

Information processing apparatus and program

Info

Publication number
JP5381069B2
Authority
JP
Japan
Prior art keywords
image
comparison
basic
element
determined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2008316888A
Other languages
Japanese (ja)
Other versions
JP2010140313A (en)
Inventor
Tetsuya Kimura (木村 哲也)
Kensuke Ito (伊藤 健介)
Original Assignee
Fuji Xerox Co., Ltd. (富士ゼロックス株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuji Xerox Co., Ltd.
Priority to JP2008316888A
Publication of JP2010140313A
Application granted
Publication of JP5381069B2
Application status: Active
Anticipated expiration


Description

  The present invention relates to an information processing apparatus and a program.

  There are devices that determine the authenticity of an individual object, such as a paper document, from random features distributed across its surface.

For example, in one technique an identifier that identifies an individual object is stored in association with an image of that object, and authenticity is determined by checking whether an image of the object to be verified, such as a paper document, corresponds to the stored image associated with the identifier printed on the object. Further, for example, Patent Document 1 discloses a technique for determining the authenticity of an individual object based on a correlation value between an image and an image compared with it.
JP 2005-38389 A

  An object of the present invention is to provide an information processing apparatus and a program that efficiently determine whether or not images correspond.

  The invention according to claim 1 is an information processing apparatus comprising: comparison image acquisition means for acquiring a comparison image to be compared with a basic image; image element correspondence determination means for determining whether or not a comparison feature value, calculated based on a comparison image element in the comparison image, corresponds to a basic feature value calculated based on the basic image element in the basic image that corresponds to the comparison image element; and image correspondence determination means for determining whether or not the basic image and the comparison image correspond, based on the determination results produced by the image element correspondence determination means for each of at least one comparison image element whose positions in the comparison image differ from one another.

  The invention according to claim 2 is the information processing apparatus according to claim 1, wherein the image element correspondence determination means determines whether or not the comparison feature value, calculated by binarizing the comparison image element, corresponds to the basic feature value calculated by binarizing the basic image element.

  The invention according to claim 3 is the information processing apparatus according to claim 1 or 2, wherein the image correspondence determination means determines whether or not the basic image and the comparison image correspond based on at least one of the number of times the image element correspondence determination means determined that the comparison feature value corresponds to the basic feature value and the number of times it determined that they do not correspond.

  The invention according to claim 4 is the information processing apparatus according to any one of claims 1 to 3, wherein the image element correspondence determination means determines whether or not the comparison feature value calculated based on the comparison image element corresponds to the basic feature value calculated based on the basic image element whose position in the basic image corresponds to the position of the comparison image element in the comparison image.

  The invention according to claim 5 is the information processing apparatus according to any one of claims 1 to 4, wherein the image element correspondence determination means determines whether or not the comparison feature value, calculated based on a comparison image element in a comparison region that is at least a partial region of the comparison image, corresponds to the basic feature value calculated based on the basic image element whose position in a basic region, at least a partial region of the basic image, corresponds to the position of the comparison image element in the comparison region; and the image correspondence determination means determines whether or not the basic image and the comparison image correspond based on the determination results produced by the image element correspondence determination means for each of at least one comparison image element whose positions in the comparison region differ from one another.

  The invention according to claim 6 is the information processing apparatus according to claim 5, further comprising determination result information generation means for generating determination result information that indicates, in association with positional relationship information indicating the positional relationship between the comparison region and the basic region, the determination results produced by the image element correspondence determination means for each of at least one comparison image element whose positions in the comparison region differ from one another, wherein the image correspondence determination means determines whether or not the basic image and the comparison image correspond based on a plurality of pieces of determination result information whose associated positional relationships differ.

  The invention according to claim 7 is the information processing apparatus according to any one of claims 1 to 6, wherein, when the image correspondence determination means determines, based on the determination results produced by the image element correspondence determination means for each of at least one comparison image element whose positions in the comparison image differ from one another, that the basic image and the comparison image correspond, it further verifies the basic image against the comparison image to determine whether or not they correspond.

  The invention according to claim 8 is a program for causing a computer to function as: comparison image acquisition means for acquiring a comparison image to be compared with a basic image; image element correspondence determination means for determining whether or not a comparison feature value calculated based on a comparison image element in the comparison image corresponds to a basic feature value calculated based on the basic image element in the basic image that corresponds to the comparison image element; and image correspondence determination means for determining whether or not the basic image and the comparison image correspond based on the determination results produced by the image element correspondence determination means for each of at least one comparison image element whose positions in the comparison image differ from one another.

  According to the inventions of claims 1 and 8, whether or not images correspond is determined more efficiently than in the case where the present configuration is not provided.

  According to the invention of claim 2, whether or not images correspond is determined even more efficiently than in the case where the present configuration is not provided.

  According to the invention of claim 3, whether or not images correspond is determined easily, using the counts, compared with the case where the present configuration is not provided.

  According to the invention of claim 4, whether or not images correspond is determined by comparing feature values that correspond in position within the images.

  According to the invention of claim 5, whether or not images correspond is determined by comparing feature values that correspond in position within at least partial regions of the images.

  According to the invention of claim 6, whether or not images correspond is determined based on a plurality of pieces of determination result information for different positional relationships between the comparison region and the basic region.

  According to the invention of claim 7, whether or not images correspond is determined more strictly than in the case where the present configuration is not provided.

  Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.

  FIG. 1 is a diagram illustrating an example of the hardware configuration of the information processing apparatus 10 according to the present embodiment. As illustrated in FIG. 1, the information processing apparatus 10 according to the present embodiment includes a control unit 12, a storage unit 14, a user interface (UI) unit 16, and a scanner unit 18. These elements are connected via a bus or the like.

  The control unit 12 is a program control device such as a CPU, and operates according to a program installed in the information processing apparatus 10. The storage unit 14 is a storage element such as a ROM or a RAM, a hard disk, or the like. The storage unit 14 stores a program executed by the control unit 12. The storage unit 14 also operates as a work memory for the control unit 12. The UI unit 16 is a display, a microphone, a speaker, a button, and the like, and outputs the content of the operation performed by the user and the voice input by the user to the control unit 12. In addition, the UI unit 16 displays and outputs information according to an instruction input from the control unit 12. The scanner unit 18 optically reads an image formed on a paper medium or the like, and outputs it as image information to the control unit 12.

  FIG. 2 is a functional block diagram illustrating an example of functions realized by the information processing apparatus 10 according to the present embodiment.

  As illustrated in FIG. 2, the information processing apparatus 10 functionally includes a basic information storage unit 20, a basic information generation unit 22, a comparison image acquisition unit 24, an image element correspondence determination unit 26, a determination result information generation unit 28, and an image correspondence determination unit 30. The basic information storage unit 20 is realized mainly by the storage unit 14. The other elements are realized mainly by the control unit 12.

  These elements are realized by the control unit 12 executing a program installed in the information processing apparatus 10 that is a computer. The program is supplied to the information processing apparatus 10 via a computer-readable information transmission medium such as a CD-ROM or DVD-ROM or via a communication network such as the Internet.

  The basic information storage unit 20 stores basic information including a basic image to be compared with a comparative image, which will be described later, and basic feature information 32 based on the basic image (see FIG. 3). FIG. 3 shows an example of the data structure of the basic feature information 32. The basic information generation unit 22 acquires a basic image and generates basic information based on the basic image. An example of the flow of basic information generation processing for generating basic information by the basic information generation unit 22 will be described with reference to the flowchart illustrated in FIG.

  First, the basic information generation unit 22 acquires a basic image obtained by optically reading a paper medium or the like with the scanner unit 18 (S101). In the present embodiment, the basic image may be an image corresponding to the entire paper medium, or an image of an area at a predetermined position in the paper medium.

  Then, the basic information generation unit 22 determines, as necessary, whether or not each pixel included in the basic image is noise, and converts the image density (for example, the RGB value) of any pixel determined to be noise according to a predetermined rule (S102). Specifically, for example, the basic information generation unit 22 measures the image density of each pixel and calculates the average value m and the standard deviation σ of the image densities. It then treats as noise any pixel whose image density deviates from the average value m by more than three times the standard deviation σ (that is, an image density greater than m + 3σ or less than m − 3σ), and converts its image density to, for example, the bound three standard deviations from the average (m + 3σ or m − 3σ) or to the average image density of the surrounding pixels. Note that the basic information generation unit 22 does not necessarily have to execute the process exemplified in S102.
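The 3σ conversion in S102 can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes a grayscale image held as a NumPy array and clamps outliers to the m ± 3σ bound; the alternative of substituting the average of surrounding pixels is not shown.

```python
import numpy as np

def suppress_noise(image, k=3.0):
    """Treat pixels whose density deviates from the mean by more than
    k standard deviations as noise, and clamp them to the m +/- k*sigma
    bound, as in the S102 conversion step."""
    m = float(image.mean())
    s = float(image.std())
    return np.clip(image, m - k * s, m + k * s)
```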

  Then, the basic information generation unit 22 divides at least a part of the basic image (for example, an image in a basic region whose position in the basic image is predetermined) into a plurality of basic image elements (S103). In the present processing example, the basic image element corresponds to a pixel included in the basic region. Note that the basic image element may include a plurality of pixels. In addition, the basic information generation unit 22 may divide the entire basic image into a plurality of basic image elements. That is, the entire basic image may correspond to the basic region.

  Then, the basic information generation unit 22 generates basic feature information 32 including at least one basic feature value 34 based on the basic image (S104). Specifically, for example, for each basic image element, the basic information generation unit 22 calculates the basic feature value 34 corresponding to that element based on a comparison between an image density corresponding to the element (for example, the image density of the pixel corresponding to the element, or the average image density of the plurality of pixels corresponding to the element) and a predetermined image density. For example, the basic information generation unit 22 sets the basic feature value 34 corresponding to the basic image element to “1” if the image density of the pixel is equal to or higher than the predetermined image density, and to “0” if it is lower. In this way, the basic information generation unit 22 may calculate the basic feature value 34 by binarizing each pixel of the basic image. Since a basic feature value 34 calculated by binarization carries 1 bit of information, when the image density of the pixel corresponding to a basic image element is, for example, 8 bits (1 byte), the basic feature value 34 carries 1/8 of the information of that image density.
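The binarization in S104 amounts to a per-pixel threshold. A minimal sketch, again assuming a NumPy grayscale array; the threshold value 128 is an arbitrary stand-in for the "predetermined image density":

```python
import numpy as np

def binarize(image, threshold=128):
    """Compute a 1-bit basic feature value per pixel: 1 if the image
    density is at or above the predetermined density, else 0 (S104)."""
    return (image >= threshold).astype(np.uint8)
```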

  As illustrated in FIG. 3, the basic feature information 32 includes, for example, a plurality of combinations of basic position information 36 and basic feature values 34. The basic position information 36 indicates, for example, the position of the basic image element in the basic region. For example, the value (coordinate value) of the basic position information 36 corresponding to the basic image element that is mth from the left and nth from the top in the basic region is (m, n). The basic feature information 32 may be a binarized image 38 as illustrated in FIG. 5.
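The pairing of basic position information 36 with basic feature values 34 can be sketched as a mapping from coordinates to bits. The function name and dict representation are illustrative choices, not the patent's data structure:

```python
def feature_info(bits):
    """Pair each basic position (m, n) -- m-th from the left, n-th
    from the top -- with its 1-bit feature value, mirroring the
    combinations of basic position information and feature values."""
    return {(m, n): int(bits[n][m])
            for n in range(len(bits))
            for m in range(len(bits[0]))}
```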

  Then, basic information including the basic feature information 32 generated in the process illustrated in S104 and the converted basic image generated in the process illustrated in S102 is generated and output to the basic information storage unit 20 (S105).

  In the present embodiment, for example, the basic information generation unit 22 generates, for each of a plurality of different basic images, basic information corresponding to that basic image and outputs it to the basic information storage unit 20.

  The basic information may be stored in advance in the basic information storage unit 20.

  The comparison image acquisition unit 24 acquires a comparison image to be compared with the basic image (for example, a comparison image obtained by optically reading a paper medium or the like with the scanner unit 18). The comparison image may be an image corresponding to the entire paper medium, or an image of an area at a predetermined position in the paper medium.

  The image element correspondence determination unit 26 determines whether or not the comparison feature value 40, calculated based on a comparison image element in the comparison image, corresponds to the basic feature value 34 calculated based on the basic image element in the basic image that corresponds to the comparison image element (see FIG. 6). FIG. 6 is a diagram illustrating an example of the data structure of the comparison feature information 42 including the comparison feature values 40. Details of the comparison feature information 42 will be described later.

  The determination result information generation unit 28 generates determination result information indicating the determination result determined by the image element correspondence determination unit 26.

  The image correspondence determination unit 30 determines whether or not the basic image and the comparison image correspond based on the determination results produced by the image element correspondence determination unit 26 (for example, the determination result information generated by the determination result information generation unit 28) for each of at least one comparison image element whose positions in the comparison image differ.

  Here, an example of the flow of the image correspondence determination processing performed in the present embodiment by the comparison image acquisition unit 24, the image element correspondence determination unit 26, the determination result information generation unit 28, and the image correspondence determination unit 30 will be described with reference to the flowchart illustrated in FIG. 7.

  First, the comparative image acquisition unit 24 acquires a comparative image (S201). Then, the image element correspondence determination unit 26 converts the image density of the pixels included in the comparison image as necessary by the same processing as in S102 described above (S202).

  Then, the image element correspondence determination unit 26 divides at least a part of the comparison image (for example, an image in a comparison region whose position in the comparison image is predetermined) into a plurality of comparison image elements (S203). In this processing example, the comparison image element corresponds to a pixel included in the comparison area. Note that the comparative image element may include a plurality of pixels. Further, the image element correspondence determination unit 26 may divide the entire comparison image into a plurality of comparison image elements. That is, the entire comparison image may correspond to the comparison area. In this processing example, the number of comparison image elements corresponds to (for example, matches) the number of basic image elements.

  Then, the image element correspondence determination unit 26 generates comparison feature information 42 including at least one comparison feature value 40 based on the comparison image (S204). More specifically, for each comparison image element, the image element correspondence determination unit 26 calculates, by the same process as in S104 described above, a comparison feature value 40 corresponding to that element (for example, a comparison feature value 40 indicating “1” or “0”) based on a comparison between an image density corresponding to the element (for example, the image density of the pixel corresponding to the element, or the average image density of the plurality of pixels corresponding to the element) and a predetermined image density. The image element correspondence determination unit 26 may thus generate comparison feature information 42 including comparison feature values 40 calculated by binarizing the comparison image elements.

  As illustrated in FIG. 6, the comparison feature information 42 includes, for example, a plurality of combinations of comparison position information 44 and comparison feature values 40. The comparison position information 44 indicates, for example, the position of the comparison image element in the comparison region. For example, the value (coordinate value) of the comparison position information 44 corresponding to the comparison image element that is mth from the left and nth from the top in the comparison region is (m, n). The comparison feature information 42 may be a binarized image 38 as exemplified in FIG. 5.

  Then, the image element correspondence determination unit 26 compares, for each combination of mutually corresponding basic image element and comparison image element, the basic feature value 34 corresponding to the basic image element with the comparison feature value 40 corresponding to the comparison image element, and determines whether or not these values correspond (for example, match) (S205).

  At this time, the image element correspondence determination unit 26 may determine, for example, whether or not the comparison feature value 40 corresponding to each comparison image element corresponds to the basic feature value 34 corresponding to the basic image element whose position (for example, coordinates) in the basic region (or in the basic image) corresponds to the position (for example, coordinates) of the comparison image element in the comparison region (or in the comparison image). That is, for each combination of a basic feature value 34 and the comparison feature value 40 associated with the comparison position information 44 corresponding to the basic position information 36 associated with that basic feature value 34, it may be determined whether or not the two values correspond (for example, match).
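The position-by-position comparison of S205 can be sketched as an element-wise equality test on two binarized arrays of the same shape (an assumption for this illustration; the patent also covers the case where the element counts differ, handled later by shifting):

```python
import numpy as np

def element_matches(basic_bits, comparison_bits):
    """For each coordinate, determine whether the basic feature value
    and the comparison feature value at the corresponding position
    match (S205); returns a boolean map of per-element results."""
    assert basic_bits.shape == comparison_bits.shape
    return basic_bits == comparison_bits
```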

  Then, the determination result information generation unit 28 generates determination result information indicating the determination results by the image element correspondence determination unit 26 (S206). In the present embodiment, specifically, for example, determination result information indicating the number of times the image element correspondence determination unit 26 determined that a comparison feature value 40 and a basic feature value 34 correspond (that is, the number of corresponding combinations of comparison feature values 40 and basic feature values 34) is generated.

  Then, the image correspondence determination unit 30 determines whether or not the comparison image and the basic image correspond based on, for example, the determination result information generated by the process exemplified in S206 (S207). Specifically, for example, the image correspondence determination unit 30 determines that the comparison image and the basic image correspond if the count indicated by the determination result information is equal to or greater than a predetermined threshold. Note that the image correspondence determination unit 30 may instead determine whether or not the basic image and the comparison image correspond based on the number of times the image element correspondence determination unit 26 determined that the comparison feature value 40 and the basic feature value 34 do not correspond, or based on both the number of times they were determined to correspond and the number of times they were determined not to correspond.
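The S206–S207 decision then reduces to counting matches and comparing against the predetermined threshold. A minimal sketch over the boolean map from the previous step; the threshold value is application-specific:

```python
import numpy as np

def images_correspond(match_map, threshold):
    """Decide that the comparison image and the basic image correspond
    if the number of matching elements reaches the threshold (S207)."""
    return int(np.count_nonzero(match_map)) >= threshold
```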

  The image correspondence determination unit 30 may generate image correspondence information indicating whether or not the basic image and the comparison image correspond, and output it to the storage unit 14. The image correspondence determination unit 30 may also generate an image indicating, for each position in the basic image, whether or not the basic feature value 34 and the comparison feature value 40 correspond, and output it to the UI unit 16 such as a display.

  Note that the flow of image correspondence determination processing is not limited to the flow of processing illustrated in the above-described processing example.

  Specifically, for example, the number of basic image elements and the number of comparison image elements may differ. In that case, the determination result information generation unit 28 may generate determination result information associated with positional relationship information indicating the relationship between the position of a basic image element in the basic region and the position of the corresponding comparison image element in the comparison region. The image correspondence determination unit 30 may then determine whether or not the basic image and the comparison image correspond based on a plurality of pieces of determination result information whose associated positional relationships differ. Specifically, for example, the processes exemplified in S205 to S206 may be repeated a plurality of times while changing the correspondence between basic image elements and comparison image elements.

  More specifically, for example, when the number of comparison image elements in the comparison region (for example, q elements vertically × q elements horizontally) is larger than the number of basic image elements in the basic region (for example, p elements vertically × p elements horizontally), the control unit 12 may repeatedly execute the processes exemplified in S205 to S206 while shifting, one step at a time, the position (x, y) of the comparison image element associated with a specific basic image element (for example, the upper-left basic image element) over the range 0 ≤ x ≤ q − p, 0 ≤ y ≤ q − p.
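The shifting procedure can be sketched as sliding the p × p basic region across the q × q comparison region and recording one match count per offset. Square regions and NumPy arrays are simplifying assumptions for this illustration:

```python
import numpy as np

def match_counts_per_offset(basic_bits, comparison_bits):
    """Shift the p x p basic region across the q x q comparison region
    (0 <= x <= q - p, 0 <= y <= q - p) and record the match count at
    each offset, yielding (q - p + 1)**2 determination results."""
    p = basic_bits.shape[0]
    q = comparison_bits.shape[0]
    counts = {}
    for y in range(q - p + 1):
        for x in range(q - p + 1):
            window = comparison_bits[y:y + p, x:x + p]
            counts[(x, y)] = int(np.count_nonzero(window == basic_bits))
    return counts
```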

At this time, in the process illustrated in S206 described above, the determination result information generation unit 28 may generate a plurality of (for example, (q − p + 1)²) pieces of determination result information, each associated with positional relationship information indicating the value (x, y).

Then, the image correspondence determination unit 30 may determine whether or not the basic image and the comparison image correspond based on the plurality of (for example, (q − p + 1)²) pieces of determination result information. For example, the image correspondence determination unit 30 may determine that the comparison image and the basic image correspond if the maximum of the counts indicated by the pieces of determination result information is equal to or greater than a predetermined threshold.

  Further, the image correspondence determination unit 30 may determine whether or not the comparison image and the basic image correspond based on a normalized score calculated by, for example, the formula (maximum of the counts − average of the counts) / (standard deviation of the counts).
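The normalized score can be sketched directly from the per-offset counts produced by the shifting step (the function name and dict input are illustrative):

```python
import numpy as np

def normalized_score(counts):
    """Normalized score over the per-offset match counts:
    (max count - mean count) / (standard deviation of counts)."""
    vals = np.array(list(counts.values()), dtype=float)
    return (vals.max() - vals.mean()) / vals.std()
```

A sharp, isolated peak among the counts yields a high score, which distinguishes a genuine positional match from uniformly mediocre agreement.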

  In addition, for example, as illustrated in FIG. 8, the image correspondence determination unit 30 may determine that the comparison image and the basic image correspond if the combination of the normalized score and the number of combinations in which the comparison feature value 40 and the basic feature value 34 were determined to correspond falls within a predetermined correspondence recognition range 46.

  In addition, when the image correspondence determination unit 30 determines by the processing according to the above-described processing example that the basic image and the comparison image correspond, it may determine more strictly whether or not they correspond by further executing a matching process between the basic image and the comparison image.

  The present invention is not limited to the above embodiment.

  For example, when a plurality of pieces of basic information are stored in the basic information storage unit 20, the processes exemplified in S201 to S207 described above may be executed sequentially for each piece of basic information to select the basic image that corresponds to the comparison image.

  Further, for example, a printer or the like may print an image indicating the comparison feature information 42 and the determination result information on a paper medium from which the comparison image is acquired.

  The basic image and the comparison image may be color images or black-and-white images. Further, for example, the information processing apparatus 10 may be configured from a single casing or from a plurality of casings. In addition, an external device connected to the information processing apparatus 10 via a network may store the basic information. Further, the above-described processing may be executed using a basic image or a comparison image received from an external device connected to the information processing apparatus 10 via a network. The specific numerical values described in this specification are illustrative, and the invention is not limited to these values.

Brief description of the drawings

FIG. 1 is a diagram showing an example of the hardware configuration of an information processing apparatus according to an embodiment of the present invention.
FIG. 2 is a functional block diagram showing an example of the functions realized by the information processing apparatus according to the embodiment.
FIG. 3 is a diagram showing an example of the data structure of basic feature information.
FIG. 4 is a flowchart showing an example of the flow of the basic information generation processing executed by the information processing apparatus according to the embodiment.
FIG. 5 is a diagram showing an example of a binarized image.
FIG. 6 is a diagram showing an example of the data structure of comparison feature information.
FIG. 7 is a flowchart showing an example of the flow of the image correspondence determination processing executed by the information processing apparatus according to the embodiment.
FIG. 8 is a diagram showing an example of a correspondence recognition range.

Explanation of symbols

  10 information processing apparatus, 12 control unit, 14 storage unit, 16 user interface (UI) unit, 18 scanner unit, 20 basic information storage unit, 22 basic information generation unit, 24 comparison image acquisition unit, 26 image element correspondence determination unit, 28 determination result information generation unit, 30 image correspondence determination unit, 32 basic feature information, 34 basic feature value, 36 basic position information, 38 binarized image, 40 comparison feature value, 42 comparison feature information, 44 comparison position information, 46 correspondence recognition range.

Claims (7)

  1. Basic image acquisition means for acquiring a basic image, which is an image obtained by reading a paper medium;
    basic image conversion means for determining, for each pixel included in the basic image, that the pixel is noise when its image density falls outside a range that is applied commonly to all pixels included in the basic image and that includes the average image density of the pixels included in the basic image, and for converting the image density of a pixel determined to be noise into an image density that is determined not to be noise;
    comparison image acquisition means for acquiring a comparison image, which is an image obtained by reading a paper medium and is to be compared with the basic image;
    comparison image conversion means for determining, for each pixel included in the comparison image, that the pixel is noise when its image density falls outside a range that is applied commonly to all pixels included in the comparison image and that includes the average image density of the pixels included in the comparison image, and for converting the image density of a pixel determined to be noise into an image density that is determined not to be noise;
    image element correspondence determination means for determining whether a comparison feature value, calculated by binarizing a comparison image element in the converted comparison image, corresponds to a basic feature value, calculated by binarizing a basic image element in the converted basic image that corresponds to the comparison image element; and
    image correspondence determination means for determining whether the basic image corresponds to the comparison image, based on the determination results obtained by the image element correspondence determination means for each of at least one comparison image element at mutually different positions in the comparison image;
    An information processing apparatus comprising:
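Purely as an illustration and not part of the claims, the noise conversion and binarized feature value of claim 1 can be sketched as follows. The range half-width, the binarization threshold, and the choice of the average density as the replacement value are all assumptions made here for concreteness; the claim itself only requires a common range containing the average and a replacement density deemed non-noise.

```python
import numpy as np

def suppress_noise(image, half_width=64.0):
    """Treat pixels whose density falls outside a common range containing
    the average density as noise, and replace them with a non-noise density
    (here, the average itself; half_width is an assumed parameter)."""
    img = np.asarray(image, dtype=float).copy()
    avg = img.mean()
    lo, hi = avg - half_width, avg + half_width   # range containing the average
    noise = (img < lo) | (img > hi)               # outside the common range
    img[noise] = avg                              # density deemed non-noise
    return img

def feature_value(element, threshold=128):
    """Binarize an image element and pack its bits into one feature value."""
    bits = (np.asarray(element) >= threshold).astype(np.uint8).ravel()
    return int("".join(map(str, bits)), 2)
```

Two feature values "correspond" when they are equal (or sufficiently close), which is the per-element test the image element correspondence determination means performs.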
  2. The image correspondence determination means determines whether the basic image corresponds to the comparison image based on at least one of the number of times the comparison feature value and the basic feature value are determined to correspond and the number of times they are determined not to correspond;
    The information processing apparatus according to claim 1.
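As an illustrative sketch of claim 2 (not part of the claims), the image-level decision can be driven by match and mismatch counts; the threshold values here are assumed parameters, since the claim does not fix them.

```python
def images_correspond(match_flags, min_matches=None, max_mismatches=None):
    """Decide image correspondence from per-element determination results,
    using the match count and/or the mismatch count (assumed thresholds)."""
    matches = sum(1 for f in match_flags if f)
    mismatches = len(match_flags) - matches
    if min_matches is not None and matches < min_matches:
        return False
    if max_mismatches is not None and mismatches > max_mismatches:
        return False
    return True
```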
  3. The image element correspondence determination means determines whether the comparison feature value calculated based on the comparison image element corresponds to the basic feature value calculated based on the basic image element whose position in the basic image corresponds to the position of the comparison image element in the comparison image;
    The information processing apparatus according to claim 1 or 2.
  4. The image element correspondence determination means determines whether the comparison feature value, calculated based on the comparison image element in a comparison region that is at least a part of the comparison image, corresponds to the basic feature value, calculated based on the basic image element whose position in a basic region that is at least a part of the basic image corresponds to the position of the comparison image element in the comparison region;
    the image correspondence determination means determines whether the basic image corresponds to the comparison image based on the determination results obtained by the image element correspondence determination means for each of at least one comparison image element at mutually different positions in the comparison region;
    The information processing apparatus according to any one of claims 1 to 3.
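As a sketch of the region-based comparison in claim 4 (not part of the claims), elements at corresponding positions inside a basic region and a comparison region of the same size can be compared pairwise. The element size `elem` and the binarization `threshold` are assumed parameters.

```python
import numpy as np

def region_match_results(basic, comparison, basic_origin, comp_origin,
                         region_size, elem=4, threshold=128):
    """Compare binarized elements at corresponding positions inside a
    basic region and a comparison region of the same size, returning one
    match/mismatch result per element position."""
    by, bx = basic_origin
    cy, cx = comp_origin
    h, w = region_size
    results = []
    for dy in range(0, h, elem):
        for dx in range(0, w, elem):
            b = basic[by+dy:by+dy+elem, bx+dx:bx+dx+elem] >= threshold
            c = comparison[cy+dy:cy+dy+elem, cx+dx:cx+dx+elem] >= threshold
            results.append(bool((b == c).all()))  # elements correspond?
    return results
```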
  5. The information processing apparatus further comprises determination result information generation means for generating determination result information that indicates the determination results obtained by the image element correspondence determination means for each of at least one comparison image element at mutually different positions in the comparison region, the determination result information being associated with positional relationship information indicating the positional relationship between the comparison region and the basic region;
    the image correspondence determination means determines whether the basic image corresponds to the comparison image based on a plurality of pieces of the determination result information whose associated positional relationship information indicates mutually different positional relationships;
    The information processing apparatus according to claim 4.
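As a sketch of claim 5 (not part of the claims), each piece of determination result information pairs the per-element results with the positional relationship (here modeled as an offset) between the two regions. Selecting the offset with the most matching elements is an assumed strategy; the claim only requires the decision to use multiple such pieces of information.

```python
def make_result_info(offset, element_results):
    """Determination result info: per-element results associated with the
    positional relationship (offset) between the two regions."""
    return {"offset": offset, "results": list(element_results)}

def best_offset(result_infos):
    """Among several positional relationships, pick the offset whose
    determination result info has the most matching elements."""
    return max(result_infos, key=lambda info: sum(info["results"]))["offset"]
```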
  6. When the image correspondence determination means determines, based on the determination results obtained by the image element correspondence determination means for each of at least one comparison image element at mutually different positions in the comparison image, that the basic image corresponds to the comparison image, the image correspondence determination means further collates the basic image against the comparison image to determine whether the basic image corresponds to the comparison image;
    The information processing apparatus according to any one of claims 1 to 5.
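Claim 6 describes a two-stage decision: a cheap per-element screening followed by a fuller collation only when the screening passes. A minimal sketch (not part of the claims; `min_matches` and the collation callback are assumed stand-ins for whatever screening threshold and full comparison the implementation uses):

```python
def two_stage_decision(element_results, full_collation, min_matches=3):
    """Stage 1: screen using per-element determination results.
    Stage 2: run the fuller collation (e.g. a correlation check)
    only when the screening says the images may correspond."""
    if sum(element_results) < min_matches:
        return False
    return bool(full_collation())
```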
  7. Basic image acquisition means for acquiring a basic image, which is an image obtained by reading a paper medium;
    basic image conversion means for determining, for each pixel included in the basic image, that the pixel is noise when its image density falls outside a range that is applied commonly to all pixels included in the basic image and that includes the average image density of the pixels included in the basic image, and for converting the image density of a pixel determined to be noise into an image density that is determined not to be noise;
    comparison image acquisition means for acquiring a comparison image, which is an image obtained by reading a paper medium and is to be compared with the basic image;
    comparison image conversion means for determining, for each pixel included in the comparison image, that the pixel is noise when its image density falls outside a range that is applied commonly to all pixels included in the comparison image and that includes the average image density of the pixels included in the comparison image, and for converting the image density of a pixel determined to be noise into an image density that is determined not to be noise;
    image element correspondence determination means for determining whether a comparison feature value, calculated by binarizing a comparison image element in the converted comparison image, corresponds to a basic feature value, calculated by binarizing a basic image element in the converted basic image that corresponds to the comparison image element; and
    image correspondence determination means for determining whether the basic image corresponds to the comparison image, based on the determination results obtained by the image element correspondence determination means for each of at least one comparison image element at mutually different positions in the comparison image;
    A program characterized by causing a computer to function as the means described above.
JP2008316888A 2008-12-12 2008-12-12 Information processing apparatus and program Active JP5381069B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008316888A JP5381069B2 (en) 2008-12-12 2008-12-12 Information processing apparatus and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2008316888A JP5381069B2 (en) 2008-12-12 2008-12-12 Information processing apparatus and program

Publications (2)

Publication Number Publication Date
JP2010140313A JP2010140313A (en) 2010-06-24
JP5381069B2 true JP5381069B2 (en) 2014-01-08

Family

ID=42350399

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008316888A Active JP5381069B2 (en) 2008-12-12 2008-12-12 Information processing apparatus and program

Country Status (1)

Country Link
JP (1) JP5381069B2 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01181168A (en) * 1988-01-14 1989-07-19 Fuji Photo Film Co Ltd Picture signal processing method
JPH03241477A (en) * 1990-02-20 1991-10-28 Toyota Motor Corp Correlation operation matching image processor
JPH05284349A (en) * 1992-03-31 1993-10-29 Canon Inc Unit and system for processing picture
JPH07263482A (en) * 1994-03-18 1995-10-13 Fujitsu Lsi Technol Kk Pattern matching method and manufacture of semiconductor device
JP2002279422A (en) * 2001-03-19 2002-09-27 Seiko Epson Corp Device, program and method for determining template matching
JP4103826B2 (en) * 2003-06-24 2008-06-18 富士ゼロックス株式会社 Authenticity determination method, apparatus and program

Also Published As

Publication number Publication date
JP2010140313A (en) 2010-06-24

Similar Documents

Publication Publication Date Title
KR100339691B1 (en) Apparatus for recognizing code and method therefor
JPWO2004084125A1 (en) Information input / output method using dot pattern
JP2006134336A (en) Mixed code, method and apparatus for generating mixed code, and recording medium
JP3345350B2 (en) Document image recognition apparatus, the method, and a recording medium
JP4867874B2 (en) Image processing program, image processing apparatus, and image processing method
EP1349371A2 (en) Image processing apparatus, image processing program and storage medium storing the program
EP1265188A1 (en) Pattern extraction apparatus and method
KR20080027959A (en) An autonomous handheld device having a drawing tool
JP2004140764A (en) Image processing device and method therefor
JP4755415B2 (en) Color two-dimensional code
US20040042670A1 (en) Image incoding apparatus, method and program
CN1434957A (en) Machine readable code and method and device of encoding and decoding same
JP2005012807A (en) Process for generating authenticable content
US7359568B2 (en) Image processing apparatus and image processing method
US9088745B2 (en) Apparatus, system, and method of inspecting image, and recording medium storing image inspection control program
JP2007336226A (en) Information processor, control method, and computer program
JP2007202132A (en) Apparatus and method for image processing, and computer program
JP2011128990A (en) Image processor and image processing method
JP5701182B2 (en) Image processing apparatus, image processing method, and computer program
JP2009225422A (en) Image encoding apparatus, image processing apparatus, and control method thereof
JP2006025129A (en) System and method for image processing
US8155443B2 (en) Image extracting apparatus, image extracting method and computer readable medium
US8208744B2 (en) Image processing apparatus capable of accurately and quickly determining character part included in image
KR101459766B1 (en) Method for recognizing a music score image with automatic accompaniment in a mobile device
JP5132517B2 (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20111124

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20120907

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120918

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20121102

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130319

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130517

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20130903

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20130916

R150 Certificate of patent or registration of utility model

Ref document number: 5381069

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150