US10607071B2 - Information processing apparatus, non-transitory computer readable recording medium, and information processing method - Google Patents

Information processing apparatus, non-transitory computer readable recording medium, and information processing method Download PDF

Info

Publication number
US10607071B2
Authority
US
United States
Prior art keywords
name
image
handwriting
sheet
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US15/976,316
Other versions
US20180330155A1 (en)
Inventor
Motoki Hiratsuka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kyocera Document Solutions Inc
Original Assignee
Kyocera Document Solutions Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Document Solutions Inc
Publication of US20180330155A1
Assigned to Kyocera Document Solutions, Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HIRATSUKA, MOTOKI
Application granted
Publication of US10607071B2

Classifications

    • G06V40/33 - Writer recognition; reading and verifying signatures based only on the signature image, e.g. static signature recognition
    • G06V40/40 - Spoof detection, e.g. liveness detection
    • G06V30/40 - Document-oriented image-based pattern recognition
    • G06V30/36 - Digital ink: matching; classification
    • G06V30/333 - Digital ink: preprocessing; feature extraction
    • G06V30/413 - Analysis of document content: classification of content, e.g. text, photographs or tables
    • G06V30/226 - Character recognition characterised by the type of writing: cursive writing
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H04N1/00331 - Connection or combination of a still picture apparatus with an apparatus performing optical character recognition
    • G06K9/00161, G06K9/00422, G06K9/00456, G06K9/00852, G06K9/00899 - legacy classification codes

Definitions

  • the present disclosure relates to an information processing apparatus capable of obtaining a sheet-image obtained by scanning a sheet including handwritten-characters, and generating handwriting-information of the handwritten-characters in the sheet-image.
  • the present disclosure further relates to a non-transitory computer readable recording medium that records an information processing program, and an information processing method.
  • an information processing apparatus including:
  • an image obtaining unit that obtains a sheet-image obtained by scanning a sheet including a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field,
  • a handwriting-information generating unit that generates handwriting-information indicating characteristics of each character of the recognized handwritten-characters
  • a name-field determining unit that determines whether or not a name is written in the name-field in the sheet-image
  • a non-transitory computer readable recording medium that records an information processing program executable by a processor of an information processing apparatus, the information processing program causing the processor of the information processing apparatus to operate as
  • an image obtaining unit that obtains a sheet-image obtained by scanning a sheet including a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field,
  • a handwriting-information generating unit that generates handwriting-information indicating characteristics of each character of the recognized handwritten-characters
  • a name-field determining unit that determines whether or not a name is written in the name-field in the sheet-image
  • an information processing method including:
  • FIG. 1 shows a hardware configuration of an image forming apparatus according to an embodiment of the present disclosure
  • FIG. 2 shows a functional configuration of the image forming apparatus
  • FIG. 3 shows an operational flow (first time) of the image forming apparatus
  • FIG. 4 shows an operational flow (second time and thereafter) of the image forming apparatus.
  • an image forming apparatus (Multifunction Peripheral, hereinafter simply referred to as MFP) will be described as an information processing apparatus
  • FIG. 1 shows a hardware configuration of an image forming apparatus according to an embodiment of the present disclosure.
  • An MFP 10 includes a controller circuit 11 .
  • the controller circuit 11 includes a CPU (Central Processing Unit), i.e., a processor, a RAM (Random Access Memory), a ROM (Read Only Memory), i.e., a memory, dedicated hardware circuits, and the like and performs overall operational control of the MFP 10 .
  • a computer program that causes the MFP 10 to operate as the respective functional units (to be described later) is stored in a non-transitory computer readable recording medium such as a ROM.
  • the controller circuit 11 is connected to an image scanner 12 , an image processor 14 , an image memory 15 , an image forming device 16 , an operation device 17 , a storage device 18 , a communication controller device 13 , and the like.
  • the controller circuit 11 performs operational control of the respective devices connected thereto and sends/receives signals and data to/from those devices.
  • the controller circuit 11 controls drive and processing of mechanisms requisite for executing operational control of functions such as a scanner function, a printing function, and a copy function.
  • the image scanner 12 reads an image from a document.
  • the image processor 14 carries out image processing as necessary on image data of an image read by the image scanner 12 .
  • the image processor 14 corrects shading of an image read by the image scanner 12 and carries out other image processing to improve the quality of the image to be formed.
  • the image memory 15 includes an area that temporarily stores data of a document image read by the image scanner 12 or data to be printed by the image forming device 16 .
  • the image forming device 16 (printer) forms an image of image data and the like read by the image scanner 12 .
  • the operation device 17 includes a touch panel device and an operation key device that accept a user's instructions on various operations and processing executable by the MFP 10 .
  • the touch panel device includes a display device 17 a such as an LCD (Liquid Crystal Display) and an organic EL (Electroluminescence) display including a touch panel.
  • the communication controller device 13 (communication device) is an interface used for connecting to the network N.
  • the storage device 18 is a large-volume storage device such as an HDD (Hard Disk Drive) that stores a document image read by the image scanner 12 , and the like.
  • the storage device 18 may further include a detachably-connected mobile storage medium (for example, a USB (Universal Serial Bus) memory) and its interface.
  • FIG. 2 shows a functional configuration of the image forming apparatus.
  • the CPU (processor) of the controller circuit 11 of the MFP 10 loads an information processing program recorded in the ROM (memory) into the RAM and executes the program to thereby operate as the functional blocks, i.e., the image obtaining unit 101 , the character recognizing unit 102 , the handwriting-information generating unit 103 , the name-field determining unit 104 , the impersonation determining unit 105 , the writer determining unit 106 , and the image generating unit 107 .
  • the image obtaining unit 101 obtains a sheet-image obtained by scanning a sheet including a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field.
  • the character recognizing unit 102 recognizes the handwritten-characters in the sheet-image.
  • the handwriting-information generating unit 103 generates handwriting-information indicating characteristics of each character of the handwritten-characters recognized by the character recognizing unit 102 .
  • the name-field determining unit 104 determines whether or not a name is written in the name-field in the sheet-image.
  • the impersonation determining unit 105 determines whether or not the handwriting-information generated by the handwriting-information generating unit 103 is stored in the handwriting-information database 112 in association with the name handwritten in the name-field. Where the impersonation determining unit 105 determines that the handwriting-information generated by the handwriting-information generating unit 103 is not stored in the handwriting-information database 112 in association with the name handwritten in the name-field, the impersonation determining unit 105 extracts a name stored in the handwriting-information database 112 in association with the handwriting-information generated by the handwriting-information generating unit 103 .
  • the handwriting-information database 112 stores one or more persons' names, the persons' attributes, and handwriting-informations of the persons in association with each other.
  • the writer determining unit 106 extracts, from the handwriting-information database 112 , one or more names and handwriting-informations in association with a particular attribute, and generates a searchable-table.
  • the writer determining unit 106 selects one name stored in the searchable-table in association with the handwriting-information generated by the handwriting-information generating unit 103 .
  • the writer determining unit 106 where the searchable-table stores a plurality of names in association with the handwriting-information generated by the handwriting-information generating unit 103 , treats the plurality of names as candidates, excludes a name handwritten in a name-field of another sheet-image from the candidates, and selects one non-excluded and remaining name.
  • the image generating unit 107 generates a name-image indicating the name selected by the writer determining unit 106 , and combines the name-image and the sheet-image to generate a combined-image.
  • FIG. 3 shows an operational flow (first time) of the image forming apparatus.
  • the image scanner 12 scans a sheet set on a feeder or the like, and generates a sheet-image.
  • the “sheet” includes a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field. A name may be handwritten or may (intentionally or unintentionally) not be written in the “name-field”.
  • the “area other than the name-field” is, for example, an answer-field in which an answer is handwritten. Examples of this kind of “sheet” include answer sheets for examinations of schools, cram schools, and the like, and questionnaire sheets.
  • the image obtaining unit 101 obtains a sheet-image (strictly speaking, image data) generated by the image scanner 12 (Step S 101 ).
  • the image obtaining unit 101 supplies the obtained sheet-image to the character recognizing unit 102 .
  • the character recognizing unit 102 obtains a sheet-image from the image obtaining unit 101 .
  • the character recognizing unit 102 recognizes handwritten-characters in the sheet-image (Step S 102 ).
  • the “handwritten-characters” include characters (name) handwritten in the name-field, characters (attribute) handwritten in an attribute-field, and characters (answers) handwritten in the area other than the name-field.
  • the character recognizing unit 102 detects edges and thereby extracts the handwritten-characters.
  • the character recognizing unit 102 refers to the OCR (Optical Character Recognition) database 111 , and identifies the extracted handwritten-characters.
  • an image pattern of a character and a character code are registered in association with each other one-to-one in the OCR database 111 .
  • the character recognizing unit 102 searches the OCR database 111 for the image pattern indicating an extracted character, and obtains the character code in association with the retrieved image pattern.
  • the character recognizing unit 102 obtains the character codes of all the handwritten-characters.
  • the character recognizing unit 102 combines the character codes of the characters (name) handwritten in the name-field, and thereby recognizes the name.
  • the character recognizing unit 102 combines the character codes of the characters (attribute) handwritten in the attribute-field, and thereby recognizes the attribute.
  • the “attribute” is information indicating what a person belongs to such as a school name, a school year, and a class.
  • the handwriting-information generating unit 103 generates handwriting-information indicating characteristics of each character of the handwritten-characters recognized by the character recognizing unit 102 (Step S 103 ).
  • the “handwriting-information” relates to denseness (thickness, darkness) or weakness (thinness, paleness) of start-of-writing, roundness of curves, angles of corners, denseness (thickness, darkness) or weakness (thinness, paleness) of end-of-writing, and the like of each character.
  • the handwriting-information generating unit 103 stores the generated handwriting-information of each character, and the name and the attribute recognized by the character recognizing unit 102 in the handwriting-information database 112 in association with each other.
  • FIG. 4 shows an operational flow (second time and thereafter) of the image forming apparatus.
  • the image obtaining unit 101 obtains a sheet-image (strictly speaking, image data) generated by the image scanner 12 (Step S 201 , similar to Step S 101 of FIG. 3 ).
  • the image obtaining unit 101 supplies the obtained sheet-image to the character recognizing unit 102 .
  • the character recognizing unit 102 obtains a sheet-image from the image obtaining unit 101 .
  • the character recognizing unit 102 recognizes handwritten-characters in the sheet-image (Step S 202 , similar to Step S 102 of FIG. 3 ).
  • the character recognizing unit 102 combines the character codes of the characters (name) handwritten in the name-field, and thereby recognizes the name.
  • the character recognizing unit 102 combines the character codes of the characters (attribute) handwritten in the attribute-field, and thereby recognizes the attribute.
  • the handwriting-information generating unit 103 generates handwriting-information indicating characteristics of each character of the handwritten-characters recognized by the character recognizing unit 102 (Step S 203 , similar to Step S 103 of FIG. 3 ).
  • the name-field determining unit 104 obtains the name recognized by the character recognizing unit 102 , and determines whether or not a name is written in the name-field of the sheet-image (Step S 204 ).
  • (1) a case where the name-field determining unit 104 determines that a name is written in the name-field of the sheet-image (Step S 204, YES)
  • (2) a case where the name-field determining unit 104 determines that no name is written (typically, the name-field is blank) in the name-field of the sheet-image (Step S 204, NO)
  • the impersonation determining unit 105 determines whether or not the handwriting-information generated by the handwriting-information generating unit 103 is stored in the handwriting-information database 112 in association with the name (name handwritten in name-field) recognized by the character recognizing unit 102 (Step S 205 ).
  • If the handwriting-information is not in association with the name handwritten in the name-field, somebody may possibly have “impersonated” the person of this name and handwritten the name and answers on this sheet. On the contrary, if the handwriting-information is in association with the name handwritten in the name-field, it is highly likely that the person of this name has handwritten the name and answers on this sheet himself, and there is no “impersonation”.
  • the impersonation determining unit 105 determines that the handwriting-information is stored in the handwriting-information database 112 in association with the name handwritten in the name-field (not likely to be “impersonation”) (Step S 206 , YES). In this case, the impersonation determining unit 105 supplies the handwriting-information generated by the handwriting-information generating unit 103 to the handwriting-information database 112 in association with the name (name handwritten in name-field) recognized by the character recognizing unit 102 to thereby additionally store the handwriting-information and update the handwriting-information database 112 (Step S 207 ). In this way, by additionally storing the handwriting-information to the handwriting-information database 112 and updating the handwriting-information database 112 , it is possible to identify a person on a basis of handwriting-information more and more accurately.
  • the impersonation determining unit 105 determines that the handwriting-information is not stored in the handwriting-information database 112 in association with the name handwritten in the name-field (likely to be “impersonation”) (Step S 206 , NO). In this case, the impersonation determining unit 105 determines whether or not a name is stored in the handwriting-information database 112 in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S 208 ).
  • the impersonation determining unit 105 determines that a name is stored in the handwriting-information database 112 in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S 209 , YES)
  • the impersonation determining unit 105 displays this name (name of a person who may possibly have “impersonated”) on the display device 17 a , and alerts a user (marker, etc.) (Step S 210 ).
  • the impersonation determining unit 105 determines that no name is stored in the handwriting-information database 112 in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S 209 , NO)
  • the impersonation determining unit 105 displays a message (suspicious person is unidentified) on the display device 17 a , and alerts a user (marker, etc.) (Step S 211 ).
  • the writer determining unit 106 extracts one or more names and handwriting-informations in association with a particular attribute from the handwriting-information database 112 , and generates a searchable-table (Step S 212 ).
  • the “particular attribute” is the attribute (class, etc.) of a person identified by a name to be written in the name-field (in which no name is written), and is specified on a basis of operations by a user (marker, etc.).
  • the “searchable-table” is a table indicating the names and handwriting-informations of a plurality of persons who belong to the “particular attribute” (one class, etc.).
  • the writer determining unit 106 determines whether or not the generated searchable-table stores the handwriting-information generated by the handwriting-information generating unit 103 (Step S 213 ). Where the writer determining unit 106 determines that the generated searchable-table does not store the handwriting-information generated by the handwriting-information generating unit 103 (Step S 213 , NO), the writer determining unit 106 displays a message (suspicious person is unidentified) on the display device 17 a , and alerts a user (marker, etc.) (Step S 211 ).
  • the writer determining unit 106 determines that the generated searchable-table stores the handwriting-information generated by the handwriting-information generating unit 103 (Step S 213 , YES). In this case, the writer determining unit 106 determines whether the generated searchable-table stores a plurality of names or only one name in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S 214 ).
  • the writer determining unit 106 determines that the generated searchable-table stores only one name in association with the handwriting-information generated by the handwriting-information generating unit 103 , the writer determining unit 106 selects this one name (Step S 214 , YES). In this case, the person of the selected name may be highly likely to be a writer. So the writer determining unit 106 supplies the handwriting-information generated by the handwriting-information generating unit 103 to the handwriting-information database 112 in association with the name to thereby additionally store the handwriting-information and update the handwriting-information database 112 (Step S 215 ).
  • the writer determining unit 106 supplies the selected name (name of person highly likely to be writer) to the image generating unit 107 .
  • the image generating unit 107 obtains the selected name (name of person highly likely to be writer) from the writer determining unit 106 .
  • the image generating unit 107 generates a name-image indicating the name selected by the writer determining unit 106 .
  • the “name-image” is an image of a text indicating the name.
  • the image generating unit 107 combines the generated name-image and the sheet-image obtained by the image obtaining unit 101 to thereby generate a combined-image (Step S 216 ).
  • the image generating unit 107 combines the generated name-image and the name-field in the sheet-image obtained by the image obtaining unit 101 to thereby generate a combined-image.
  • the image generating unit 107 generates a combined-image, in which a name is written in the blank name-field.
  • the image generating unit 107 outputs (prints, saves, displays, sends, etc.) the generated combined-image (Step S 217 ).
  • the writer determining unit 106 determines that the generated searchable-table stores a plurality of names in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S 214 , NO). In this case, the writer determining unit 106 suspends identification of a writer, and treats the plurality of names as candidates for a writer (Step S 218 ).
  • the controller circuit 11 of the MFP 10 executes the process of Steps S 201 to S 207 for the other sheet-images.
  • the writer determining unit 106 deletes (excludes), from the searchable-table, the names and handwriting-informations (Step S 207 ) additionally stored in the handwriting-information database 112 and updated by the impersonation determining unit 105 to thereby update the searchable-table (Step S 219 ).
  • the writer determining unit 106 excludes a name and handwriting-information, which cannot be a candidate for a writer, from the searchable-table one by one to thereby narrow down the candidates for a writer.
  • the writer determining unit 106 determines whether or not the updated searchable-table (in which candidates are narrowed down) stores only one name in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S 220 ).
  • the writer determining unit 106 determines that the updated searchable-table (in which candidates are narrowed down) stores only one name in association with the handwriting-information generated by the handwriting-information generating unit 103 (i.e., there is only one non-excluded and remaining name)
  • the writer determining unit 106 selects the one name (Step S 220 , YES).
  • the person of the selected name may be highly likely to be a writer. So the writer determining unit 106 supplies the selected name (name of person highly likely to be writer) to the image generating unit 107 .
  • the image generating unit 107 generates a name-image indicating the name selected by the writer determining unit 106 , generates a combined-image (Step S 216 ), and outputs the generated combined-image (Step S 217 ).
  • the writer determining unit 106 determines that the updated searchable-table (in which candidates are narrowed down) stores no name at all in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S 220 , NO, and Step S 221 , NO)
  • the writer determining unit 106 displays a message (suspicious person is unidentified) on the display device 17 a , and alerts a user (marker, etc.) (Step S 211 ).
  • the writer determining unit 106 determines that the updated searchable-table (in which candidates are narrowed down) stores a plurality of names in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S 220 , NO, and Step S 221 , YES)
  • the writer determining unit 106 displays the plurality of names as candidates on the display device 17 a , and advises a user (marker, etc.) to specify whether or not a combined-image including any one name is to be generated (Step S 222 ).
  • the writer determining unit 106 determines that a combined-image including a particular name (specified by user) is to be generated on a basis of a particular operation input in the operation device 17 by the user (Step S 223 , YES)
  • the writer determining unit 106 generates a name-image indicating the name, generates a combined-image (Step S 216 ), and outputs the generated combined-image (Step S 217 ).
  • the MFP 10 executes all the processes.
  • an information processing apparatus may obtain sheet-images from an image scanner or an MFP, and may execute all the processes (not shown).
  • the information processing apparatus may be a personal computer used by a user (marker, etc.) and connected to the image scanner or the MFP via an intranet.
  • the information processing apparatus may be a so-called server apparatus connected to the image scanner or the MFP via the Internet.
  • an external server apparatus may store the handwriting-information database 112 in a memory, and an information processing apparatus may obtain the handwriting-information database 112 via a communication device and may execute all the processes (not shown).
  • the writer determining unit 106 determines a writer on a basis of the handwriting-information generated by the handwriting-information generating unit 103 , which indicates characteristics of each handwritten-character in the sheet-image.
  • the writer determining unit 106 can accurately determine a writer on a basis of handwriting-information. If there are a plurality of candidates for a writer, the writer determining unit 106 narrows down the candidates for a writer in association with a particular attribute, and can thereby determine a writer accurately.
  • the impersonation determining unit 105 determines, on a basis of the handwriting-information generated by the handwriting-information generating unit 103 , which indicates characteristics of each handwritten-character in the sheet-image, whether somebody has “impersonated” the person of that name and has handwritten the name, answers, and the like on this sheet, or whether the person of that name has handwritten the name, answers, and the like on this sheet himself.
  • the impersonation determining unit 105 can determine presence/absence of possibility of “impersonation” more accurately.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Character Discrimination (AREA)
  • Editing Of Facsimile Originals (AREA)
  • Collating Specific Patterns (AREA)
  • Processing Or Creating Images (AREA)
  • Facsimiles In General (AREA)

Abstract

An information processing method includes: obtaining a sheet-image obtained by scanning a sheet including a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field; recognizing the handwritten-characters in the sheet-image; generating handwriting-information indicating characteristics of each character of the recognized handwritten-characters; determining whether or not a name is written in the name-field in the sheet-image; where determining that no name is written, extracting, from a database that stores one or more persons' names, the persons' attributes, and handwriting-informations of the persons in association with each other, one or more names and handwriting-informations in association with a particular attribute, and generating a table; selecting one name stored in the table in association with the generated handwriting-information; generating a name-image indicating the selected name; and combining the name-image and the sheet-image to generate a combined-image.

Description

INCORPORATION BY REFERENCE
This application claims the benefit of Japanese Priority Patent Application JP 2017-096180 filed May 15, 2017, the entire contents of which are incorporated herein by reference.
FIELD
The present disclosure relates to an information processing apparatus capable of obtaining a sheet-image obtained by scanning a sheet including handwritten-characters, and generating handwriting-information of the handwritten-characters in the sheet-image. The present disclosure further relates to a non-transitory computer readable recording medium that records an information processing program, and an information processing method.
BACKGROUND
There is known a technique of obtaining a sheet-image obtained by scanning a sheet including handwritten-characters, and identifying a writer on a basis of handwriting-information of the handwritten-characters in the sheet-image.
It is desirable to identify a writer more and more accurately in the technique of obtaining a sheet-image obtained by scanning a sheet including handwritten-characters, and identifying a writer on a basis of handwriting-information of the handwritten-characters in the sheet-image.
SUMMARY
According to an embodiment of the present disclosure, there is provided an information processing apparatus, including:
a processor that operates as
an image obtaining unit that obtains a sheet-image obtained by scanning a sheet including a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field,
a character recognizing unit that recognizes the handwritten-characters in the sheet-image,
a handwriting-information generating unit that generates handwriting-information indicating characteristics of each character of the recognized handwritten-characters,
a name-field determining unit that determines whether or not a name is written in the name-field in the sheet-image,
a writer determining unit that,
    • where the name-field determining unit determines that no name is written,
    • extracts, from a database that stores one or more persons' names, the persons' attributes, and handwriting-informations of the persons in association with each other, one or more names and handwriting-informations in association with a particular attribute, and generates a table, and
    • selects one name stored in the table in association with the generated handwriting-information, and
an image generating unit that
    • generates a name-image indicating the selected name, and
    • combines the name-image and the sheet-image to generate a combined-image.
According to an embodiment of the present disclosure, there is provided a non-transitory computer readable recording medium that records an information processing program executable by a processor of an information processing apparatus, the information processing program causing the processor of the information processing apparatus to operate as
an image obtaining unit that obtains a sheet-image obtained by scanning a sheet including a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field,
a character recognizing unit that recognizes the handwritten-characters in the sheet-image,
a handwriting-information generating unit that generates handwriting-information indicating characteristics of each character of the recognized handwritten-characters,
a name-field determining unit that determines whether or not a name is written in the name-field in the sheet-image,
a writer determining unit that,
    • where the name-field determining unit determines that no name is written,
    • extracts, from a database that stores one or more persons' names, the persons' attributes, and handwriting-informations of the persons in association with each other, one or more names and handwriting-informations in association with a particular attribute, and generates a table, and
    • selects one name stored in the table in association with the generated handwriting-information, and
an image generating unit that
    • generates a name-image indicating the selected name, and
    • combines the name-image and the sheet-image to generate a combined-image.
According to an embodiment of the present disclosure, there is provided an information processing method, including:
obtaining a sheet-image obtained by scanning a sheet including a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field;
recognizing the handwritten-characters in the sheet-image;
generating handwriting-information indicating characteristics of each character of the recognized handwritten-characters;
determining whether or not a name is written in the name-field in the sheet-image;
where determining that no name is written,
extracting, from a database that stores one or more persons' names, the persons' attributes, and handwriting-informations of the persons in association with each other, one or more names and handwriting-informations in association with a particular attribute, and generating a table;
selecting one name stored in the table in association with the generated handwriting-information;
generating a name-image indicating the selected name; and
combining the name-image and the sheet-image to generate a combined-image.
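For illustration only, the method above can be sketched in Python as a single pipeline over toy in-memory data; the class HandwritingInfo, the numeric feature values, and the equality-based matching rule are assumptions made for this sketch rather than anything specified by the disclosure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HandwritingInfo:
    """Toy stand-in for handwriting-information (per-character characteristics)."""
    features: tuple

# Toy database: name -> (attribute, handwriting-information)
DATABASE = {
    "Sato":   ("class 1-A", HandwritingInfo((0.8, 0.3, 0.5))),
    "Suzuki": ("class 1-A", HandwritingInfo((0.2, 0.9, 0.4))),
    "Tanaka": ("class 2-B", HandwritingInfo((0.5, 0.5, 0.5))),
}

def process_sheet(name_field: str, sheet_handwriting: HandwritingInfo,
                  particular_attribute: str) -> str:
    """Return the name to attach to the sheet, or a message for the marker."""
    if name_field:  # a name is written in the name-field
        stored = DATABASE.get(name_field)
        if stored and stored[1] == sheet_handwriting:
            return name_field                       # handwriting matches the written name
        return "alert: possible impersonation"      # handwriting does not match the written name
    # No name written: build a table for the particular attribute and search it
    table = {n: hw for n, (attr, hw) in DATABASE.items() if attr == particular_attribute}
    candidates = [n for n, hw in table.items() if hw == sheet_handwriting]
    if len(candidates) == 1:
        return candidates[0]                        # name to be combined into the blank name-field
    return "alert: writer not uniquely identified"

# A nameless sheet whose handwriting matches Suzuki's stored record:
print(process_sheet("", HandwritingInfo((0.2, 0.9, 0.4)), "class 1-A"))  # -> Suzuki
```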
These and other objects, features and advantages of the present disclosure will become more apparent in light of the following detailed description of best mode embodiments thereof, as illustrated in the accompanying drawings.
BRIEF DESCRIPTION OF FIGURES
FIG. 1 shows a hardware configuration of an image forming apparatus according to an embodiment of the present disclosure;
FIG. 2 shows a functional configuration of the image forming apparatus;
FIG. 3 shows an operational flow (first time) of the image forming apparatus; and
FIG. 4 shows an operational flow (second time and thereafter) of the image forming apparatus.
DETAILED DESCRIPTION
Hereinafter, an embodiment of the present disclosure will be described with reference to the drawings. In the present embodiment, an image forming apparatus (Multifunction Peripheral, hereinafter simply referred to as MFP) will be described as an information processing apparatus.
1. Hardware Configuration of Image Forming Apparatus
FIG. 1 shows a hardware configuration of an image forming apparatus according to an embodiment of the present disclosure.
An MFP 10 includes a controller circuit 11. The controller circuit 11 includes a CPU (Central Processing Unit), i.e., a processor, a RAM (Random Access Memory), a ROM (Read Only Memory), i.e., a memory, dedicated hardware circuits, and the like and performs overall operational control of the MFP 10. A computer program that causes the MFP 10 to operate as the respective functional units (to be described later) is stored in a non-transitory computer readable recording medium such as a ROM.
The controller circuit 11 is connected to an image scanner 12, an image processor 14, an image memory 15, an image forming device 16, an operation device 17, a storage device 18, a communication controller device 13, and the like. The controller circuit 11 performs operational control of the respective devices connected thereto and sends/receives signals and data to/from those devices.
According to job execution instructions input by a user via the operation device 17 or a personal computer (not shown) connected to a network, the controller circuit 11 controls drive and processing of mechanisms requisite for executing operational control of functions such as a scanner function, a printing function, and a copy function.
The image scanner 12 reads an image from a document.
The image processor 14 carries out image processing as necessary on image data of an image read by the image scanner 12. For example, the image processor 14 corrects shading of an image read by the image scanner 12 and carries out other image processing to improve the quality of the image to be formed.
The image memory 15 includes an area that temporarily stores data of a document image read by the image scanner 12 or data to be printed by the image forming device 16.
The image forming device 16 (printer) forms an image of image data and the like read by the image scanner 12.
The operation device 17 includes a touch panel device and an operation key device that accept a user's instructions on various operations and processing executable by the MFP 10. The touch panel device includes a display device 17 a such as an LCD (Liquid Crystal Display) and an organic EL (Electroluminescence) display including a touch panel.
The communication controller device 13 (communication device) is an interface used for connecting to the network N.
The storage device 18 is a large-volume storage device such as an HDD (Hard Disk Drive) that stores a document image read by the image scanner 12, and the like. The storage device 18 may further include a detachably-connected mobile storage medium (for example, a USB (Universal Serial Bus) memory) and its interface.
2. Functional Configuration of Image Forming Apparatus
FIG. 2 shows a functional configuration of the image forming apparatus.
The CPU (processor) of the controller circuit 11 of the MFP 10 loads an information processing program recorded in the ROM (memory) into the RAM and executes the program to thereby operate as the functional blocks, i.e., the image obtaining unit 101, the character recognizing unit 102, the handwriting-information generating unit 103, the name-field determining unit 104, the impersonation determining unit 105, the writer determining unit 106, and the image generating unit 107.
The image obtaining unit 101 obtains a sheet-image obtained by scanning a sheet including a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field.
The character recognizing unit 102 recognizes the handwritten-characters in the sheet-image.
The handwriting-information generating unit 103 generates handwriting-information indicating characteristics of each character of the handwritten-characters recognized by the character recognizing unit 102.
The name-field determining unit 104 determines whether or not a name is written in the name-field in the sheet-image.
Where the name-field determining unit 104 determines that a name is written, the impersonation determining unit 105 determines whether or not the handwriting-information generated by the handwriting-information generating unit 103 is stored in the handwriting-information database 112 in association with the name handwritten in the name-field. Where the impersonation determining unit 105 determines that the handwriting-information generated by the handwriting-information generating unit 103 is not stored in the handwriting-information database 112 in association with the name handwritten in the name-field, the impersonation determining unit 105 extracts a name stored in the handwriting-information database 112 in association with the handwriting-information generated by the handwriting-information generating unit 103.
The handwriting-information database 112 stores one or more persons' names, the persons' attributes, and handwriting-informations of the persons in association with each other.
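One way to picture these associations is a small in-memory record store; the layout below (field names and types) is an illustrative assumption and not the actual data model of the handwriting-information database 112.

```python
from dataclasses import dataclass, field

@dataclass
class HandwritingRecord:
    name: str                       # person's name
    attribute: str                  # e.g. school name, school year, class
    handwriting_infos: list = field(default_factory=list)  # stored samples for this person

class HandwritingDatabase:
    """In-memory stand-in for the handwriting-information database 112."""
    def __init__(self):
        self._records = {}

    def store(self, name, attribute, handwriting_info):
        rec = self._records.setdefault(name, HandwritingRecord(name, attribute))
        rec.handwriting_infos.append(handwriting_info)  # additional storing updates the database

    def by_attribute(self, attribute):
        """Names and handwriting-informations stored for one attribute (e.g. one class)."""
        return [r for r in self._records.values() if r.attribute == attribute]

db = HandwritingDatabase()
db.store("Sato", "class 1-A", {"start": 0.8, "curve": 0.3})
print([r.name for r in db.by_attribute("class 1-A")])  # ['Sato']
```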
Where the name-field determining unit 104 determines that no name is written, the writer determining unit 106 extracts, from the handwriting-information database 112, one or more names and handwriting-informations in association with a particular attribute, and generates a searchable-table. The writer determining unit 106 selects one name stored in the searchable-table in association with the handwriting-information generated by the handwriting-information generating unit 103. The writer determining unit 106, where the searchable-table stores a plurality of names in association with the handwriting-information generated by the handwriting-information generating unit 103, treats the plurality of names as candidates, excludes a name handwritten in a name-field of another sheet-image from the candidates, and selects one non-excluded and remaining name.
The image generating unit 107 generates a name-image indicating the name selected by the writer determining unit 106, and combines the name-image and the sheet-image to generate a combined-image.
3. Operational Flow of Image Forming Apparatus
FIG. 3 shows an operational flow (first time) of the image forming apparatus.
The image scanner 12 scans a sheet set on a feeder or the like, and generates a sheet-image. The “sheet” includes a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field. A name may be handwritten or may (intentionally or unintentionally) not be written in the “name-field”. The “area other than the name-field” is, for example, an answer-field in which an answer is handwritten. Examples of this kind of “sheet” include answer sheets for examinations of schools, cram schools, and the like, and questionnaire sheets.
The image obtaining unit 101 obtains a sheet-image (strictly speaking, image data) generated by the image scanner 12 (Step S101). The image obtaining unit 101 supplies the obtained sheet-image to the character recognizing unit 102.
The character recognizing unit 102 obtains a sheet-image from the image obtaining unit 101. The character recognizing unit 102 recognizes handwritten-characters in the sheet-image (Step S102). The “handwritten-characters” include characters (name) handwritten in the name-field, characters (attribute) handwritten in an attribute-field, and characters (answers) handwritten in the area other than the name-field. In detail, the character recognizing unit 102 detects edges and thereby extracts the handwritten-characters. The character recognizing unit 102 refers to the OCR (Optical Character Recognition) database 111, and identifies the extracted handwritten-characters. In detail, an image pattern of a character and a character code are registered in association with each other one-to-one in the OCR database 111. The character recognizing unit 102 searches the OCR database 111 for the image pattern indicating an extracted character, and obtains the character code in association with the retrieved image pattern. The character recognizing unit 102 obtains the character codes of all the handwritten-characters. The character recognizing unit 102 combines the character codes of the characters (name) handwritten in the name-field, and thereby recognizes the name. The character recognizing unit 102 combines the character codes of the characters (attribute) handwritten in the attribute-field, and thereby recognizes the attribute. The “attribute” is information indicating what a person belongs to such as a school name, a school year, and a class.
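As a minimal sketch of the pattern-to-code lookup described here, the following uses tiny 3x3 bitmaps as stand-ins for the registered image patterns and a nearest-pattern rule for matching; both are simplifying assumptions, since the disclosure only specifies a one-to-one association between image patterns and character codes.

```python
# Toy OCR database: an image pattern (a 3x3 bitmap, flattened to a tuple)
# registered one-to-one with a character code.
OCR_DATABASE = {
    (1, 1, 1, 1, 0, 1, 1, 1, 1): "O",
    (0, 1, 0, 0, 1, 0, 0, 1, 0): "I",
}

def recognize_character(extracted_pattern):
    """Find the registered pattern closest to the extracted one and return its code."""
    def distance(a, b):
        return sum(x != y for x, y in zip(a, b))
    best = min(OCR_DATABASE, key=lambda pattern: distance(pattern, extracted_pattern))
    return OCR_DATABASE[best]

# Combining the codes of the characters extracted from the name-field yields the name.
name_field_patterns = [(1, 1, 1, 1, 0, 1, 1, 1, 1), (0, 1, 0, 0, 1, 0, 0, 1, 0)]
print("".join(recognize_character(p) for p in name_field_patterns))  # -> OI
```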
The handwriting-information generating unit 103 generates handwriting-information indicating characteristics of each character of the handwritten-characters recognized by the character recognizing unit 102 (Step S103). For example, the “handwriting-information” relates to denseness (thickness, darkness) or weakness (thinness, paleness) of start-of-writing, roundness of curves, angles of corners, denseness (thickness, darkness) or weakness (thinness, paleness) of end-of-writing, and the like of each character. The handwriting-information generating unit 103 stores the generated handwriting-information of each character, and the name and the attribute recognized by the character recognizing unit 102 in the handwriting-information database 112 in association with each other.
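The characteristics mentioned above could, for instance, be collected into a per-character feature record; the normalized numeric encoding and the tolerance-based similarity test below are assumptions made for this sketch only.

```python
from dataclasses import dataclass, astuple

@dataclass
class CharacterHandwriting:
    """Per-character handwriting-information, named after the characteristics above."""
    start_density: float    # denseness/weakness of start-of-writing
    curve_roundness: float  # roundness of curves
    corner_angle: float     # angles of corners (normalized)
    end_density: float      # denseness/weakness of end-of-writing

def similar(a: CharacterHandwriting, b: CharacterHandwriting, tolerance: float = 0.15) -> bool:
    """Toy similarity test: every characteristic agrees within a tolerance."""
    return all(abs(x - y) <= tolerance for x, y in zip(astuple(a), astuple(b)))

sample = CharacterHandwriting(0.80, 0.40, 0.60, 0.30)   # from the scanned sheet
stored = CharacterHandwriting(0.75, 0.45, 0.55, 0.35)   # from the database
print(similar(sample, stored))  # -> True
```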
FIG. 4 shows an operational flow (second time and thereafter) of the image forming apparatus.
The image obtaining unit 101 obtains a sheet-image (strictly speaking, image data) generated by the image scanner 12 (Step S201, similar to Step S101 of FIG. 3). The image obtaining unit 101 supplies the obtained sheet-image to the character recognizing unit 102.
The character recognizing unit 102 obtains a sheet-image from the image obtaining unit 101. The character recognizing unit 102 recognizes handwritten-characters in the sheet-image (Step S202, similar to Step S102 of FIG. 3). The character recognizing unit 102 combines the character codes of the characters (name) handwritten in the name-field, and thereby recognizes the name. The character recognizing unit 102 combines the character codes of the characters (attribute) handwritten in the attribute-field, and thereby recognizes the attribute.
The handwriting-information generating unit 103 generates handwriting-information indicating characteristics of each character of the handwritten-characters recognized by the character recognizing unit 102 (Step S203, similar to Step S103 of FIG. 3).
The name-field determining unit 104 obtains the name recognized by the character recognizing unit 102, and determines whether or not a name is written in the name-field of the sheet-image (Step S204). Hereinafter, (1) a case where the name-field determining unit 104 determines that a name is written in the name-field of the sheet-image (Step S204, YES) and (2) a case where the name-field determining unit 104 determines that no name is written (typically, name-field is blank) in the name-field of the sheet-image (Step S204, NO) will be described separately.
(1) Where a name is written in the name-field of the sheet-image (Step S204, YES):
The impersonation determining unit 105 determines whether or not the handwriting-information generated by the handwriting-information generating unit 103 is stored in the handwriting-information database 112 in association with the name (name handwritten in name-field) recognized by the character recognizing unit 102 (Step S205).
If the handwriting-information is not associated with the name handwritten in the name-field, somebody may possibly have “impersonated” the person of this name and handwritten the name and answers on this sheet. On the contrary, if the handwriting-information is associated with the name handwritten in the name-field, it is highly likely that the person of this name has handwritten the name and answers on this sheet himself, and there is no “impersonation”.
Where the impersonation determining unit 105 determines that the handwriting-information is stored in the handwriting-information database 112 in association with the name handwritten in the name-field (not likely to be “impersonation”) (Step S206, YES), the impersonation determining unit 105 supplies the handwriting-information generated by the handwriting-information generating unit 103 to the handwriting-information database 112 in association with the name (name handwritten in name-field) recognized by the character recognizing unit 102, to thereby additionally store the handwriting-information and update the handwriting-information database 112 (Step S207). In this way, by additionally storing the handwriting-information in the handwriting-information database 112 and updating the handwriting-information database 112, it is possible to identify a person on a basis of handwriting-information more and more accurately.
Meanwhile, where the impersonation determining unit 105 determines that the handwriting-information is not stored in the handwriting-information database 112 in association with the name handwritten in the name-field (likely to be “impersonation”) (Step S206, NO), the impersonation determining unit 105 determines whether or not a name is stored in the handwriting-information database 112 in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S208).
Where the impersonation determining unit 105 determines that a name is stored in the handwriting-information database 112 in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S209, YES), the impersonation determining unit 105 displays this name (name of a person who may possibly have “impersonated”) on the display device 17 a, and alerts a user (marker, etc.) (Step S210).
Meanwhile, where the impersonation determining unit 105 determines that no name is stored in the handwriting-information database 112 in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S209, NO), the impersonation determining unit 105 displays a message (suspicious person is unidentified) on the display device 17 a, and alerts a user (marker, etc.) (Step S211).
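Put together, this branch (Steps S205 to S211) can be sketched as follows; the sketch assumes the database is a plain mapping from names to lists of stored handwriting-informations and uses simple membership as a stand-in for handwriting matching.

```python
def check_impersonation(name_in_field, sheet_handwriting, database):
    """Sketch of Steps S205-S211 for a sheet whose name-field is filled in."""
    stored_for_name = database.get(name_in_field, [])
    if sheet_handwriting in stored_for_name:                  # Step S206, YES: not likely impersonation
        stored_for_name.append(sheet_handwriting)             # Step S207: additionally store and update
        return "registered under " + name_in_field
    for other_name, infos in database.items():                # Steps S208-S209
        if sheet_handwriting in infos:
            return "alert: possible impersonation by " + other_name   # Step S210
    return "alert: suspicious person is unidentified"          # Step S211

db = {"Sato": ["hw-sato"], "Suzuki": ["hw-suzuki"]}
print(check_impersonation("Sato", "hw-suzuki", db))  # -> alert: possible impersonation by Suzuki
```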
(2) Where no name is written in the name-field of the sheet-image (Step S204, NO):
The writer determining unit 106 extracts one or more names and handwriting-informations in association with a particular attribute from the handwriting-information database 112, and generates a searchable-table (Step S212). The “particular attribute” is the attribute (class, etc.) of a person identified by a name to be written in the name-field (in which no name is written), and is specified on a basis of operations by a user (marker, etc.). In other words, the “searchable-table” is a table indicating the names and handwriting-informations of a plurality of persons who belong to the “particular attribute” (one class, etc.).
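The table generation of Step S212 and the lookup used in the following steps can be sketched as a filter over a name-to-record mapping; the toy data and the equality-based matching below are illustrative assumptions.

```python
def build_searchable_table(database, particular_attribute):
    """Step S212: names and handwriting-informations stored for one attribute."""
    return {name: hw for name, (attr, hw) in database.items() if attr == particular_attribute}

def names_matching(table, sheet_handwriting):
    """Steps S213-S214: names in the table associated with this sheet's handwriting."""
    return [name for name, hw in table.items() if hw == sheet_handwriting]

database = {
    "Sato":      ("class 1-A", "hw-x"),
    "Suzuki":    ("class 1-A", "hw-x"),   # same toy handwriting: two candidates remain
    "Takahashi": ("class 2-B", "hw-y"),
}
table = build_searchable_table(database, "class 1-A")
print(names_matching(table, "hw-x"))  # -> ['Sato', 'Suzuki'] (a plurality of candidates)
print(names_matching(table, "hw-y"))  # -> [] (suspicious person is unidentified)
```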
The writer determining unit 106 determines whether or not the generated searchable-table stores the handwriting-information generated by the handwriting-information generating unit 103 (Step S213). Where the writer determining unit 106 determines that the generated searchable-table does not store the handwriting-information generated by the handwriting-information generating unit 103 (Step S213, NO), the writer determining unit 106 displays a message (suspicious person is unidentified) on the display device 17 a, and alerts a user (marker, etc.) (Step S211).
Meanwhile, where the writer determining unit 106 determines that the generated searchable-table stores the handwriting-information generated by the handwriting-information generating unit 103 (Step S213, YES), the writer determining unit 106 determines whether the generated searchable-table stores a plurality of names or only one name in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S214).
Where the writer determining unit 106 determines that the generated searchable-table stores only one name in association with the handwriting-information generated by the handwriting-information generating unit 103, the writer determining unit 106 selects this one name (Step S214, YES). In this case, the person of the selected name may be highly likely to be a writer. So the writer determining unit 106 supplies the handwriting-information generated by the handwriting-information generating unit 103 to the handwriting-information database 112 in association with the name to thereby additionally store the handwriting-information and update the handwriting-information database 112 (Step S215). In this way, by additionally storing the handwriting-information to the handwriting-information database 112 and updating the handwriting-information database 112, it is possible to identify a person on a basis of handwriting-information more and more accurately. The writer determining unit 106 supplies the selected name (name of person highly likely to be writer) to the image generating unit 107.
The image generating unit 107 obtains the selected name (name of person highly likely to be writer) from the writer determining unit 106. The image generating unit 107 generates a name-image indicating the name selected by the writer determining unit 106. Typically, the “name-image” is an image of a text indicating the name. The image generating unit 107 combines the generated name-image and the sheet-image obtained by the image obtaining unit 101 to thereby generate a combined-image (Step S216). For example, the image generating unit 107 combines the generated name-image and the name-field in the sheet-image obtained by the image obtaining unit 101 to thereby generate a combined-image. In other words, the image generating unit 107 generates a combined-image, in which a name is written in the blank name-field. The image generating unit 107 outputs (prints, saves, displays, sends, etc.) the generated combined-image (Step S217).
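As one possible realization of this combination, the sketch below draws the selected name into the blank name-field region of a copy of the sheet-image; it assumes the Pillow imaging library and a fixed name-field position, neither of which is specified by the disclosure.

```python
from PIL import Image, ImageDraw  # Pillow, assumed available

def generate_combined_image(sheet_image, selected_name, name_field_position=(50, 30)):
    """Draw a name-image (text of the selected name) into the name-field of the sheet-image."""
    combined = sheet_image.copy()                  # keep the obtained sheet-image unchanged
    draw = ImageDraw.Draw(combined)
    draw.text(name_field_position, selected_name, fill="black")  # default font
    return combined

sheet = Image.new("RGB", (600, 800), "white")      # stand-in for a scanned sheet-image
combined = generate_combined_image(sheet, "Suzuki")
combined.save("combined.png")                       # output (save) the combined-image
```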
Meanwhile, where the writer determining unit 106 determines that the generated searchable-table stores a plurality of names in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S214, NO), the writer determining unit 106 suspends identification of a writer, and treats the plurality of names as candidates for a writer (Step S218).
Then, the controller 11 of the MFP 10 executes the process of Steps S201 to S207 for the other sheet-images. The writer determining unit 106 deletes (excludes), from the searchable-table, the names and handwriting-informations that the impersonation determining unit 105 additionally stored in the handwriting-information database 112 for those other sheet-images (Step S207), to thereby update the searchable-table (Step S219).
In other words, the writer determining unit 106 excludes a name and handwriting-information, which cannot be a candidate for a writer, from the searchable-table one by one to thereby narrow down the candidates for a writer. The writer determining unit 106 determines whether or not the updated searchable-table (in which candidates are narrowed down) stores only one name in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S220).
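A minimal sketch of the narrowing-down of Steps S218 to S220 follows; the names found on the other sheet-images are assumed to have been collected while executing Steps S201 to S207 for those sheets.

# Hypothetical sketch of Steps S218-S220: every name that was handwritten in the
# name-field of another sheet-image cannot be the writer of the unnamed sheet,
# so it is excluded from the candidates.
def narrow_candidates(candidates, names_found_on_other_sheets):
    remaining = [n for n in candidates if n not in names_found_on_other_sheets]
    if len(remaining) == 1:
        return remaining[0]   # Step S220, YES: the one non-excluded and remaining name
    return remaining          # []: alert (Step S211); several: ask the user (Step S222)

# narrow_candidates(["Taro", "Hanako", "Jiro"], {"Hanako", "Jiro"})  # -> "Taro"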
Where the writer determining unit 106 determines that the updated searchable-table (in which candidates are narrowed down) stores only one name in association with the handwriting-information generated by the handwriting-information generating unit 103, i.e., there is only one non-excluded and remaining name (Step S220, YES), the writer determining unit 106 selects the one name. In this case, the person of the selected name is highly likely to be the writer. The writer determining unit 106 therefore supplies the selected name (name of person highly likely to be writer) to the image generating unit 107. The image generating unit 107 generates a name-image indicating the name selected by the writer determining unit 106, generates a combined-image (Step S216), and outputs the generated combined-image (Step S217).
Meanwhile, where the writer determining unit 106 determines that the updated searchable-table (in which candidates are narrowed down) stores no name at all in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S220, NO, and Step S221, NO), the writer determining unit 106 displays a message (suspicious person is unidentified) on the display device 17 a, and alerts a user (marker, etc.) (Step S211).
Meanwhile, where the writer determining unit 106 determines that the updated searchable-table (in which candidates are narrowed down) stores a plurality of names in association with the handwriting-information generated by the handwriting-information generating unit 103 (Step S220, NO, and Step S221, YES), the writer determining unit 106 displays the plurality of names as candidates on the display device 17 a, and advises a user (marker, etc.) to specify whether or not a combined-image including any one name is to be generated (Step S222).
Where the writer determining unit 106 determines that a combined-image including a particular name (specified by user) is to be generated on a basis of a particular operation input in the operation device 17 by the user (Step S223, YES), the writer determining unit 106 generates a name-image indicating the name, generates a combined-image (Step S216), and outputs the generated combined-image (Step S217).
4. Modification Examples
In the aforementioned embodiment, the MFP 10 executes all the processes. Instead, an information processing apparatus may obtain sheet-images from an image scanner or an MFP, and may execute all the processes (not shown). The information processing apparatus may be a personal computer used by a user (marker, etc.) and connected to the image scanner or the MFP via an intranet. Alternatively, the information processing apparatus may be a so-called server apparatus connected to the image scanner or the MFP via the Internet. Alternatively, an external server apparatus may store the handwriting-information database 112 in a memory, and an information processing apparatus may obtain the handwriting-information database 112 via a communication device and may execute all the processes (not shown).
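As one way of realizing the last modification example, the sketch below obtains the handwriting-information database from an external server apparatus over a network; the URL and the JSON layout are hypothetical.

# Hypothetical sketch: an information processing apparatus fetches the
# handwriting-information database 112 from an external server apparatus.
import json
import urllib.request

def fetch_handwriting_db(url="http://server.example.com/handwriting_db.json"):
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))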
5. Conclusion
According to the present embodiment, where the name-field determining unit 104 determines that no name is written in a name-field in a sheet-image, the writer determining unit 106 determines a writer on a basis of handwriting-information indicating characteristics of each handwritten-character in the sheet-image generated by the handwriting-information generating unit 103. As a result, even where no name is written in a name-field, the writer determining unit 106 can accurately determine a writer on a basis of handwriting-information. Where there are a plurality of candidates for a writer, the writer determining unit 106 narrows down the candidates within the particular attribute, and can thereby determine a writer accurately.
According to the present embodiment, where the name-field determining unit 104 determines that a name is written in a name-field in a sheet-image, the impersonation determining unit 105 determines, on a basis of the handwriting-information indicating characteristics of each handwritten-character in the sheet-image generated by the handwriting-information generating unit 103, whether somebody has "impersonated" the person of that name and handwritten the name, answers, and the like on the sheet, or whether the person of that name has handwritten the name, answers, and the like by himself. As a result, where a name is written in a name-field, the impersonation determining unit 105 can determine the presence/absence of a possibility of "impersonation" more accurately.
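A minimal sketch of the impersonation determination summarized above follows; as before, the similarity function, its threshold, and the data layout are assumptions.

# Hypothetical sketch: if none of the handwriting-informations stored for the
# name written in the name-field matches the handwriting-information generated
# from this sheet, there is a possibility of "impersonation".
def possibly_impersonated(handwriting_db, written_name, generated_info,
                          similarity, threshold=0.9):
    stored = next((r["handwriting_informations"] for r in handwriting_db
                   if r["name"] == written_name), [])
    return not any(similarity(generated_info, s) >= threshold for s in stored)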
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.

Claims (12)

What is claimed is:
1. An information processing apparatus, comprising:
a processor that operates as
an image obtaining unit that obtains a sheet-image obtained by scanning a sheet including a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field,
a character recognizing unit that recognizes the handwritten-characters in the sheet-image,
a handwriting-information generating unit that generates handwriting-information indicating characteristics of each character of the recognized handwritten-characters,
a name-field determining unit that determines whether or not a name is written in the name-field in the sheet-image,
a writer determining unit that,
where the name-field determining unit determines that no name is written,
extracts, from a database that stores one or more persons' names, the persons' attributes, and handwriting-informations of the persons in association with each other, one or more names and handwriting-informations in association with a particular attribute, and generates a table, and
selects one name stored in the table in association with the generated handwriting-information, and
an image generating unit that
generates a name-image indicating the selected name, and
combines the name-image and the sheet-image to generate a combined-image.
2. The information processing apparatus according to claim 1, wherein the handwriting-information generating unit supplies the generated handwriting-information in association with the selected name to the database.
3. The information processing apparatus according to claim 1, wherein
the image generating unit combines the name-image and the name-field in the sheet-image to generate the combined-image.
4. The information processing apparatus according to claim 1, wherein
the writer determining unit,
where the table stores a plurality of names in association with the generated handwriting-information,
treats the plurality of names as candidates,
excludes a name handwritten in a name-field of another sheet-image from the candidates, and
selects one non-excluded and remaining name.
5. The information processing apparatus according to claim 1, wherein
the processor further operates as
an impersonation determining unit that,
where the name-field determining unit determines that a name is written,
determines whether or not the generated handwriting-information is stored in the database in association with the name handwritten in the name-field, and
where the impersonation determining unit determines that the generated handwriting-information is not stored in the database in association with the name handwritten in the name-field,
extracts a name stored in the database in association with the generated handwriting-information.
6. The information processing apparatus according to claim 5, wherein
the impersonation determining unit,
where the impersonation determining unit determines that the generated handwriting-information is stored in the database in association with the name handwritten in the name-field,
supplies the generated handwriting-information to the database in association with the name handwritten in the name-field.
7. The information processing apparatus according to claim 1, further comprising:
a memory that stores the database.
8. The information processing apparatus according to claim 1, further comprising:
a communication device that obtains the database stored in a memory of an external server apparatus.
9. The information processing apparatus according to claim 1, further comprising:
an image scanner that scans the sheet and obtains the sheet-image.
10. The information processing apparatus according to claim 1, further comprising:
a printer that prints the combined-image.
11. A non-transitory computer readable recording medium that records an information processing program executable by a processor of an information processing apparatus, the information processing program causing the processor of the information processing apparatus to operate as
an image obtaining unit that obtains a sheet-image obtained by scanning a sheet including a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field,
a character recognizing unit that recognizes the handwritten-characters in the sheet-image,
a handwriting-information generating unit that generates handwriting-information indicating characteristics of each character of the recognized handwritten-characters,
a name-field determining unit that determines whether or not a name is written in the name-field in the sheet-image,
a writer determining unit that,
where the name-field determining unit determines that no name is written, extracts, from a database that stores one or more persons' names, the persons' attributes, and handwriting-informations of the persons in association with each other, one or more names and handwriting-informations in association with a particular attribute, and generates a table, and
selects one name stored in the table in association with the generated handwriting-information, and
an image generating unit that
generates a name-image indicating the selected name, and
combines the name-image and the sheet-image to generate a combined-image.
12. An information processing method, comprising:
obtaining a sheet-image obtained by scanning a sheet including a name-field, in which a name is to be handwritten, and handwritten-characters written in an area other than the name-field;
recognizing the handwritten-characters in the sheet-image;
generating handwriting-information indicating characteristics of each character of the recognized handwritten-characters;
determining whether or not a name is written in the name-field in the sheet-image;
where determining that no name is written,
extracting, from a database that stores one or more persons' names, the persons' attributes, and handwriting-informations of the persons in association with each other, one or more names and handwriting-informations in association with a particular attribute, and generating a table;
selecting one name stored in the table in association with the generated handwriting-information;
generating a name-image indicating the selected name; and
combining the name-image and the sheet-image to generate a combined-image.
US15/976,316 2017-05-15 2018-05-10 Information processing apparatus, non-transitory computer readable recording medium, and information processing method Expired - Fee Related US10607071B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-096180 2017-05-15
JP2017096180A JP6729486B2 (en) 2017-05-15 2017-05-15 Information processing apparatus, information processing program, and information processing method

Publications (2)

Publication Number Publication Date
US20180330155A1 US20180330155A1 (en) 2018-11-15
US10607071B2 true US10607071B2 (en) 2020-03-31

Family

ID=64096643

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/976,316 Expired - Fee Related US10607071B2 (en) 2017-05-15 2018-05-10 Information processing apparatus, non-transitory computer readable recording medium, and information processing method

Country Status (3)

Country Link
US (1) US10607071B2 (en)
JP (1) JP6729486B2 (en)
CN (1) CN108875570B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6870137B1 (en) * 2020-04-06 2021-05-12 株式会社Alconta Data utilization system, data utilization method and program
KR20220169231A (en) * 2021-06-18 2022-12-27 휴렛-팩커드 디벨롭먼트 컴퍼니, 엘.피. Generating file of distinct writer based on handwriting text

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000057415A (en) * 1998-08-11 2000-02-25 Hitachi Ltd Automatic transaction device
JP4280355B2 (en) * 1999-05-06 2009-06-17 富士通株式会社 Character recognition device
JP4807486B2 (en) * 2005-02-23 2011-11-02 富士ゼロックス株式会社 Teaching material processing apparatus, teaching material processing method, and teaching material processing program
JP4861868B2 (en) * 2007-03-19 2012-01-25 株式会社リコー Image processing apparatus, image processing method, image processing program, and recording medium
CN101276412A (en) * 2007-03-30 2008-10-01 夏普株式会社 Information processing system, device and method
CN101872344A (en) * 2009-04-27 2010-10-27 上海百测电气有限公司 Control method for image scanning
JP2013109690A (en) * 2011-11-24 2013-06-06 Oki Electric Ind Co Ltd Business form data input device, and business form data input method
CN102663124A (en) * 2012-04-20 2012-09-12 上海合合信息科技发展有限公司 Method and system for managing contact person information on mobile devices
CN103020619B (en) * 2012-12-05 2016-04-20 上海合合信息科技发展有限公司 A kind of method of handwritten entries in automatic segmentation electronization notebook
JP5795353B2 (en) * 2013-05-31 2015-10-14 京セラドキュメントソリューションズ株式会社 Image forming apparatus, image forming system, and image forming method
JP6000992B2 (en) * 2014-01-24 2016-10-05 京セラドキュメントソリューションズ株式会社 Document file generation apparatus and document file generation method
US9524435B2 (en) * 2015-03-20 2016-12-20 Google Inc. Detecting the location of a mobile device based on semantic indicators

Patent Citations (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0646217A (en) 1992-07-22 1994-02-18 Ricoh Co Ltd Facsimile equipment
US6668354B1 (en) * 1999-01-05 2003-12-23 International Business Machines Corporation Automatic display script and style sheet generation
US7802184B1 (en) * 1999-09-28 2010-09-21 Cloanto Corporation Method and apparatus for processing text and character data
US7236653B2 (en) * 2003-03-27 2007-06-26 Sharp Laboratories Of America, Inc. System and method for locating document areas using markup symbols
US20080181501A1 (en) * 2004-07-30 2008-07-31 Hewlett-Packard Development Company, L.P. Methods, Apparatus and Software for Validating Entries Made on a Form
US8542953B2 (en) * 2004-10-04 2013-09-24 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US20060197928A1 (en) * 2005-03-01 2006-09-07 Canon Kabushiki Kaisha Image processing apparatus and its method
US20070081179A1 (en) * 2005-10-07 2007-04-12 Hirobumi Nishida Image processing device, image processing method, and computer program product
US20100179962A1 (en) * 2005-12-15 2010-07-15 Simpliance, Inc. Methods and Systems for Intelligent Form-Filling and Electronic Document Generation
JP2008020506A (en) 2006-07-11 2008-01-31 Fuji Xerox Co Ltd Image processor and image processing program
US8189920B2 (en) * 2007-01-17 2012-05-29 Kabushiki Kaisha Toshiba Image processing system, image processing method, and image processing program
US20090138284A1 (en) * 2007-11-14 2009-05-28 Hybrid Medical Record Systems, Inc. Integrated Record System and Method
US20100228693A1 (en) * 2009-03-06 2010-09-09 phiScape AG Method and system for generating a document representation
US9307109B2 (en) * 2009-09-04 2016-04-05 Ricoh Company, Ltd. Image processing apparatus, image processing system, and image processing method
US20140049788A1 (en) * 2009-09-04 2014-02-20 Hirohisa Inamoto Image processing apparatus, image processing system, and image processing method
US9390089B2 (en) * 2009-12-17 2016-07-12 Wausau Financial Systems, Inc. Distributed capture system for use with a legacy enterprise content management system
US20110271173A1 (en) * 2010-05-03 2011-11-03 Xerox Corporation Method and apparatus for automatic filling of forms with data
US20120087537A1 (en) * 2010-10-12 2012-04-12 Lisong Liu System and methods for reading and managing business card information
US20130215474A1 (en) * 2011-11-04 2013-08-22 Document Security Systems, Inc. System and Method for Printing Documents Containing Dynamically Generated Security Features
US9922400B2 (en) * 2012-03-12 2018-03-20 Canon Kabushiki Kaisha Image display apparatus and image display method
US20140146200A1 (en) * 2012-11-28 2014-05-29 Research In Motion Limited Entries to an electronic calendar
US8958644B2 (en) * 2013-02-28 2015-02-17 Ricoh Co., Ltd. Creating tables with handwriting images, symbolic representations and media images from forms
US9298685B2 (en) * 2013-02-28 2016-03-29 Ricoh Company, Ltd. Automatic creation of multiple rows in a table
US10013624B2 (en) * 2013-03-15 2018-07-03 A9.Com, Inc. Text entity recognition
US20160092729A1 (en) * 2014-09-29 2016-03-31 Kabushiki Kaisha Toshiba Information processing device, information processing method, and computer program product
US20160098596A1 (en) * 2014-10-03 2016-04-07 Xerox Corporation Methods and systems for processing documents
US20180301222A1 (en) * 2014-11-03 2018-10-18 Automated Clinical Guidelines, Llc Method and platform/system for creating a web-based form that incorporates an embedded knowledge base, wherein the form provides automatic feedback to a user during and following completion of the form
US20180302227A1 (en) * 2015-04-30 2018-10-18 Bundesdruckerei Gmbh Method for generating an electronic signature
JP2016225699A (en) 2015-05-27 2016-12-28 京セラドキュメントソリューションズ株式会社 Image forming apparatus and image forming program
US20170163828A1 (en) * 2015-12-08 2017-06-08 Kyocera Document Solutions Inc. Image reader and image forming apparatus determining direction of document to be read
US9418315B1 (en) * 2016-03-14 2016-08-16 Sageworks, Inc. Systems, methods, and computer readable media for extracting data from portable document format (PDF) files
US20180035007A1 (en) * 2016-07-28 2018-02-01 Kyocera Document Solutions Inc. Image forming apparatus, storage medium, and method for digitizing document
US20180046708A1 (en) * 2016-08-11 2018-02-15 International Business Machines Corporation System and Method for Automatic Detection and Clustering of Articles Using Multimedia Information
US20190026579A1 (en) * 2017-07-24 2019-01-24 Bank Of America Corporation System for dynamic optical character recognition tuning
US20190188251A1 (en) * 2017-12-14 2019-06-20 International Business Machines Corporation Cognitive auto-fill content recommendation
US20190303662A1 (en) * 2018-03-29 2019-10-03 Fmr Llc Recognition of handwritten characters in digital images using context-based machine learning
US20190303665A1 (en) * 2018-03-30 2019-10-03 AO Kaspersky Lab System and method of identifying an image containing an identification document
US20190311210A1 (en) * 2018-04-05 2019-10-10 Walmart Apollo, Llc Automated extraction of product attributes from images
US20190347480A1 (en) * 2018-05-11 2019-11-14 Kyocera Document Solutions Inc. Image processing apparatus and method for controlling image processing apparatus

Also Published As

Publication number Publication date
CN108875570B (en) 2022-04-19
US20180330155A1 (en) 2018-11-15
JP2018195898A (en) 2018-12-06
CN108875570A (en) 2018-11-23
JP6729486B2 (en) 2020-07-22

Similar Documents

Publication Publication Date Title
US11574489B2 (en) Image processing system, image processing method, and storage medium
US8131081B2 (en) Image processing apparatus, and computer program product
US11042733B2 (en) Information processing apparatus for text recognition, non-transitory computer readable medium for text recognition process and information processing method for text recognition
US20060285748A1 (en) Document processing device
US11341733B2 (en) Method and system for training and using a neural network for image-processing
US9148532B2 (en) Automated user preferences for a document processing unit
US11418658B2 (en) Image processing apparatus, image processing system, image processing method, and storage medium
US20200104586A1 (en) Method and system for manual editing of character recognition results
US20090307264A1 (en) Object acquisition device, object management system, and object management method
US10503993B2 (en) Image processing apparatus
US10607071B2 (en) Information processing apparatus, non-transitory computer readable recording medium, and information processing method
JP2013196479A (en) Information processing system, information processing program, and information processing method
JP6187063B2 (en) Information processing apparatus, information processing system, information processing method, and program
US20150261735A1 (en) Document processing system, document processing apparatus, and document processing method
US20210089804A1 (en) Information processing apparatus and non-transitory computer readable medium
US9152885B2 (en) Image processing apparatus that groups objects within image
US11170253B2 (en) Information processing apparatus and non-transitory computer readable medium
US20200192610A1 (en) Computer-readable storage medium storing a program and input format setting method
JP2017021654A (en) Document management server and system
US9965709B2 (en) Non-transitory computer readable recording medium that records a program for causing a computer of an information processing apparatus to generate printable data utilizing cached commands, and information processing apparatus that generates printable data
JP6561876B2 (en) Information processing apparatus and program
US11659106B2 (en) Information processing apparatus, non-transitory computer readable medium, and character recognition system
US10659654B2 (en) Information processing apparatus for generating an image surrounded by a marking on a document, and non-transitory computer readable recording medium that records an information processing program for generating an image surrounded by a marking on a document
US20220301326A1 (en) Ocr target area position acquisition system, computer-readable non-transitory recording medium storing ocr target area position acquisition program, hard copy, hard copy generation system, and computer-readable non-transitory recording medium storing hard copy generation program
US12100231B2 (en) Information processing apparatus and non-transitory computer readable medium storing program

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: KYOCERA DOCUMENT SOLUTIONS, INC., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HIRATSUKA, MOTOKI;REEL/FRAME:047817/0065

Effective date: 20180508

ZAAA Notice of allowance and fees due

Free format text: ORIGINAL CODE: NOA

ZAAB Notice of allowance mailed

Free format text: ORIGINAL CODE: MN/=.

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20240331