WO2018116422A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2018116422A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
character
character information
image
control unit
Prior art date
Application number
PCT/JP2016/088185
Other languages
French (fr)
Japanese (ja)
Inventor
順平 大木
Original Assignee
サン電子株式会社
Priority date
Filing date
Publication date
Application filed by サン電子株式会社
Priority to PCT/JP2016/088185
Publication of WO2018116422A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/98Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns

Definitions

  • the technology disclosed in this specification relates to an information processing apparatus that executes processing of character information.
  • Japanese Unexamined Patent Application Publication No. 2007-108807 (Patent Document 1) discloses a system including a delivery terminal carried by a delivery person and a Web server that can communicate with the delivery terminal.
  • When the recipient is absent and the delivery person inputs recipient identification information to the delivery terminal, the delivery terminal transmits the inputted recipient identification information to the Web server.
  • The Web server uses the recipient identification information to identify the recipient's e-mail address registered in the Web server, and transmits absence notification information to the identified e-mail address.
  • When a delivery person delivers a specific package such as a credit card, the delivery person must comply with specific laws related to that package. In accordance with those laws, the delivery person asks the recipient to present an identification card, performs identity verification to confirm that the recipient is the person who should receive the package, and hands over the specific package only when the recipient's identity is confirmed. At that time, the delivery person records the character information described in the identification card presented by the recipient.
  • This specification discloses a technique that can appropriately notify the user that the input character information does not match the character information described in the document.
  • the information processing apparatus disclosed in this specification includes an image forming unit, an input unit, a notification unit, and a control unit.
  • The control unit includes: an image acquisition unit that acquires image data indicating an image which is formed by the image forming unit and which represents a document in which first character information is described; an extraction unit that extracts, by performing character recognition processing on the image represented by the acquired image data, first character data representing the first character information described in the document in the image; an input data acquisition unit that acquires second character data representing second character information input to the input unit; and a notification control unit that causes the notification unit to perform a notification operation in a specific case where the first character information represented by the first character data and the second character information represented by the second character data do not match.
  • In this specific case, the notification operation is executed. By the notification operation being performed, the user of the information processing apparatus can know that the second character information does not match the first character information. Therefore, the information processing apparatus can appropriately notify the user that the input character information does not match the character information described in the document.
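The following is a minimal, non-authoritative Python sketch of how the four units named above could be composed; all class, method, and parameter names are illustrative assumptions, not terms defined by this publication.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ControlUnit:
    """Illustrative grouping of the four units described above (assumed names)."""
    form_image: Callable[[], bytes]    # image forming unit (camera or scanner)
    read_input: Callable[[], str]      # input unit (keyboard or touch panel)
    notify: Callable[[str], None]      # notification unit (display or speaker)
    recognize: Callable[[bytes], str]  # character recognition processing (OCR)

    def acquire_image_data(self) -> bytes:
        # image acquisition unit: image data showing the document
        return self.form_image()

    def extract_first_character_data(self, image_data: bytes) -> str:
        # extraction unit: recognize the document image as first character data
        return self.recognize(image_data)

    def acquire_second_character_data(self) -> str:
        # input data acquisition unit: character data typed by the user
        return self.read_input()

    def check(self) -> None:
        first = self.extract_first_character_data(self.acquire_image_data())
        second = self.acquire_second_character_data()
        if first != second:
            # notification control unit: notify only in the mismatch case
            self.notify("Input does not match the document")
```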
  • the “image forming unit” includes a camera, a scanner, and the like.
  • the “input unit” includes a keyboard, a touch panel, and the like.
  • the “notification unit” includes a display, a speaker, and the like, and the “notification operation” includes an operation for displaying a warning screen on the display, an operation for outputting a warning sound from the speaker, and the like.
  • The above-mentioned “document” includes identification cards such as a driver's license, a membership card, and a passport, as well as various tickets, and the “first character information” includes a unique number written on the identification card, ticket, or the like.
  • the “information processing apparatus” may be a system including a terminal device and a server capable of communicating with the terminal device, or may be a single terminal device.
  • The “information processing apparatus” is not limited to an apparatus used in a situation where a specific package is delivered, and may be an apparatus used in a situation where a document such as an identification card or a ticket is presented by a third party.
  • The control unit may further include a storage control unit that causes a storage device to store the second character data when the first character information represented by the first character data matches the second character information represented by the second character data.
  • the “storage device” is a device such as a semiconductor memory or a hard disk drive.
  • the device may be provided in the terminal device, or may be connected to the terminal device via a network (for example, a LAN (abbreviation of Local Area Network), the Internet, etc.).
  • The “storage device” may be either a volatile storage device or a nonvolatile storage device. According to this configuration, when the first character information and the second character information match, the second character data can be stored in the storage device.
  • the information processing apparatus may include a terminal device and a server that can communicate with the terminal device.
  • The terminal device may include the image forming unit, the input unit, the notification unit, and a terminal control unit.
  • The terminal control unit may include the image acquisition unit, the extraction unit, the input data acquisition unit, and the notification control unit.
  • The extraction unit may further transmit the extracted first character data to the server.
  • The input data acquisition unit may further transmit the acquired second character data to the server.
  • The notification control unit may cause the notification unit to perform the notification operation in a specific case where the first character information and the second character information do not match and a notification instruction is received from the server.
  • The server may include a server control unit. The server control unit may include: a first reception unit that receives the first character data from the terminal device; a second reception unit that receives the second character data from the terminal device; a determination unit that determines whether or not the first character information represented by the received first character data matches the second character information represented by the received second character data; and a transmission unit that transmits the notification instruction to the terminal device when the determination unit determines that the first character information and the second character information do not match.
  • the “server” is communicably connected to the terminal device via a network (for example, LAN, Internet, etc.).
  • According to this configuration, the terminal device transmits the extracted first character data to the server.
  • the communication load between the terminal device and the server can be reduced as compared with other configurations in which the terminal device transmits image data to the server and causes the server to execute character recognition processing.
  • the “server” may be configured by a single device or may be configured by a plurality of devices that can communicate with each other.
  • Generally speaking, the server may have any configuration as long as it can receive the first character data from the terminal device, receive the second character data from the terminal device, determine whether or not the first character information represented by the received first character data matches the second character information represented by the received second character data, and transmit the notification instruction to the terminal device when it determines that the first character information and the second character information do not match.
  • The terminal device may further include a frame that can be mounted on the user's head, and the image forming unit may include a camera that is mounted on the frame and can capture a range corresponding to the field of view of the user wearing the frame.
  • the terminal device can acquire image data indicating an image representing the document.
  • A control method, a computer program, and a computer-readable recording medium storing the computer program for realizing the above information processing apparatus are also novel and useful.
  • FIG. 1 shows a schematic diagram of the information processing system.
  • FIG. 2 shows the configuration of the information processing system.
  • FIG. 3 shows a flowchart of the device process executed by the control unit of the image display device.
  • FIG. 4 shows a flowchart of the server process executed by the control unit of the server.
  • FIG. 5 shows a case A1 in which the character information does not match and a case A2 in which the character information matches.
  • An information processing system 2 shown in FIG. 1 is a system for processing and managing various information input when a delivery person (that is, a user of the information processing system 2) D1 delivers a credit card to a customer C1.
  • the information processing system 2 includes an image display device 4, a keyboard 8, and a server 100.
  • the image display device 4 and the server 100 can communicate with each other via the Internet 60.
  • the image display device 4 and the keyboard 8 can communicate with each other by short-range wireless communication.
  • the image display device 4 shown in FIG. 1 is a so-called head mounted display that is used by being attached to the head H1 of the delivery person D1.
  • The image display device 4 includes a support 6, display units 10a and 10b, projection units 11a and 11b, a first camera 12, a second camera 14, and a control box 16.
  • the support 6 is a spectacle frame-shaped member.
  • the delivery person D1 can wear the image display device 4 on the head by wearing the support 6 so as to wear glasses.
  • The display units 10a and 10b are each a translucent display member.
  • When the delivery person D1 wears the image display device 4 on the head, the display unit 10a is arranged at a position facing the right eye of the delivery person D1, and the display unit 10b is arranged at a position facing the left eye.
  • the left and right display units 10a and 10b may be collectively referred to as the display unit 10.
  • the delivery person D1 can visually recognize the surroundings through the display unit 10.
  • Projection units 11a and 11b are members that project images onto the display units 10a and 10b.
  • the projection units 11a and 11b are provided on the side portions of the display units 10a and 10b.
  • the left and right projection units 11a and 11b may be collectively referred to as the projection unit 11.
  • the projection unit 11 projects a predetermined object image on the display unit 10 in accordance with an instruction from the control unit 30.
  • As a result, the delivery person D1 can see the object image as if it were combined, at a predetermined position, with the real-world objects and/or the space that the delivery person D1 can visually recognize through the display unit 10.
  • Hereinafter, when describing that the control unit 30 displays a desired screen on the display unit 10 by instructing the projection unit 11 to project an image, the description of the operation of the projection unit 11 may be omitted and the operation may be expressed simply as “the control unit 30 causes the display unit 10 to display a desired image”.
  • the first camera 12 is a camera disposed in the support 6 at a position above the display unit 10a (that is, a position corresponding to the right eye of the delivery person D1).
  • The second camera 14 is a camera disposed in the support 6 at a position above the display unit 10b (that is, a position corresponding to the left eye of the delivery person D1).
  • the first camera 12 captures a range corresponding to the visual field range of the user's right eye.
  • the second camera 14 captures a range corresponding to the visual field range of the user's left eye.
  • the control box 16 is attached to a part of the support 6.
  • The control box 16 houses the elements that constitute the control system of the image display device 4.
  • Specifically, the control box 16 houses a network interface 20, a short-range communication interface 22, a control unit 30, and a memory 32.
  • the interface is described as “I / F”.
  • the control box 16 may be provided separately from the support 6.
  • In that case, the components in the control box 16 (the network I/F 20, the control unit 30, and the memory 32) and the components provided on the support 6 (the projection unit 11 and the cameras 12 and 14) only need to be electrically connected by cables or the like.
  • Instead of cables, the components in the control box 16 (the network I/F 20, the control unit 30, and the memory 32) and the components provided on the support 6 (the projection unit 11 and the cameras 12 and 14) may communicate by wireless communication.
  • the network I / F 20 is an I / F for executing wireless communication with a device (for example, the server 100) on the Internet 60.
  • the short-range communication I / F 22 is an I / F for performing short-range communication with an external device (for example, the keyboard 8) located around the image display device 4.
  • the short-range communication is, for example, Bluetooth communication conforming to the Bluetooth (registered trademark of Bluetooth SIG) standard.
  • The control unit 30 executes various processes according to the main program 34 stored in the memory 32. Details of the processing executed by the control unit 30 will be described later. As shown in FIG. 2, the control unit 30 is electrically connected to the display unit 10, the projection unit 11, the first camera 12, the second camera 14, the network I/F 20, the short-range communication I/F 22, and the memory 32, and can control the operation of each of these elements.
  • the memory 32 stores a character recognition program 36 in addition to the main program 34.
  • The character recognition program 36 is a program for executing character recognition processing (so-called OCR, Optical Character Recognition) on images formed by the cameras 12 and 14.
  • the character recognition process is a process of analyzing an image, detecting a pattern of character information included in the image, and extracting the character information as character data (for example, a character code such as Unicode).
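As an illustration of such character recognition processing, here is a minimal sketch that assumes the Tesseract engine via the pytesseract and Pillow packages; the publication does not name any particular OCR implementation, and the digit whitelist is an assumption for a numeric identification number.

```python
from PIL import Image          # Pillow
import pytesseract             # wrapper around the Tesseract OCR engine (assumed choice)

def extract_character_data(image_path: str) -> str:
    """Analyze an image, detect the character pattern it contains, and return
    the characters as Unicode text (i.e., character data)."""
    image = Image.open(image_path)
    # Single-line page segmentation with a digit whitelist, assuming a numeric ID number.
    config = "--psm 7 -c tessedit_char_whitelist=0123456789"
    return pytesseract.image_to_string(image, config=config).strip()
```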
  • a keyboard 8 shown in FIG. 1 includes a plurality of keys for inputting instructions, character information, and the like to the image display device 4.
  • the keyboard 8 may be a keyboard and a touch panel mounted on a portable terminal such as a tablet terminal or a smartphone.
  • the keyboard 8 may be attached to the support 6 of the image display device 4.
  • the server 100 shown in FIG. 1 is installed on the Internet 60 by the vendor of the image display device 4.
  • the server 100 includes a control unit 130 and a memory 132.
  • the control unit 130 executes various processes according to the program 134 stored in the memory 132.
  • the memory 132 also has an area for storing various information generated in accordance with the processing of the control unit 130 (for example, server processing in FIG. 4 described later).
  • When delivering the credit card (not shown), the delivery person D1 requests the customer C1 to present the identification card 50 in order to confirm that the customer C1 is the person who should receive the credit card.
  • the delivery person D1 inputs the identification character information 54 described in the predetermined frame 52 of the presented identification card 50 to the keyboard 8.
  • the image display device 4 transmits input character data representing input character information to the server 100 via the Internet 60.
  • The cameras 12 and 14 photograph the identification card 50.
  • the image display device 4 acquires an image of the identification card 50 photographed by the cameras 12 and 14.
  • the image display device 4 performs character recognition processing on the photographed image of the identification card 50 and extracts recognized character data representing the identification character information 54 described in the identification card 50.
  • the image display device 4 transmits the recognized character data to the server 100 via the Internet 60.
  • the server 100 determines whether or not the input character data matches the recognized character data. If the input character data does not match the recognized character data, the server 100 transmits a warning instruction for displaying a warning screen on the display unit 10 of the image display device 4 to the image display device 4. Thereby, the delivery person D1 can know that the character information different from the identification character information 54 has been inputted to the keyboard 8 by looking at the warning screen. On the other hand, when the input character data matches the recognized character data, the server 100 stores the input character data.
  • the control unit 30 monitors whether a predetermined input start instruction is input to the keyboard 8 via the short-range communication I / F 22.
  • the input start instruction is an instruction for starting input of character information.
  • the delivery person D1 can input an input start instruction to the keyboard 8.
  • the control unit 30 determines YES in S10 and proceeds to S12.
  • the control unit 30 causes the display unit 10 to display a predetermined character display screen.
  • the character display screen is a screen for displaying character information input to the keyboard 8 (hereinafter referred to as “input character information”).
  • the delivery person D1 inputs character information (that is, identification character information 54 described on the presented identification card 50) to the keyboard 8 while looking at the character display screen.
  • The control unit 30 acquires, from the keyboard 8 via the short-range communication I/F 22, input character data representing the input character information input to the keyboard 8.
  • the control unit 30 acquires image data indicating an image representing the identification card 50.
  • The delivery person D1 inputs character information while looking at the identification card 50 through the display unit 10.
  • the cameras 12 and 14 take an image of the identification card 50 existing in the field of view of the delivery person D1.
  • the control unit 30 acquires image data representing an image of the identification card 50 photographed by the cameras 12 and 14.
  • The control unit 30 uses the character recognition program 36 to execute character recognition processing on the image represented by the acquired image data. Specifically, the control unit 30 detects, in the image, the predetermined frame 52 arranged at a predetermined position according to the predetermined format of the identification card 50, and performs character recognition processing on the identification character information 54 in the predetermined frame 52. The control unit 30 thereby extracts recognized character data representing the identification character information 54 described in the identification card 50.
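A hedged sketch of this step follows: because the identification card has a predetermined format, the predetermined frame 52 can be located at fixed relative coordinates and only that region passed to OCR. The coordinates, the library choice (Pillow/pytesseract), and the digit whitelist are assumptions for illustration only.

```python
from PIL import Image
import pytesseract

# Hypothetical position of the predetermined frame 52, as fractions of the
# card image size; the actual format of the identification card is not given.
FRAME_BOX = (0.10, 0.55, 0.90, 0.70)   # (left, top, right, bottom)

def extract_identification_characters(card_image: Image.Image) -> str:
    """Crop the predetermined frame from the card image and recognize only
    the identification character information inside it."""
    width, height = card_image.size
    left, top, right, bottom = (
        int(frac * size)
        for frac, size in zip(FRAME_BOX, (width, height, width, height))
    )
    frame = card_image.crop((left, top, right, bottom))
    config = "--psm 7 -c tessedit_char_whitelist=0123456789"
    return pytesseract.image_to_string(frame, config=config).strip()
```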
  • The control unit 30 transmits both the input character data acquired in S14 and the recognized character data extracted in S18 to the server 100 via the network I/F 20.
  • the control unit 30 determines whether or not a warning instruction is received from the server 100.
  • the warning instruction is an instruction for causing the display unit 10 to display a warning screen indicating that the input character information represented by the input character data does not match the identification character information 54 represented by the recognized character data.
  • The warning instruction is transmitted from the server 100 to the image display device 4 when the server 100 determines that the input character information represented by the input character data does not match the identification character information 54 represented by the recognized character data.
  • the control unit 30 determines YES in S22, and proceeds to S24.
  • the control unit 30 causes the display unit 10 to display a predetermined warning screen according to the warning instruction. When S24 ends, the control unit 30 returns to the monitoring of S10.
  • the OK instruction is an instruction for causing the display unit 10 to display an OK screen indicating that the input character information represented by the input character data matches the identification character information 54 represented by the recognized character data.
  • the OK instruction is transmitted from the server 100 to the image display device 4 when the server 100 determines that the input character information represented by the input character data matches the identification character information 54 represented by the recognized character data.
  • the control unit 30 displays a predetermined OK screen on the display unit 10 according to the OK instruction.
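To summarize the device process of S10 to S26, here is a non-authoritative Python sketch; the keyboard, cameras, display, server, and ocr objects and their methods are assumed interfaces standing in for the hardware described above, not APIs defined by this publication.

```python
def device_process(keyboard, cameras, display, server, ocr):
    """Sketch of the device process of FIG. 3 (S10-S26), under assumed interfaces."""
    while True:
        keyboard.wait_for_input_start_instruction()   # S10: monitor the input start instruction
        display.show("character display screen")      # S12
        input_chars = keyboard.read_characters()      # S14: input character data
        card_image = cameras.capture()                # S16: image data of the ID card
        recognized_chars = ocr(card_image)            # S18: recognized character data
        server.send(input_chars, recognized_chars)    # S20: send both to the server
        if server.warning_received():                 # S22: warning instruction received?
            display.show("warning screen")            # S24
        else:
            display.show("OK screen")                 # S26
```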
  • The control unit 130 monitors reception of the input character data and the recognized character data from the image display device 4.
  • Upon receiving them, the control unit 130 determines YES in S50 and proceeds to S52.
  • In S52, the control unit 130 compares the input character information represented by the input character data with the identification character information 54 represented by the recognized character data and determines whether or not they match. If they match, the control unit 130 determines YES in S52 and proceeds to S54; if they do not match, the control unit 130 determines NO in S52 and proceeds to S58.
  • In S54, the control unit 130 stores the input character data representing the input character information in a predetermined storage area of the memory 132.
  • the control unit 130 transmits a predetermined OK instruction to the image display device 4.
  • the OK instruction is an instruction for displaying a predetermined OK screen on the display unit 10 of the image display device 4.
  • the control unit 130 returns to the monitoring of S50.
  • the control unit 130 transmits a predetermined warning instruction to the image display device 4.
  • the warning instruction is an instruction for displaying a predetermined warning screen on the display unit 10 of the image display device 4.
  • the control unit 130 returns to the monitoring of S50.
  • the control unit 130 transmits the input character data stored in the predetermined storage area in S54 to the service server (not shown) of the credit card company at a predetermined timing.
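The server process of S50 to S58, together with the later forwarding step, could look like the following sketch; the connection, storage, and service_server objects are assumed interfaces, and the "predetermined timing" of forwarding is reduced to a simple batch loop for illustration.

```python
def server_process(connection, storage):
    """Sketch of the server process of FIG. 4 (S50-S58), under assumed interfaces."""
    while True:
        input_chars, recognized_chars = connection.receive()   # S50: wait for both data
        if input_chars == recognized_chars:                    # S52: compare
            storage.append(input_chars)                        # S54: store the input character data
            connection.send("OK instruction")                  # S56
        else:
            connection.send("warning instruction")             # S58

def forward_to_service_server(storage, service_server):
    """At a predetermined timing, forward the stored input character data to the
    credit card company's service server (the timing policy is an assumption)."""
    for input_chars in storage:
        service_server.send(input_chars)
```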
  • When ordering the credit card, the customer C1 transmits the identification character information 54 of the identification card 50 to the credit card company.
  • the credit card company registers the identification character information 54 transmitted from the customer C1 in the service server.
  • When the service server receives character data representing character information that matches the registered identification character information 54, the service server notifies the credit card company that the customer C1 who received the delivery is the person who ordered the credit card.
  • Case A1 is a case where character information (that is, input character information and identification character information 54) does not match.
  • In T100, the image display device 4 accepts an input start instruction (YES in S10 of FIG. 3) and then accepts, via the keyboard 8, input of the input character information "113456789", which differs from the identification character information "123456789" of the identification card 50.
  • the image display device 4 acquires input character data representing the input character information “113456789” from the keyboard 8 (S14).
  • the image display device 4 photographs the identification card 50.
  • the image display device 4 acquires image data indicating an image representing the identification card 50 photographed by the cameras 12 and 14 (S16).
  • The image display device 4 executes character recognition processing on the image represented by the acquired image data and extracts recognized character data representing the identification character information "123456789" described in the identification card 50 (S18).
  • the image display device 4 transmits both the input character data acquired at T102 and the recognized character data extracted at T114 to the server 100 (S20).
  • The server 100 determines that the input character information "113456789" represented by the input character data does not match the identification character information "123456789" represented by the recognized character data (NO in S52 of FIG. 4). In subsequent T134, the server 100 transmits a warning instruction to the image display device 4 (S58).
  • the image display device 4 displays a warning screen on the display unit 10 in accordance with the warning instruction received from the server 100 (YES in S22 of FIG. 3) (S24).
  • the delivery person D1 can know that the input character information (T100) input by the delivery person D1 does not match the identification character information 54 by looking at the warning screen displayed on the display unit 10.
  • Case A2 is a case where character information (that is, input character information and identification character information 54) matches.
  • In T200, the image display device 4 accepts an input start instruction (YES in S10 of FIG. 3) and then accepts, via the keyboard 8, input of the input character information "123456789", which is the same as the identification character information "123456789" of the identification card 50.
  • the image display device 4 acquires input character data representing the input character information “123456789” from the keyboard 8 (S14).
  • T210 to T220 are the same as T110 to T120.
  • The server 100 determines that the input character information "123456789" represented by the input character data matches the identification character information "123456789" represented by the recognized character data (YES in S52 of FIG. 4).
  • The server 100 stores the input character data in a predetermined storage area of the memory 132 (S54).
  • the server 100 transmits an OK instruction to the image display device 4 (S56).
  • the image display device 4 displays an OK screen on the display unit 10 in accordance with the OK instruction received from the server 100 (NO in S22 of FIG. 3) (S26).
  • the delivery person D1 can know that the input character information (T200) input by the delivery person D1 matches the identification character information 54 by looking at the OK screen displayed on the display unit 10.
  • The server 100 transmits the input character data stored in the predetermined storage area of the memory 132 to the service server of the credit card company.
  • As described above, in the present embodiment, a warning screen is displayed on the display unit 10 of the image display device 4 (T140) when the identification character information 54 represented by the recognized character data extracted from the image by the character recognition processing does not match the input character information input to the keyboard 8. Accordingly, by looking at the warning screen, the delivery person D1 can know that the input character information (T100) that the delivery person D1 input on the keyboard 8 does not match the identification character information 54 described in the identification card 50.
  • Further, the information processing system 2 stores the input character data representing the input character information in the memory 132 of the server 100 (T232) when the identification character information 54 matches the input character information (T230).
  • the information processing system 2 can store correct character data indicating the identification character information 54.
  • In the present embodiment, the image display device 4 executes the character recognition processing (S18 in FIG. 3) and transmits the recognized character data extracted by the character recognition processing to the server 100 (S20). Therefore, compared with another configuration in which the image display device transmits image data indicating an image captured by the camera to the server 100 and causes the server 100 to perform the character recognition processing, the communication load between the image display device 4 and the server 100 can be reduced.
  • In the present embodiment, the image display device 4 is a head mounted display that can be mounted on the user's head. Therefore, if the delivery person D1 looks at the identification card 50 while wearing the image display device 4 (that is, brings it into the field of view), the image display device 4 can acquire image data indicating an image representing the identification card 50 (S16 in FIG. 3).
  • the information processing system 2 is an example of an “information processing apparatus”.
  • a combination of the image display device 4 and the keyboard 8 is an example of a “terminal device”.
  • the cameras 12 and 14, the keyboard 8, and the display unit 10 are examples of “image forming unit”, “input unit”, and “notification unit”, respectively.
  • the identification card 50, the identification character information 54, and the recognized character data are examples of “document”, “first character information”, and “first character data”, respectively.
  • the input character information and the input character data are examples of “second character information” and “second character data”, respectively.
  • Displaying a warning screen on the display unit 10 (S24 in FIG. 3) is an example of “notification operation”.
  • the memory 132 of the server 100 is an example of a “storage device”.
  • the warning instruction in S40 of FIG. 3 is an example of “notification instruction”.
  • The server 100 may transmit the input character data to the service server of the credit card company without storing the input character data in the memory 132 in S54 of FIG. 4.
  • the “storage control unit” can be omitted.
  • In another modification, the information processing system 2 may include only the image display device 4 and need not include the server 100. That is, in this modification, the image display device 4 may perform both the functions of the image display device 4 and those of the server 100 of the above-described embodiment. In this case, the control unit 30 may execute the process of S52 of FIG. 4 instead of S20 of FIG. 3. When determining that the input character information and the identification character information 54 match, the control unit 30 may display an OK screen on the display unit 10, and when determining that they do not match, it may display a warning screen on the display unit 10.
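A minimal sketch of this standalone configuration, under the same assumed interfaces as the earlier sketches, is shown below; the comparison that the server performed in S52 is done locally by the control unit 30.

```python
def standalone_device_process(keyboard, cameras, display, ocr):
    """Sketch of the modification in which the image display device performs the
    comparison itself instead of sending the data to a server (assumed interfaces)."""
    keyboard.wait_for_input_start_instruction()
    display.show("character display screen")
    input_chars = keyboard.read_characters()
    recognized_chars = ocr(cameras.capture())
    if input_chars == recognized_chars:
        display.show("OK screen")
    else:
        display.show("warning screen")
```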
  • In this modification, the image display device 4 is an example of the "information processing apparatus", and the "first reception unit", "second reception unit", "determination unit", and "transmission unit" can be omitted.
  • In another modification, the control unit 30 may execute the processing of S10 to S14 (that is, acquisition of the input character data) after executing the processing of S16 and S18 (that is, acquisition of the image data and the recognized character data). That is, T100 and T102 may be executed after T110 to T114 of FIG. 5 are executed. Further, instead of transmitting the input character data and the recognized character data together in S20, the control unit 30 may transmit the input character data after S14 and transmit the recognized character data after S18. That is, the control unit 30 may transmit the input character data and the recognized character data at different timings.
  • the “terminal device” may be a mobile terminal such as a smartphone or a tablet terminal provided with a camera and a display device.
  • the terminal device may include a scanner instead of the camera.
  • the control unit of the terminal device may acquire image data of an image representing the identification card by causing the scanner to scan the identification card.
  • the scanner is an example of an “image forming unit”.
  • The image display device 4 of each of the above embodiments has a substantially glasses-shaped support frame and can be worn on the user's head like glasses.
  • the image display device is not limited to this, and may have an arbitrary support frame such as a hat shape or a helmet shape as long as the image display device can be mounted on the user's head.
  • The image display device may also be formed by attaching the first camera 12, the second camera 14, and the control box 16 to eyewear (glasses, sunglasses, etc.) generally used for purposes such as vision correction and eye protection. In that case, the lens portion of the eyewear may be used as the display unit.
  • In another modification, the display unit 10 of the image display device 4 may be a light-shielding display that blocks the user's field of view when the user wears the image display device. In that case, when the image display device 4 is turned on, the control unit 30 may display the image captured by the first camera 12 in the region facing the user's right eye and display the image captured by the second camera 14 in the region facing the user's left eye.
  • (Modification 8) In the above embodiment, the information processing system 2 has been described as being used in a situation where a credit card is delivered.
  • However, the technology disclosed in this specification can also be used in other situations that require presentation of an identification card, for example, a situation where a government office issues a certificate to a resident, a situation where a bank opens an account for an individual, or a situation where a parcel recipient receives, at the delivery company's counter, a parcel that the recipient was unable to receive at the time of delivery.

Abstract

This information processing device is provided with an image forming unit, an input unit, a notification unit, and a control unit. The control unit is provided with: an image acquisition unit which acquires image data representing an image that is formed by the image forming unit and that represents a document that includes first character information; an extraction unit which, by performing character recognition processing on the image represented by the acquired image data, extracts first character data representing the first character information of the document in the image; an input data acquisition unit which acquires second character data representing second character information inputted to the input unit; and a notification control unit which causes the notification unit to perform a notification operation in a specific case in which the first character information represented by the first character data and the second character information represented by the second character data do not match.

Description

Information processing device
 The technology disclosed in this specification relates to an information processing apparatus that executes processing of character information.
 For example, Japanese Unexamined Patent Application Publication No. 2007-108807 (hereinafter referred to as Patent Document 1) discloses a system including a delivery terminal carried by a delivery person and a Web server that can communicate with the delivery terminal. When the recipient is absent and the delivery person inputs recipient identification information to the delivery terminal, the delivery terminal transmits the inputted recipient identification information to the Web server. The Web server uses the recipient identification information to identify the recipient's e-mail address registered in the Web server and transmits absence notification information to the identified e-mail address.
 When a delivery person delivers a specific package such as a credit card, the delivery person must comply with specific laws related to that package. In accordance with those laws, the delivery person asks the recipient to present an identification card, performs identity verification to confirm that the recipient is the person who should receive the package, and hands over the specific package only when the recipient's identity is confirmed. At that time, the delivery person records the character information described in the identification card presented by the recipient.
 In such a situation, it is conceivable to use the delivery terminal and the Web server of Patent Document 1. In that case, when the delivery person inputs the character information described in the recipient's identification card to the delivery terminal, the delivery terminal transmits character data representing the entered character information to the Web server, and the Web server stores the received character data. However, the delivery person may erroneously input the character information described in the identification card to the delivery terminal. The technique of Patent Document 1 does not consider a situation in which the delivery person erroneously inputs character information.
 This specification discloses a technique that can appropriately notify a user that input character information does not match the character information described in a document.
 The information processing apparatus disclosed in this specification includes an image forming unit, an input unit, a notification unit, and a control unit. The control unit includes: an image acquisition unit that acquires image data indicating an image which is formed by the image forming unit and which represents a document in which first character information is described; an extraction unit that extracts, by performing character recognition processing on the image represented by the acquired image data, first character data representing the first character information described in the document in the image; an input data acquisition unit that acquires second character data representing second character information input to the input unit; and a notification control unit that causes the notification unit to perform a notification operation in a specific case where the first character information represented by the first character data and the second character information represented by the second character data do not match.
 According to the information processing apparatus described above, the notification operation is executed in the specific case where the first character information represented by the first character data extracted from the image by the character recognition processing and the second character information input to the input unit do not match. By the notification operation being performed, the user of the information processing apparatus can know that the second character information does not match the first character information. Therefore, the information processing apparatus can appropriately notify the user that the input character information does not match the character information described in the document.
 Here, the “image forming unit” includes a camera, a scanner, and the like. The “input unit” includes a keyboard, a touch panel, and the like. The “notification unit” includes a display, a speaker, and the like, and the “notification operation” includes an operation of displaying a warning screen on the display, an operation of outputting a warning sound from the speaker, and the like. The “document” includes identification cards such as a driver's license, a membership card, and a passport, as well as various tickets, and the “first character information” includes a unique number written on the identification card, ticket, or the like. The “information processing apparatus” may be a system including a terminal device and a server capable of communicating with the terminal device, or may be a single terminal device. Furthermore, the “information processing apparatus” is not limited to an apparatus used in a situation where a specific package is delivered, and may be an apparatus used in a situation where a document such as an identification card or a ticket is presented by a third party.
 The control unit may further include a storage control unit that causes a storage device to store the second character data when the first character information represented by the first character data matches the second character information represented by the second character data.
 Here, the “storage device” is, for example, a device such as a semiconductor memory or a hard disk drive. The device may be provided in the terminal device, or may be connected to the terminal device via a network (for example, a LAN (Local Area Network), the Internet, etc.). The “storage device” may be either a volatile storage device or a nonvolatile storage device. According to this configuration, when the first character information and the second character information match, the second character data can be stored in the storage device.
 The information processing apparatus may include a terminal device and a server capable of communicating with the terminal device. The terminal device may include the image forming unit, the input unit, the notification unit, and a terminal control unit, and the terminal control unit may include the image acquisition unit, the extraction unit, the input data acquisition unit, and the notification control unit. The extraction unit may further transmit the extracted first character data to the server, the input data acquisition unit may further transmit the acquired second character data to the server, and the notification control unit may cause the notification unit to perform the notification operation in a specific case where the first character information and the second character information do not match and a notification instruction is received from the server. The server may include a server control unit, and the server control unit may include: a first reception unit that receives the first character data from the terminal device; a second reception unit that receives the second character data from the terminal device; a determination unit that determines whether or not the first character information represented by the received first character data matches the second character information represented by the received second character data; and a transmission unit that transmits the notification instruction to the terminal device when the determination unit determines that the first character information and the second character information do not match.
 Here, the “server” is communicably connected to the terminal device via a network (for example, a LAN, the Internet, etc.). According to this configuration, the terminal device transmits the extracted first character data to the server. For this reason, the communication load between the terminal device and the server can be reduced compared with another configuration in which the terminal device transmits image data to the server and causes the server to execute the character recognition processing. The “server” may be configured by a single device or by a plurality of devices that can communicate with each other. Generally speaking, the server may have any configuration as long as it can receive the first character data from the terminal device, receive the second character data from the terminal device, determine whether or not the first character information represented by the received first character data matches the second character information represented by the received second character data, and transmit the notification instruction to the terminal device when it determines that the first character information and the second character information do not match.
 The terminal device may further include a frame that can be mounted on the user's head. The image forming unit may include a camera that is mounted on the frame and can capture a range corresponding to the field of view of the user wearing the frame, and the notification unit may include a display device that is mounted on the frame and arranged at a position facing at least one of the right eye and the left eye of the user wearing the frame. The notification operation may include causing the display device to display notification information related to the fact that the first character information and the second character information do not match.
 According to this configuration, if the user looks at the document while wearing the terminal device on the head (that is, brings the document into the field of view), the terminal device can acquire image data indicating an image representing the document.
 Note that a control method, a computer program, and a computer-readable recording medium storing the computer program for realizing the above information processing apparatus are also novel and useful.
 FIG. 1 is a schematic diagram of the information processing system. FIG. 2 shows the configuration of the information processing system. FIG. 3 shows a flowchart of the device process executed by the control unit of the image display device. FIG. 4 shows a flowchart of the server process executed by the control unit of the server. FIG. 5 shows a case A1 in which the character information does not match and a case A2 in which the character information matches.
(Example)
(Configuration of information processing system 2; FIGS. 1 and 2)
 The information processing system 2 shown in FIG. 1 is a system for processing and managing various information that is input when a delivery person D1 (that is, a user of the information processing system 2) delivers a credit card to a customer C1. The information processing system 2 includes an image display device 4, a keyboard 8, and a server 100. The image display device 4 and the server 100 can communicate with each other via the Internet 60. The image display device 4 and the keyboard 8 can communicate with each other by short-range wireless communication.
(Configuration of image display device 4; FIGS. 1 and 2)
 The image display device 4 shown in FIG. 1 is a so-called head mounted display that is used while mounted on the head H1 of the delivery person D1. The image display device 4 includes a support 6, display units 10a and 10b, projection units 11a and 11b, a first camera 12, a second camera 14, and a control box 16.
 The support 6 is a spectacle-frame-shaped member. The delivery person D1 can wear the image display device 4 on the head by putting on the support 6 in the same way as putting on glasses.
 The display units 10a and 10b are each a translucent display member. When the delivery person D1 wears the image display device 4 on the head, the display unit 10a is arranged at a position facing the right eye of the delivery person D1, and the display unit 10b is arranged at a position facing the left eye. Hereinafter, the left and right display units 10a and 10b may be collectively referred to as the display unit 10. In the present embodiment, the delivery person D1 can visually recognize the surroundings through the display unit 10.
 The projection units 11a and 11b are members that project images onto the display units 10a and 10b. The projection units 11a and 11b are provided on the side portions of the display units 10a and 10b. Hereinafter, the left and right projection units 11a and 11b may be collectively referred to as the projection unit 11. In the present embodiment, the projection unit 11 projects a predetermined object image onto the display unit 10 in accordance with an instruction from the control unit 30. As a result, the delivery person D1 can see the object image as if it were combined, at a predetermined position, with the real-world objects and/or space that the delivery person D1 can visually recognize through the display unit 10. Hereinafter, when describing that the control unit 30 displays a desired screen on the display unit 10 by instructing the projection unit 11 to project an image, the description of the operation of the projection unit 11 may be omitted and the operation may be expressed simply as “the control unit 30 causes the display unit 10 to display a desired image”.
 The first camera 12 is a camera disposed in the support 6 at a position above the display unit 10a (that is, a position corresponding to the right eye of the delivery person D1). The second camera 14 is a camera disposed in the support 6 at a position above the display unit 10b (that is, a position corresponding to the left eye of the delivery person D1). The first camera 12 captures a range corresponding to the visual field range of the user's right eye, and the second camera 14 captures a range corresponding to the visual field range of the user's left eye.
 The control box 16 is attached to a part of the support 6. The control box 16 houses the elements that constitute the control system of the image display device 4. Specifically, as shown in FIG. 2, the control box 16 houses a network interface 20, a short-range communication interface 22, a control unit 30, and a memory 32. Hereinafter, the interface is described as “I/F”. In another example, the control box 16 may be provided separately from the support 6. In that case, the components in the control box 16 (the network I/F 20, the control unit 30, and the memory 32) and the components provided on the support 6 (the projection unit 11 and the cameras 12 and 14) only need to be electrically connected by cables or the like. Instead of cables, the components in the control box 16 and the components provided on the support 6 may communicate by wireless communication.
 ネットワークI/F20は、インターネット60上の装置(例えば、サーバ100)との無線通信を実行するためのI/Fである。また、近距離通信I/F22は、画像表示装置4の周囲に位置する外部の装置(例えば、キーボード8)と近距離通信を実行するためのI/Fである。近距離通信は、例えば、Bluetooth(Bluetooth SIGの登録商標)規格に準拠したBluetooth通信である。 The network I / F 20 is an I / F for executing wireless communication with a device (for example, the server 100) on the Internet 60. The short-range communication I / F 22 is an I / F for performing short-range communication with an external device (for example, the keyboard 8) located around the image display device 4. The short-range communication is, for example, Bluetooth communication conforming to the Bluetooth (registered trademark of Bluetooth SIG) standard.
 制御部30は、メモリ32に記憶されているメインプログラム34に従って様々な処理を実行する。制御部30が実行する処理の内容は後で詳しく説明する。また、制御部30は、図2に示すように、表示部10、投影部11、第1のカメラ12、第2のカメラ14、ネットワークI/F20、近距離通信I/F22、及び、メモリ32と電気的に接続されており、これらの各要素の動作を制御することができる。 The control unit 30 executes various processes according to the main program 34 stored in the memory 32. Details of processing executed by the control unit 30 will be described in detail later. Further, as shown in FIG. 2, the control unit 30 includes a display unit 10, a projection unit 11, a first camera 12, a second camera 14, a network I / F 20, a near field communication I / F 22, and a memory 32. And the operation of each of these elements can be controlled.
The memory 32 stores a character recognition program 36 in addition to the main program 34. The character recognition program 36 is a program for executing character recognition processing (so-called OCR (Optical Character Recognition)) on images formed by the cameras 12 and 14. The character recognition processing analyzes an image, detects patterns of character information included in the image, and extracts the character information as character data (for example, character codes such as Unicode).
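As an illustration of the kind of processing the character recognition program 36 performs, the following minimal sketch takes a captured image and returns the character information it contains as a Unicode string. It is only a sketch: the use of the Pillow and pytesseract libraries, the file name, and the whitespace normalization are assumptions made for this example and are not specified by the embodiment.

```python
# Minimal OCR sketch (assumed libraries: Pillow, pytesseract).
# Illustrates "image in, Unicode character data out"; the actual character
# recognition program 36 of the embodiment is not specified.
from PIL import Image
import pytesseract


def extract_character_data(image_path: str) -> str:
    """Analyze an image, detect the character information it contains,
    and return it as a Unicode string (character data)."""
    image = Image.open(image_path)
    text = pytesseract.image_to_string(image)
    # Normalize whitespace so the result can be compared character by character.
    return "".join(text.split())


if __name__ == "__main__":
    print(extract_character_data("id_card.png"))  # hypothetical file name
```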
(Configuration of the keyboard 8; FIGS. 1 and 2)
The keyboard 8 shown in FIG. 1 includes a plurality of keys for inputting instructions, character information, and the like to the image display device 4. In another example, the keyboard 8 may be the keyboard and touch panel mounted on a portable terminal such as a tablet terminal or a smartphone. In yet another example, the keyboard 8 may be attached to the support 6 of the image display device 4.
(Configuration of the server 100; FIGS. 1 and 2)
The server 100 shown in FIG. 1 is installed on the Internet 60 by the vendor of the image display device 4. The server 100 includes a control unit 130 and a memory 132. The control unit 130 executes various processes according to a program 134 stored in the memory 132. The memory 132 also has an area for storing various information generated along with the processing of the control unit 130 (for example, the server processing of FIG. 4 described later).
(Situation in which the information processing system 2 is used; FIG. 1)
Here, a situation in which the information processing system 2 of the present embodiment is used will be described. When delivering a credit card (not shown), the delivery person D1 asks the customer C1 to present an identification card 50 in order to confirm that the customer C1 is the person who should receive the credit card. The delivery person D1 inputs, on the keyboard 8, the identification character information 54 described in a predetermined frame 52 of the presented identification card 50. The image display device 4 then transmits input character data representing the input character information to the server 100 via the Internet 60.
On the other hand, when the delivery person D1 views the identification card 50 through the display unit 10 of the image display device 4, the cameras 12 and 14 photograph the identification card 50. The image display device 4 thereby acquires an image of the identification card 50 photographed by the cameras 12 and 14. Next, the image display device 4 executes character recognition processing on the photographed image of the identification card 50 and extracts recognized character data representing the identification character information 54 described in the identification card 50. The image display device 4 transmits the recognized character data to the server 100 via the Internet 60.
The server 100 determines whether the input character data and the recognized character data match. If they do not match, the server 100 transmits to the image display device 4 a warning instruction for displaying a warning screen on the display unit 10 of the image display device 4. By looking at the warning screen, the delivery person D1 can learn that character information different from the identification character information 54 was mistakenly input on the keyboard 8. On the other hand, if the input character data and the recognized character data match, the server 100 stores the input character data.
(Device processing; FIG. 3)
The device processing executed by the control unit 30 of the image display device 4 will be described with reference to FIG. 3. When the delivery person D1 wears the image display device 4 on his or her head and turns on the power of the image display device 4, the control unit 30 starts the processing of FIG. 3.
In S10, the control unit 30 monitors, via the short-range communication I/F 22, whether a predetermined input start instruction is input on the keyboard 8. The input start instruction is an instruction for starting the input of character information. The delivery person D1 can input the input start instruction on the keyboard 8. When the input start instruction is input on the keyboard 8, the control unit 30 determines YES in S10 and proceeds to S12.
In S12, the control unit 30 causes the display unit 10 to display a predetermined character display screen. The character display screen is a screen for displaying the character information input on the keyboard 8 (hereinafter referred to as "input character information"). The delivery person D1 inputs character information (that is, the identification character information 54 described on the presented identification card 50) on the keyboard 8 while looking at the character display screen.
In S14, the control unit 30 acquires, from the keyboard 8 via the short-range communication I/F 22, input character data representing the input character information input on the keyboard 8.
In S16, the control unit 30 acquires image data indicating an image representing the identification card 50. In S12 and S14, the delivery person D1 inputs the character information while looking at the identification card 50 through the display unit 10. At that time, the cameras 12 and 14 photograph an image of the identification card 50 present in the visual field range of the delivery person D1. Thereby, in S16, the control unit 30 acquires image data representing the image of the identification card 50 photographed by the cameras 12 and 14.
In S18, the control unit 30 uses the character recognition program 36 to execute character recognition processing on the image represented by the acquired image data. Specifically, the control unit 30 detects, in the image, the predetermined frame 52 arranged at a predetermined position according to the predetermined format of the identification card 50. The control unit 30 then executes character recognition processing on the identification character information 54 within the predetermined frame 52. Thereby, the control unit 30 extracts recognized character data representing the identification character information 54 described in the identification card 50.
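By way of illustration only, the sketch below shows one way the step of S18 (detecting the predetermined frame and recognizing only the characters inside it) could be realized. The use of OpenCV and pytesseract, the fixed frame coordinates, and the function name are assumptions of this sketch; the embodiment does not specify how the frame 52 is detected.

```python
# Sketch of S18: crop the region corresponding to the predetermined frame 52
# from the ID-card image and run OCR on that region only
# (assumed libraries: OpenCV, pytesseract).
import cv2
import pytesseract

# Hypothetical frame position defined by the card's predetermined format,
# given as (x, y, width, height) in pixels.
FRAME_REGION = (40, 120, 300, 40)


def extract_recognized_character_data(image_path: str) -> str:
    image = cv2.imread(image_path)
    x, y, w, h = FRAME_REGION
    frame = image[y:y + h, x:x + w]               # region of frame 52
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    text = pytesseract.image_to_string(gray)      # identification character info 54
    return "".join(text.split())                  # recognized character data
```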
In S20, the control unit 30 transmits both the input character data acquired in S14 and the recognized character data extracted in S18 to the server 100 via the network I/F 20.
In S22, the control unit 30 determines whether a warning instruction is received from the server 100. The warning instruction is an instruction for causing the display unit 10 to display a warning screen indicating that the input character information represented by the input character data and the identification character information 54 represented by the recognized character data do not match. The warning instruction is transmitted from the server 100 to the image display device 4 when the server 100 determines that the input character information represented by the input character data and the identification character information 54 represented by the recognized character data do not match. When the control unit 30 receives the warning instruction from the server 100, it determines YES in S22 and proceeds to S24. In S24, the control unit 30 causes the display unit 10 to display a predetermined warning screen in accordance with the warning instruction. After S24, the control unit 30 returns to the monitoring in S10.
On the other hand, when the control unit 30 receives an OK instruction from the server 100, it determines NO in S22 and proceeds to S26. The OK instruction is an instruction for causing the display unit 10 to display an OK screen indicating that the input character information represented by the input character data and the identification character information 54 represented by the recognized character data match. The OK instruction is transmitted from the server 100 to the image display device 4 when the server 100 determines that the input character information represented by the input character data and the identification character information 54 represented by the recognized character data match. In S26, the control unit 30 causes the display unit 10 to display a predetermined OK screen in accordance with the OK instruction. After S26, the control unit 30 returns to the monitoring in S10.
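On the device side, S22 through S26 amount to branching on which instruction was received from the server 100 and updating the display accordingly, as the following sketch shows. The instruction strings and the use of print in place of the display unit 10 are assumptions made for this sketch.

```python
# Sketch of S22-S26: branch on the instruction received from the server 100.
def handle_server_instruction(instruction: str) -> None:
    if instruction == "WARNING":  # S22: YES (warning instruction received)
        # S24: display a warning screen on the display unit 10.
        print("warning screen: the input does not match the identification card")
    else:                         # S22: NO (OK instruction received)
        # S26: display an OK screen on the display unit 10.
        print("OK screen: the input matches the identification card")


handle_server_instruction("WARNING")  # corresponds to T140 of case A1
handle_server_instruction("OK")       # corresponds to T240 of case A2
```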
(Server processing; FIG. 4)
The server processing executed by the control unit 130 of the server 100 will be described with reference to FIG. 4. When the server 100 is powered on, the control unit 130 starts the processing of FIG. 4.
In S50, the control unit 130 monitors reception of the input character data and the recognized character data from the image display device 4. When the control unit 130 receives the input character data and the recognized character data from the image display device 4 in succession within a predetermined period, it determines YES in S50 and proceeds to S52.
In S52, the control unit 130 compares the input character information represented by the input character data with the identification character information 54 represented by the recognized character data, and determines whether the two match. When the control unit 130 determines that they match, it determines YES in S52 and proceeds to S54. On the other hand, when it determines that they do not match, it determines NO in S52 and proceeds to S58.
In S54, the control unit 130 stores the input character data representing the input character information in a predetermined storage area of the memory 132.
In subsequent S56, the control unit 130 transmits a predetermined OK instruction to the image display device 4. As described above, the OK instruction is an instruction for displaying a predetermined OK screen on the display unit 10 of the image display device 4. After S56, the control unit 130 returns to the monitoring in S50.
On the other hand, in S58, the control unit 130 transmits a predetermined warning instruction to the image display device 4. As described above, the warning instruction is an instruction for displaying a predetermined warning screen on the display unit 10 of the image display device 4. After S58, the control unit 130 returns to the monitoring in S50.
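For illustration, the server-side decision in S52 through S58 reduces to an exact comparison of the two character strings, as sketched below. The in-memory list standing in for the storage area of the memory 132 and the returned instruction strings are assumptions of this sketch, not details taken from the embodiment.

```python
# Sketch of the server processing S52-S58: compare the two pieces of character
# data, store the input character data on a match, and return the instruction
# that would be sent back to the image display device 4.
STORED_INPUT_DATA = []  # stand-in for the predetermined storage area of memory 132


def server_processing(input_char_data: str, recognized_char_data: str) -> str:
    if input_char_data == recognized_char_data:    # S52: YES
        STORED_INPUT_DATA.append(input_char_data)  # S54: store the input data
        return "OK"                                # S56: OK instruction
    return "WARNING"                               # S58: warning instruction


print(server_processing("113456789", "123456789"))  # case A1 of FIG. 5 -> WARNING
print(server_processing("123456789", "123456789"))  # case A2 of FIG. 5 -> OK
```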
The control unit 130 transmits, at a predetermined timing, the input character data stored in the predetermined storage area in S54 to a service server (not shown) of the credit card company. When ordering a credit card, the customer C1 informs the credit card company of the identification character information 54 of the identification card 50. The credit card company registers the identification character information 54 received from the customer C1 in the service server. When the service server receives character data representing character information that matches the registered identification character information 54, it notifies the credit card company that the customer C1 to whom the card was delivered is the person who ordered the credit card.
(Specific cases; FIG. 5)
With reference to FIG. 5, specific cases A1 and A2 realized by the processes of FIGS. 3 and 4 will be described.
(Specific case A1 in which the character information does not match; FIG. 5)
Case A1 is a case where the pieces of character information (that is, the input character information and the identification character information 54) do not match. In T100, in response to accepting the input start instruction (YES in S10 of FIG. 3), the image display device 4 accepts, via the keyboard 8, the input of input character information "113456789", which differs from the identification character information "123456789" of the identification card 50. In T102, the image display device 4 acquires input character data representing the input character information "113456789" from the keyboard 8 (S14).
In T110, the image display device 4 photographs the identification card 50. In T112, the image display device 4 acquires image data indicating an image representing the identification card 50 photographed by the cameras 12 and 14 (S16).
In T114, the image display device 4 executes character recognition processing on the image represented by the acquired image data and extracts recognized character data representing the identification character information "123456789" described in the identification card 50 (S18).
In T120, the image display device 4 transmits both the input character data acquired in T102 and the recognized character data extracted in T114 to the server 100 (S20).
In T130, the server 100 determines that the input character information "113456789" represented by the input character data and the identification character information "123456789" represented by the recognized character data do not match (NO in S52 of FIG. 4). In subsequent T134, the server 100 transmits a warning instruction to the image display device 4 (S58).
In T140, the image display device 4 displays a warning screen on the display unit 10 (S24) in accordance with the warning instruction received from the server 100 (YES in S22 of FIG. 3). By looking at the warning screen displayed on the display unit 10, the delivery person D1 can learn that the input character information he or she entered (T100) does not match the identification character information 54.
(Specific case A2 in which the character information matches; FIG. 5)
Case A2 is a case where the pieces of character information (that is, the input character information and the identification character information 54) match. In T200, in response to accepting the input start instruction (YES in S10 of FIG. 3), the image display device 4 accepts, via the keyboard 8, the input of input character information "123456789", which is the same as the identification character information "123456789" of the identification card 50. In T202, the image display device 4 acquires input character data representing the input character information "123456789" from the keyboard 8 (S14).
T210 to T220 are the same as T110 to T120. In T230, the server 100 determines that the input character information "123456789" represented by the input character data and the identification character information "123456789" represented by the recognized character data match (YES in S52 of FIG. 4). In subsequent T232, the server 100 stores the input character data in the predetermined storage area of the memory 132 (S54). In T234, the server 100 transmits an OK instruction to the image display device 4 (S56).
In T240, the image display device 4 displays an OK screen on the display unit 10 (S26) in accordance with the OK instruction received from the server 100 (NO in S22 of FIG. 3). By looking at the OK screen displayed on the display unit 10, the delivery person D1 can confirm that the input character information he or she entered (T200) matches the identification character information 54.
In T250, the server 100 transmits the input character data stored in the predetermined storage area of the memory 132 to the service server of the credit card company.
The configuration and operation of the image display device 4 of the present embodiment have been described above. As described above, in the information processing system 2 of the present embodiment, when the identification character information 54 represented by the recognized character data extracted from the image by the character recognition processing and the input character information input on the keyboard 8 do not match (T130 of FIG. 5), a warning screen is displayed on the display unit 10 of the image display device 4 (T140). By looking at the warning screen, the delivery person D1 can thereby learn that the input character information he or she entered on the keyboard 8 (T100) does not match the identification character information 54 described in the identification card 50.
Further, in the present embodiment, when the identification character information 54 and the input character information match (T230), the information processing system 2 stores the input character data representing the input character information in the memory 132 of the server 100 (T232). The information processing system 2 can thus store correct character data indicating the identification character information 54.
Further, in the present embodiment, the image display device 4 executes the character recognition processing (S18 of FIG. 3) and transmits the recognized character data extracted by the character recognition processing to the server 100 (S20). Therefore, compared with an alternative configuration in which the image display device transmits image data indicating the image captured by the cameras to the server 100 and has the server 100 execute the character recognition processing, the communication load between the image display device 4 and the server 100 can be reduced.
Further, in the present embodiment, the image display device 4 is a head-mounted display that can be worn on the user's head. Therefore, if the delivery person D1 looks at the identification card 50 while wearing the image display device 4 (that is, brings it into his or her visual field range), the image display device 4 can acquire image data indicating an image representing the identification card 50 (S16 of FIG. 3).
(Correspondence)
The information processing system 2 is an example of the "information processing apparatus". The combination of the image display device 4 and the keyboard 8 is an example of the "terminal device". The cameras 12 and 14, the keyboard 8, and the display unit 10 are examples of the "image forming unit", the "input unit", and the "notification unit", respectively. The identification card 50, the identification character information 54, and the recognized character data are examples of the "document", the "first character information", and the "first character data", respectively. The input character information and the input character data are examples of the "second character information" and the "second character data", respectively. Displaying the warning screen on the display unit 10 (S24 of FIG. 3) is an example of the "notification operation". The memory 132 of the server 100 is an example of the "storage device". The warning instruction received in S22 of FIG. 3 is an example of the "notification instruction".
The embodiments have been described in detail above, but they are merely examples and do not limit the scope of the claims. The technology described in the claims includes various modifications and changes of the specific examples illustrated above. For example, the following modifications may be adopted.
(Modification 1) In S54 of FIG. 4, the server 100 may transmit the input character data to the service server of the credit card company without storing the input character data in the memory 132. In this modification, the "storage control unit" can be omitted.
(Modification 2) The information processing system 2 may include only the image display device 4 and need not include the server 100. That is, in this modification, the image display device 4 may perform the functions of both the image display device 4 and the server 100 of the above embodiment. In this case, the control unit 30 may execute the processing of S52 of FIG. 4 instead of S20 of FIG. 3. The control unit 30 may cause the display unit 10 to display the OK screen when it determines that the input character information and the identification character information 54 match, and may cause the display unit 10 to display the warning screen when it determines that they do not match. In this modification, the image display device 4 is an example of the "information processing apparatus", and the "first receiving unit", the "second receiving unit", the "determination unit", and the "transmission unit" can be omitted.
(Modification 3) In the device processing of FIG. 3, the control unit 30 may execute the processing of S10 to S14 (acquisition of the input character data) after executing the processing of S16 and S18 (that is, acquisition of the image data and the recognized character data). That is, T100 and T102 may be executed after T110 to T114 of FIG. 5 are executed. Further, instead of transmitting the input character data and the recognized character data together in S20, the control unit 30 may transmit the input character data after S14 and transmit the recognized character data after S18. That is, the control unit 30 may transmit the input character data and the recognized character data at different timings.
(Modification 4) The "terminal device" may be a portable terminal such as a smartphone or a tablet terminal provided with a camera and a display device. The terminal device may also include a scanner instead of the camera. In that case, the control unit of the terminal device may acquire image data of an image representing the identification card by causing the scanner to scan the identification card. In this case, the scanner is an example of the "image forming unit".
(Modification 5) In the above embodiment, the image display device 4 has a substantially eyeglass-shaped support frame and can be worn on the user's head in the same manner as eyeglasses. The image display device is not limited to this, and may have an arbitrary support frame, such as a hat shape or a helmet shape, as long as it can be worn on the user's head.
(Modification 6) The image display device 4 may be formed by attaching the first camera 12, the second camera 14, and the control box 16 to eyewear (eyeglasses, sunglasses, etc.) generally used for purposes such as vision correction and eye protection. In that case, the lens portions of the eyewear may be used as the display unit.
(Modification 7) The display unit 10 of the image display device 4 may be a light-shielding display that blocks the user's field of view when the user wears the image display device. In that case, since the display unit 10 is a light-shielding display, when the image display device 4 is powered on, the control unit 30 may display the image captured by the first camera 12 in the area facing the user's right eye and display the image captured by the second camera 14 in the area facing the user's left eye.
(Modification 8) In the above embodiment, an example has been described in which the information processing system 2 is used in a situation where a credit card is delivered. The technology disclosed in this specification can also be used in other situations that require the presentation of an identification card: for example, a situation where a government office issues a certificate to a resident, a situation where a bank opens an account for an individual, or a situation where a recipient picks up, at a delivery company's counter, a package that could not be delivered because the recipient was absent.
The technical elements described in this specification or the drawings exhibit technical utility alone or in various combinations, and are not limited to the combinations described in the claims as filed. In addition, the technology illustrated in this specification or the drawings achieves a plurality of objects at the same time, and achieving one of those objects itself has technical utility.

Claims (4)

1.  An information processing apparatus comprising:
    an image forming unit;
    an input unit;
    a notification unit; and
    a control unit,
    wherein the control unit comprises:
    an image acquisition unit that acquires image data indicating an image formed by the image forming unit, the image representing a document in which first character information is described;
    an extraction unit that extracts first character data representing the first character information described in the document in the image by executing character recognition processing on the image represented by the acquired image data;
    an input data acquisition unit that acquires second character data representing second character information input to the input unit; and
    a notification control unit that causes the notification unit to execute a notification operation in a specific case where the first character information represented by the first character data and the second character information represented by the second character data do not match.
2.  The information processing apparatus according to claim 1, wherein the control unit further comprises a storage control unit that stores the second character data in a storage device in a case where the first character information represented by the first character data and the second character information represented by the second character data match.
3.  The information processing apparatus according to claim 1 or 2, wherein
    the information processing apparatus includes a terminal device and a server capable of communicating with the terminal device,
    the terminal device comprises the image forming unit, the input unit, the notification unit, and a terminal control unit,
    the terminal control unit comprises the image acquisition unit, the extraction unit, the input data acquisition unit, and the notification control unit,
    the extraction unit further transmits the extracted first character data to the server,
    the input data acquisition unit further transmits the acquired second character data to the server,
    the notification control unit causes the notification unit to execute the notification operation in the specific case where the first character information and the second character information do not match and a notification instruction is received from the server,
    the server comprises a server control unit, and
    the server control unit comprises:
    a first receiving unit that receives the first character data from the terminal device;
    a second receiving unit that receives the second character data from the terminal device;
    a determination unit that determines whether the first character information represented by the received first character data and the second character information represented by the received second character data match; and
    a transmission unit that transmits the notification instruction to the terminal device in a case where the determination unit determines that the first character information and the second character information do not match.
4.  The information processing apparatus according to claim 3, wherein
    the terminal device further comprises a frame wearable on a head of a user,
    the image forming unit includes a camera that is mounted on the frame and is capable of photographing a range corresponding to a visual field range of the user wearing the frame,
    the notification unit includes a display device that is mounted on the frame and is arranged at a position facing at least one of a right eye and a left eye of the user wearing the frame, and
    the notification operation includes causing the display device to display notification information related to the fact that the first character information and the second character information do not match.
PCT/JP2016/088185 2016-12-21 2016-12-21 Information processing device WO2018116422A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/088185 WO2018116422A1 (en) 2016-12-21 2016-12-21 Information processing device


Publications (1)

Publication Number Publication Date
WO2018116422A1 true WO2018116422A1 (en) 2018-06-28

Family

ID=62626068

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/088185 WO2018116422A1 (en) 2016-12-21 2016-12-21 Information processing device

Country Status (1)

Country Link
WO (1) WO2018116422A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05274467A (en) * 1992-03-25 1993-10-22 Toshiba Corp Data input device
JP2000331006A (en) * 1999-05-18 2000-11-30 Nippon Telegr & Teleph Corp <Ntt> Information retrieval device


Similar Documents

Publication Publication Date Title
US20180234244A1 (en) Password Management
CN110214320B (en) Authentication using facial image comparison
US10943229B2 (en) Augmented reality headset and digital wallet
US9792594B1 (en) Augmented reality security applications
EP3242262A1 (en) Payment authorization method and device
US10269016B2 (en) Dual biometric automatic teller machine (“ATM”) session initialization having single in-line session maintenance
EP3136275A1 (en) Digital authentication using augmented reality
US11062015B2 (en) Authentication management method, information processing apparatus, wearable device, and computer program
WO2020022014A1 (en) Information processing device, information processing method, and information processing program
US20180150844A1 (en) User Authentication and Authorization for Electronic Transaction
CA3112331A1 (en) Remotely verifying an identity of a person
EP3594879A1 (en) System and method for authenticating transactions from a mobile device
CN108230139B (en) Method and system for deposit and account opening by using self-service equipment
TWM566865U (en) Transaction system based on face recognitioin for verification
US20200184056A1 (en) Method and electronic device for authenticating a user
KR101813950B1 (en) An automated teller machine and a method for operating it
WO2018116422A1 (en) Information processing device
JP2015041132A (en) Information processing apparatus and information processing program
JP2020021458A (en) Information processing apparatus, information processing method, and information processing system
KR101742064B1 (en) A terminal for providing banking services, a method for operating the terminal, a server for providing banking services and a method for operatin the server
US20230005301A1 (en) Control apparatus, control method, and non-transitory computer readable medium
KR20060117865A (en) Image security apparatus in automatic tellex machine and method thereof
JP6403975B2 (en) Confidential information input system and program
JP2020135387A (en) Agent terminal for contact center system
RU2601140C2 (en) Method for providing trusted execution environment of performing analogue-to-digital signature and device for its implementation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16924635

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16924635

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP