WO2015178073A1 - Information processing device, management device, information processing method, and program - Google Patents

Information processing device, management device, information processing method, and program

Info

Publication number
WO2015178073A1
Authority
WO
WIPO (PCT)
Prior art keywords
character
recognized
information
imaging
target image
Prior art date
Application number
PCT/JP2015/057462
Other languages
English (en)
Japanese (ja)
Inventor
山田 英樹 (Hideki Yamada)
Original Assignee
ソニー株式会社 (Sony Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2015178073A1 publication Critical patent/WO2015178073A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/98: Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns

Definitions

  • the present disclosure relates to an information processing device, a management device, an information processing method, and a program.
  • a technology for processing a captured image using the location and time at which the image was captured has been developed.
  • As a technique for classifying captured images using the location and time at which they were captured, for example, the technique described in Patent Document 1 below can be cited.
  • As the imaging functions of imaging devices mounted on devices such as smartphones and mobile phones have improved, opportunities to recognize characters from images captured by such devices are increasing.
  • When characters are recognized from a captured image, misrecognition is more likely than when characters are recognized from a document image, for example because the characters are distorted or parts of the characters are missing. Therefore, there is a demand for improved recognition accuracy when characters are recognized from a captured image.
  • This disclosure proposes a new and improved information processing apparatus, management apparatus, information processing method, and program capable of improving the recognition accuracy of characters recognized from captured images.
  • According to the present disclosure, there is provided an information processing apparatus including: a character candidate extraction control unit that, based on information related to imaging that includes information indicating the position at which a target image (the captured image of the target from which characters are to be recognized) was captured, extracts characters corresponding to the information related to imaging from reference information, in which information related to imaging and characters are stored in association with each other, as character candidates, that is, candidates for the characters to be recognized from the target image; and a character determination processing unit that, based on the extracted character candidates, selectively corrects the characters recognized from the target image and determines the characters to be recognized from the target image. The characters stored in the reference information include characters based on characters recognized from captured images in the past.
  • According to the present disclosure, there is also provided a management device including a processing unit that, when a character recognized from a captured image and information related to imaging (including information indicating the position at which the captured image was captured) are acquired, records the character and the information related to the imaging in association with each other.
  • According to the present disclosure, there is further provided an information processing method executed by an information processing apparatus, including: extracting, based on information related to imaging that includes information indicating the position at which the target image was captured, characters corresponding to the information related to imaging from reference information in which information related to imaging and characters are stored in association with each other, as character candidates for the characters to be recognized from the target image; and selectively correcting, based on the extracted character candidates, the characters recognized from the target image and determining the characters to be recognized from the target image. The characters stored in the reference information include characters based on characters recognized from captured images in the past.
  • According to the present disclosure, there is also provided a program causing a computer to execute: a step of extracting, based on information related to imaging that includes information indicating the position at which the target image was captured, characters corresponding to the information related to imaging from reference information in which information related to imaging and characters are stored in association with each other, as character candidates for the characters to be recognized from the target image; and a step of selectively correcting, based on the extracted character candidates, the characters recognized from the target image and determining the characters to be recognized from the target image. The characters stored in the reference information include characters based on characters recognized from captured images in the past.
  • FIG. 1 is a block diagram illustrating an example of the configuration of an image processing apparatus 10 that performs processing to recognize a character from a captured image.
  • the image processing apparatus 10 includes a character area detection processing unit 12, a character recognition processing unit 14, and a character determination processing unit 16.
  • For example, a processor configured by an arithmetic circuit such as an MPU (Micro Processing Unit) serves as the character area detection processing unit 12, the character recognition processing unit 14, and the character determination processing unit 16.
  • the character area detection processing unit 12, the character recognition processing unit 14, and the character determination processing unit 16 may be configured by a dedicated (or general-purpose) circuit capable of realizing the processing of each unit.
  • the character area detection processing unit 12 detects a character area including characters from the captured image.
  • the character area detection processing unit 12 detects the character area using an arbitrary method capable of detecting the character area, such as a method of using a character interval or a character size.
  • The character area detection processing unit 12 may also perform preprocessing such as binarization and noise removal on the captured image so that the subsequent processing can be performed more easily, and detect the character area from the preprocessed image.
  • the character area detection processing unit 12 may classify the captured image into a character area and an area including a figure, for example.
  • The character recognition processing unit 14 recognizes characters from the detected character area. For example, the character recognition processing unit 14 performs distortion correction and line division on the character area, and then divides the character area into individual characters. In addition, because the size of the divided areas may vary depending on the font or blur, the character recognition processing unit 14 normalizes the divided areas. Then, the character recognition processing unit 14 extracts, for example, a feature amount indicating the features of the character from each normalized area, and recognizes the character in each area by specifying, from a pattern dictionary (for example, a pattern dictionary database) in which feature amounts and characters are stored in association with each other, the character corresponding to the extracted feature amount.
  • The character determination processing unit 16 corrects the characters recognized from the character area and determines the characters recognized from the captured image. For example, the character determination processing unit 16 specifies a word corresponding to the characters recognized from the character area from a word dictionary (for example, a word database) in which words are stored, and corrects the characters recognized from the character area accordingly.
  • the correction by the character determination processing unit 16 realizes, for example, correction of erroneously recognized characters and removal of erroneously detected characters. Then, the character determination processing unit 16 determines the corrected character as a character recognized from the captured image.
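As a rough sketch of this kind of word-dictionary correction (the dictionary contents and the cutoff value below are illustrative assumptions, not taken from the publication), a closest-match lookup might look like:

```python
import difflib

# Hypothetical word dictionary; a real system would use a large word database.
WORD_DICTIONARY = ["station", "restaurant", "parking", "entrance"]

def correct_with_word_dictionary(recognized: str, cutoff: float = 0.6) -> str:
    """Replace a recognized string with the closest dictionary word, if any."""
    matches = difflib.get_close_matches(recognized, WORD_DICTIONARY, n=1, cutoff=cutoff)
    return matches[0] if matches else recognized

print(correct_with_word_dictionary("staton"))  # misread of "station": corrected
print(correct_with_word_dictionary("xyzzy"))   # no close match: kept as-is
```

Erroneously detected strings with no plausible dictionary match fall through unchanged, which corresponds to the removal case described above.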
  • the image processing apparatus 10 recognizes characters from the captured image, for example, with the configuration shown in FIG.
  • the image processing apparatus 10 corrects the character recognized from the character area by using the word dictionary in the character determination processing unit 16. Therefore, there is a possibility that the character recognition accuracy can be improved when the character is recognized from the captured image.
  • The processing in the character determination processing unit 16 is effective for correcting misrecognized unknown character strings, but when a character string is misrecognized as a different existing word, for example, it may not be possible to correct it. Therefore, even if the process according to the first example is performed, it does not necessarily lead to an improvement in character recognition accuracy.
  • When the image processing apparatus 10 shown in FIG. 1 performs the processing according to the second example, the image processing apparatus 10 specifies a word corresponding to the information indicating the position at which the captured image was captured from a dictionary (for example, a map database) in which information indicating positions and words such as place names, store names, and building names are stored in association with each other. The image processing apparatus 10 corrects the characters recognized from the character area by further using the word specified based on the information indicating the position at which the captured image was captured. Then, the image processing apparatus 10 determines the corrected characters as the characters recognized from the captured image.
  • The processing according to the second example is effective for recognizing characters on buildings and the like that exist in the vicinity of the imaged position, but it cannot be applied to, for example, recognition of characters related to guidance to a place away from the imaged position (for example, characters on a guide sign) or characters indicating a distant place name such as "direction XX". Therefore, even if the process according to the second example is used, it does not necessarily lead to an improvement in character recognition accuracy.
  • The information processing apparatus according to the present embodiment selectively corrects the characters recognized from a captured image of a target from which characters are recognized, using character candidates extracted from reference information in which characters based on characters recognized from captured images in the past are stored.
  • The information processing apparatus then determines the selectively corrected characters as the characters recognized from the captured image.
  • a captured image of a target whose character is recognized will be referred to as a “target image”.
  • In addition, a candidate for a character recognized from the target image is referred to as a "character candidate".
  • The reference information according to the present embodiment is data (or a data group) in which characters based on characters recognized from captured images in the past are stored in association with information related to imaging, and is referred to by one or more information processing apparatuses according to the present embodiment.
  • the reference information according to the present embodiment plays a role of so-called collective intelligence for each information processing apparatus according to the present embodiment to obtain a character candidate.
  • the reference information according to the present embodiment includes, for example, a database (or table) in which information related to imaging and characters are stored in association with each other.
  • the reference information according to the present embodiment is not limited to the above, and may be data (or a data group) according to an arbitrary method capable of storing information related to imaging and characters in association with each other.
  • Examples of characters stored in the reference information according to the present embodiment include a single character, a word composed of a plurality of characters, a phrase, a sentence, and the like.
  • The “characters based on characters recognized from captured images in the past” according to the present embodiment are, for example, characters recognized from a captured image before the processing related to the information processing method according to the present embodiment is performed on the target image, or characters obtained when a user or the like who manages the reference information manually changes characters recognized from a captured image.
  • the information related to imaging according to the present embodiment is data related to imaging of a captured image including information indicating a captured position.
  • Examples of the information indicating the imaged position according to the present embodiment include data indicating the position, such as data indicating latitude and longitude.
  • However, the information indicating the imaged position according to the present embodiment is not limited to data indicating latitude and longitude, and may be arbitrary data with which the position imaged by an apparatus such as the information processing apparatus according to the present embodiment can be specified.
  • the information included in the information related to imaging according to the present embodiment is not limited to information indicating a captured position.
  • the information related to imaging according to the present embodiment may further include one or more of various types of information shown below.
  • Hereinafter, one or more of the various types of information shown below that may be included in the information related to imaging according to the present embodiment may be collectively referred to as "additional information".
  • Information related to the imaging device: for example, data indicating imaging settings in the imaging device, such as zoom magnification, angle of view, and focal length
  • Information related to the imaged area: for example, data indicating the imaged direction, data indicating the imaged orientation, data indicating the elevation angle of the imaging device at the time of imaging, data indicating the altitude of the imaging device at the time of imaging, and data indicating the angular velocity of the imaging device at the time of imaging
  • Information related to the imaged situation: for example, data indicating the acceleration of the imaging device at the time of imaging
  • Information related to the time of imaging: for example, data indicating the time at which the image was captured
  • The additional information according to the present embodiment can include, for example, data that can be used to identify the area in which the target image may have been captured (hereinafter referred to as the "imageable area").
  • By using the imageable area, it becomes easier to distinguish and recognize, for example, "characters seen from an expressway versus characters seen from an overpass" or "characters seen from the roof of a building versus characters seen from the first floor" (for example, by data indicating altitude), "characters close to versus far from the camera", and "large characters versus small characters" (for example, by data indicating the focal-length setting).
  • The data indicating the acceleration of the imaging device makes it possible to distinguish, for example, imaging while walking from imaging while moving in a vehicle.
  • By identifying the imaged situation, it becomes easier to recognize, for example, characters that a pedestrian can see versus characters that can be seen from a train.
  • In addition, since the time or time zone at which the imaging device performed the imaging can be specified, the environment in which the image was captured can be narrowed down.
  • By identifying the imaged environment, it becomes easier to recognize, for example, "street display characters illuminated only at night", "a lunch menu versus a dinner menu", "signboard characters that vary depending on the season", and the like.
  • When additional information is included in the reference information, the character candidates extracted from the reference information can be narrowed down to characters corresponding to a limited imageable area, a limited imaged situation, or a limited imaged environment. That is, when additional information is included in the reference information, the character candidates extracted from the reference information can be filtered by the additional information.
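A minimal sketch of such filtering, assuming a hypothetical record layout in which each stored character carries the hours during which it is visible and the imaging altitude (all field names and values are illustrative, not from the publication):

```python
# Each entry pairs a stored character string with hypothetical additional
# information: the hours during which it is visible and the imaging altitude.
CANDIDATES = [
    {"text": "Lunch Menu",  "hour_range": (11, 15), "altitude": 0.0},
    {"text": "Dinner Menu", "hour_range": (17, 23), "altitude": 0.0},
    {"text": "Rooftop Bar", "hour_range": (17, 23), "altitude": 30.0},
]

def filter_by_additional_info(candidates, hour, altitude, max_alt_diff=5.0):
    """Keep only candidates whose stored imaging conditions match the query."""
    return [
        c["text"] for c in candidates
        if c["hour_range"][0] <= hour <= c["hour_range"][1]
        and abs(c["altitude"] - altitude) <= max_alt_diff
    ]

# An image captured at noon at ground level should match only the lunch menu.
print(filter_by_additional_info(CANDIDATES, hour=12, altitude=0.0))
```

The same pattern extends to any of the additional-information fields listed above (direction, acceleration, angular velocity, and so on).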
  • FIGS. 2A and 2B are explanatory diagrams showing examples of the reference information according to the present embodiment.
  • FIGS. 2A and 2B show an example in which the reference information is a database (or table).
  • In the reference information, for example, characters (A in FIG. 2A, A in FIG. 2B), information indicating the imaged position (B in FIG. 2A, B in FIG. 2B), and additional information (C in FIG. 2A, C in FIG. 2B) are stored in association with each other. Further, for example, as shown in FIGS. 2A and 2B, data indicating the language of the characters (D in FIG. 2A, D in FIG. 2B) may be stored in the reference information in association with the characters.
  • Examples of characters stored in the reference information include place names as shown in FIG. 2A.
  • However, the characters stored in the reference information are not restricted to place names. For example, the reference information may store arbitrary characters that can be included in a captured image, such as a store name like "Sakura" or "Black pig pork bone shabu-shabu" shown in FIG. 2B.
  • The additional information stored in the reference information includes, for example, data indicating a direction (for example, data indicating an angle with respect to a reference direction), data indicating an altitude, data indicating an acceleration, data indicating an angular velocity, and the like.
  • the information processing apparatus selectively corrects characters recognized from the captured image using the character candidates, and determines the characters recognized from the captured image. Therefore, when additional information is included in the reference information, the recognition accuracy of characters recognized from the captured image can be further improved.
  • the reference information according to the present embodiment is not limited to the examples illustrated in FIGS. 2A and 2B.
  • For example, the reference information according to the present embodiment does not have to include one or both of the additional information (C in FIG. 2A, C in FIG. 2B) and the data indicating the language of the characters (D in FIG. 2A, D in FIG. 2B).
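One possible in-memory shape for a reference-information record, mirroring the columns sketched in FIGS. 2A and 2B (the field names are assumptions for illustration; as noted above, the additional information and language fields may be absent):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ReferenceRecord:
    text: str                   # character(s) recognized in the past (A)
    latitude: float             # information indicating the imaged position (B)
    longitude: float
    additional: dict = field(default_factory=dict)  # optional additional info (C)
    language: Optional[str] = None                  # optional language tag (D)

record = ReferenceRecord(text="Sakura", latitude=35.68, longitude=139.76,
                         additional={"direction_deg": 90.0}, language="ja")
print(record.text, record.language)
```

In practice such records would live in a shared database, since the reference information acts as collective intelligence across many apparatuses.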
  • the information processing apparatus performs, for example, (1) character candidate extraction control processing and (2) character determination processing as processing related to the information processing method according to the present embodiment.
  • Character candidate extraction control processing causes characters corresponding to information related to imaging to be extracted as character candidates from the reference information based on information related to imaging corresponding to the target image.
  • the information processing apparatus acquires, for example, information regarding imaging from an imaging device that has performed imaging, and uses the acquired information regarding imaging as information regarding imaging corresponding to the target image.
  • For example, the information processing apparatus acquires the information related to imaging from the imaging device by receiving, via a communication unit (described later) or the like, the information related to imaging that the imaging device that performed the imaging transmits on its own initiative.
  • Alternatively, the information processing apparatus may transmit a transmission request for the information related to imaging corresponding to the target image to the imaging device that performed the imaging, and acquire the information related to imaging from the imaging device by receiving, via a communication unit (described later) or the like, the information related to imaging that the imaging device transmits in response to the transmission request.
  • The transmission request includes, for example, a transmission command for the information related to imaging and information indicating the target image (for example, data with which the target image can be specified, such as data indicating the name of the target image, the ID of the target image, or the time stamp of the target image).
  • The information processing apparatus according to the present embodiment may also acquire, from the imaging device, detection data of various devices related to the generation of information related to imaging that are included in the imaging device or connected to the imaging device, and use the acquired detection data as the information related to imaging.
  • In addition, the information processing apparatus acquires, for example, data indicating imaging settings (an example of information related to the imaging device) from the imaging device, and uses the acquired data indicating the imaging settings as information related to imaging.
  • examples of various devices related to generation of information related to imaging include the devices shown below.
  • the device related to generation of information related to imaging is not limited to the example shown below, and may be any device capable of generating information related to imaging.
  • GPS (Global Positioning System) device: an example of a device that generates information indicating the imaged position
  • Gyro sensor: an example of a device that generates information related to the imaged area
  • Acceleration sensor: an example of a device that generates information related to the imaged situation
  • Clock: an example of a device that generates information related to the time of imaging
  • the information related to imaging is not limited to being generated by a device included in the imaging device or a device connected to the imaging device.
  • For example, when the imaging device or an apparatus including the imaging device can communicate with a base station (an example of an external device of the imaging device), the base station can detect the position of the imaging device that performed the imaging by cell-based positioning or the like.
  • The base station can also specify the time at which the imaging was performed through the communication.
  • Similarly, when the imaging device or an apparatus including the imaging device can communicate over a wireless LAN (Local Area Network) using IEEE 802.11 or the like, a server (an example of an external device of the imaging device) that communicates with the imaging device can specify information related to the imaging performed by the imaging device.
  • the information regarding imaging may be data generated in an external device of the imaging device such as a base station or a server.
  • In this case, the information processing apparatus according to the present embodiment acquires the information related to imaging generated in the external device either directly from the external device or via the imaging device or an apparatus including the imaging device. Then, the information processing apparatus according to the present embodiment uses the acquired information related to imaging as the information related to imaging corresponding to the target image.
  • The information processing apparatus according to the present embodiment may also read out, from a recording medium, the information related to imaging associated with the captured image that is the target image, and use the read information related to imaging.
  • the method of acquiring information related to imaging corresponding to the target image in the information processing apparatus according to the present embodiment is not limited to the example shown above.
  • When the information related to imaging corresponding to the target image is acquired and the reference information is stored in a storage unit (described later) or an external recording medium, the information processing apparatus according to the present embodiment extracts, from the reference information, the characters associated with the acquired information related to imaging. Then, the information processing apparatus according to the present embodiment uses the characters extracted from the reference information as character candidates.
  • For example, the information processing apparatus searches the reference information for information related to imaging that matches the acquired information related to imaging, and extracts the characters associated with the retrieved information related to imaging.
  • Alternatively, the information processing apparatus may search the reference information for information related to imaging indicating values that fall within a range obtained by adding or subtracting a set value to or from each value indicated by the acquired information related to imaging, and extract the characters associated with the retrieved information related to imaging.
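A sketch of this range search, assuming latitude/longitude fields and a fixed tolerance value (the record layout, place names, and tolerance are illustrative assumptions):

```python
# Hypothetical reference information keyed by imaged position.
REFERENCE_INFO = [
    {"text": "Shibuya",  "lat": 35.658, "lon": 139.702},
    {"text": "Shinjuku", "lat": 35.690, "lon": 139.700},
    {"text": "Yokohama", "lat": 35.444, "lon": 139.638},
]

def extract_candidates(lat, lon, tolerance=0.01):
    """Return stored characters whose position lies within +/- tolerance of (lat, lon)."""
    return [
        r["text"] for r in REFERENCE_INFO
        if abs(r["lat"] - lat) <= tolerance and abs(r["lon"] - lon) <= tolerance
    ]

print(extract_candidates(35.659, 139.701))  # only Shibuya is within tolerance
```

Exact matching, as in the preceding paragraph, corresponds to a tolerance of zero.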
  • Alternatively, the information processing apparatus according to the present embodiment may transmit a character candidate transmission request to an external apparatus in which the reference information is stored.
  • Examples of the external device to which the information processing apparatus according to the present embodiment transmits a character candidate transmission request include a management apparatus according to the present embodiment described later.
  • The information processing apparatus according to the present embodiment transmits the character candidate transmission request, for example, via a communication unit (described later) included in the information processing apparatus or via an external communication device connected to the information processing apparatus. For the communication, for example, a wired network such as a LAN or a WAN (Wide Area Network), a wireless network such as a wireless LAN (WLAN: Wireless Local Area Network) or a wireless WAN via a base station (WWAN: Wireless Wide Area Network), or the Internet using a communication protocol such as TCP/IP (Transmission Control Protocol/Internet Protocol) can be used.
  • the character candidate transmission request includes, for example, a command for extracting character candidates from reference information and transmitting them, and information related to imaging corresponding to the target image.
  • In response to the character candidate transmission request, the external device that has received the request extracts, from the reference information, the characters associated with the received information related to imaging, in the same way as when the information processing apparatus according to the present embodiment extracts characters from the reference information. The external device then transmits the extracted characters to the information processing apparatus according to the present embodiment, which uses the transmitted characters (the characters extracted from the reference information) as character candidates.
  • When the reference information includes one or more characters corresponding to the acquired information related to imaging, the information processing apparatus according to the present embodiment obtains one or more character candidates by performing the character candidate extraction control process described above. When the reference information does not include a character corresponding to the acquired information related to imaging, no character candidate is obtained.
  • the information processing apparatus does not extract character candidates when information regarding imaging corresponding to the target image is not acquired. Therefore, when information regarding imaging corresponding to the target image is not acquired, a character candidate cannot be obtained.
  • In the character determination process, the information processing apparatus according to the present embodiment selectively corrects the characters recognized from the target image, based on the character candidates extracted from the reference information in the process (1) (character candidate extraction control process), and determines the characters to be recognized from the target image.
  • examples of the character recognized from the target image include a character recognized by the external device of the information processing apparatus according to the present embodiment from the captured image that is the target image.
  • the character recognized from the target image may be a character recognized from the captured image that is the target image by the information processing apparatus according to the present embodiment, for example.
  • The external device or the information processing apparatus according to the present embodiment recognizes characters from the target image by performing on it, for example, the processing of the character area detection processing unit 12 and the processing of the character recognition processing unit 14 illustrated in FIG. 1.
  • Hereinafter, the process of recognizing characters from the target image is referred to as the "character recognition process".
  • a character recognized from the target image may be referred to as a “recognized character”.
  • the information processing apparatus selectively corrects the recognized character based on the similarity between the recognized character and the character candidate.
  • the similarity according to the present embodiment is an index that quantitatively indicates how similar the recognized character and the character candidate are.
  • the similarity according to the present embodiment can be obtained, for example, by calculating the Jaro-Winkler distance or the Levenshtein distance between the recognized character and the character candidate.
  • the similarity calculation method according to the present embodiment is not limited to the above.
  • The information processing apparatus according to the present embodiment can calculate the similarity using any method that can quantitatively determine how similar the recognized character and the character candidate are.
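For example, a 0-to-1 similarity can be derived from the Levenshtein distance mentioned above (this normalization is one common convention, sketched here for illustration; the publication does not prescribe a specific formula):

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance computed by the classic dynamic-programming recurrence."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def similarity(recognized: str, candidate: str) -> float:
    """Map edit distance into a similarity in [0, 1]; 1 means an exact match."""
    if not recognized and not candidate:
        return 1.0
    return 1.0 - levenshtein(recognized, candidate) / max(len(recognized), len(candidate))

print(similarity("Shibuya", "Shibuya"))  # 1.0
print(similarity("Sh1buya", "Shibuya"))  # one substitution out of seven characters
```

The Jaro-Winkler distance mentioned above could be substituted here without changing the surrounding logic.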
• The information processing apparatus according to the present embodiment performs, for example, the following processes (2-1) to (2-4) as the processes based on the degree of similarity.
• When a plurality of character candidates are obtained in the process (1) (character candidate extraction control process), the information processing apparatus according to the present embodiment calculates the degree of similarity for each of the obtained character candidates.
• The information processing apparatus according to the present embodiment then performs the processes shown in (2-1) to (2-4) on the character candidate corresponding to the highest degree of similarity among the obtained character candidates.
  • the information processing apparatus determines the corrected character as a character recognized from the target image.
• When the degree of similarity is represented by a value of 0 ≤ similarity ≤ 1, a similarity of 1 indicates that the recognized character matches the character candidate.
  • examples of the threshold value according to the present embodiment include a preset fixed value and a variable value that can be changed by a user operation.
• When the degree of similarity is represented by a value of 0 ≤ similarity ≤ 1, examples of the threshold value include 0.7 and 0.8.
• The threshold value according to the present embodiment is not restricted to the examples shown above.
• The information processing apparatus according to the present embodiment corrects the recognized character to the character candidate, and determines the corrected character as the character to be recognized from the target image.
• Since the character candidate is a character extracted from the reference information, which includes characters recognized from captured images in the past, when the degree of similarity indicates that the recognized character matches the character candidate, the recognized character is likely to be a character correctly recognized from the target image. Therefore, in the above case, the information processing apparatus according to the present embodiment does not correct the recognized character, and determines the recognized character as the character to be recognized from the target image.
• In the above case, the information processing apparatus according to the present embodiment corrects the recognized character without using the character candidate. Then, the information processing apparatus according to the present embodiment determines the corrected character as the character to be recognized from the target image.
• The case where the degree of similarity is equal to or less than the predetermined threshold value corresponds to a case where the recognized character is not similar to the character candidate. Therefore, in the above case, the information processing apparatus according to the present embodiment does not correct the recognized character with the character candidate.
• In the above case, the information processing apparatus according to the present embodiment corrects the recognized character, for example, by performing the same processing as the character determination processing unit 16 illustrated in FIG. 1.
• The information processing apparatus according to the present embodiment can also perform a process combining two or more of the processes according to the first example shown in (2-1) to the third example shown in (2-3).
• The information processing apparatus according to the present embodiment selectively corrects the recognized character based on the degree of similarity by performing, as the process based on the degree of similarity, one of the processes according to the first example shown in (2-1) to the fourth example shown in (2-4).
• When no character candidate is extracted in the process (1) (character candidate extraction control process), the information processing apparatus according to the present embodiment corrects the recognized character without using a character candidate, as in the process according to the third example shown in (2-3). Then, the information processing apparatus according to the present embodiment determines the corrected character as the character to be recognized from the target image.
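Taken together, the branches (2-1) to (2-4) above might be sketched as follows; `similarity` here is a stand-in metric normalized to [0, 1] built on Python's `difflib`, and `dictionary_correct` is a hypothetical placeholder for the word-dictionary correction performed by the character determination processing unit 16.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # Stand-in similarity metric normalized to [0, 1].
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def dictionary_correct(recognized: str) -> str:
    # Placeholder for correction with a word dictionary (unit 16 in FIG. 1).
    return recognized

def determine_character(recognized: str, candidates: list[str],
                        threshold: float = 0.7) -> str:
    """Selectively correct `recognized` using character candidates."""
    if not candidates:
        # No candidates extracted: correct without using a candidate.
        return dictionary_correct(recognized)
    # (2-4): when several candidates exist, use the most similar one.
    best = max(candidates, key=lambda c: similarity(recognized, c))
    score = similarity(recognized, best)
    if score == 1.0:
        # (2-1): similarity indicates a match, so no correction is needed.
        return recognized
    if score >= threshold:
        # (2-2): similar enough, so correct to the candidate.
        return best
    # (2-3): not similar to any candidate, so correct without candidates.
    return dictionary_correct(recognized)
```

With a threshold of 0.7, `determine_character("Sakuramotocho", ["Sakuragicho", "Noge"])` would be corrected to "Sakuragicho".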
• As described above, the information processing apparatus according to the present embodiment performs, for example, the process (1) (character candidate extraction control process) and the process (2) (character determination process) as the processes related to the information processing method according to the present embodiment.
• The information processing apparatus according to the present embodiment selectively corrects the recognized character based on the character candidates in the process (2) (character determination process), thereby determining the character to be recognized from the target image.
  • the character candidates according to the present embodiment are characters extracted from the reference information that plays the role of collective intelligence by the process (1) (character candidate extraction control process).
• Therefore, even when the recognized character has been misrecognized as an existing but different character string, the information processing apparatus according to the present embodiment can correct the recognized character based on the character candidates and determine the character to be recognized from the target image.
• For example, even when a character included in the target image is a character related to guidance to a building or place away from the imaged position, such as "XX direction", or a character indicating a remote place name, the information processing apparatus according to the present embodiment can correct the recognized character based on the character candidates and determine the character to be recognized from the target image.
• Further, even when a character is not normally recognized from the target image because part of the character included in the target image is missing or blurred, the information processing apparatus according to the present embodiment can correct the recognized character based on the character candidates and determine the character to be recognized from the target image.
  • the information processing apparatus can improve the recognition accuracy of characters recognized from the captured image.
• Since the information processing apparatus according to the present embodiment determines the character to be recognized from the target image using collective intelligence, the recognition speed of characters recognized from the captured image can also be increased.
• Since the information processing apparatus according to the present embodiment can determine the character to be recognized from the target image based on character candidates obtained after the collective intelligence to be referred to has been narrowed down, it is possible to determine the character to be recognized from the target image at higher speed and with higher accuracy.
  • processing according to the information processing method according to the present embodiment is not limited to the processing (1) (character candidate extraction control processing) and the processing (2) (character determination processing).
• The information processing apparatus according to the present embodiment can also perform, for example, one or both of the following processes (3) and (4) as processes related to the information processing method according to the present embodiment.
• In the process (3) (recording control process), the information processing apparatus according to the present embodiment records the character determined in the process (2) (character determination process) in the reference information in association with the information related to imaging corresponding to the target image.
• For example, each time a character is determined in the process (2) (character determination process), the information processing apparatus according to the present embodiment records the character and the information related to imaging corresponding to the target image in the reference information.
• Alternatively, the information processing apparatus according to the present embodiment may record the determined character and the information related to imaging corresponding to the target image in the reference information in association with each other only when the combination of the determined character and the information related to imaging has been obtained a set number of times or more.
• By controlling the recording in the reference information as described above, for example, more accurate characters can be stored in the reference information, and the possibility that arbitrary characters due to malicious intent are recorded in the reference information can be reduced.
  • the set number of times according to the present embodiment may be a preset fixed value or a variable value that can be changed based on a user operation or the like.
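A minimal sketch of this recording control, under the assumption that a (character, imaging information) combination is committed to the reference information only once it has been observed the set number of times; the class and attribute names are invented for this sketch.

```python
from collections import Counter

class ReferenceInfoRecorder:
    """Record (character, imaging info) pairs once seen often enough."""

    def __init__(self, set_number_of_times: int = 3):
        # May be a preset fixed value or changed based on a user operation.
        self.set_number_of_times = set_number_of_times
        self.counts: Counter = Counter()
        self.reference_info: list[tuple] = []

    def report(self, character: str, imaging_info: tuple) -> bool:
        """Count the combination; record it when the threshold is reached."""
        key = (character, imaging_info)
        self.counts[key] += 1
        if self.counts[key] == self.set_number_of_times:
            self.reference_info.append(key)
            return True
        return False
```
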
• When the reference information is stored in a recording medium included in an external device capable of communicating with the information processing apparatus according to the present embodiment via a network (or directly), or in an external recording medium connected to the external device, the information processing apparatus according to the present embodiment transmits an update request for updating the reference information to the external device.
• Examples of the external device to which the information processing apparatus according to the present embodiment transmits the update request include a management apparatus according to the present embodiment, which will be described later.
• The information processing apparatus according to the present embodiment transmits the update request via a communication unit (described later) included in the information processing apparatus according to the present embodiment or an external communication device connected to the information processing apparatus according to the present embodiment.
• The update request includes, for example, a command for causing the determined character and the information related to imaging corresponding to the target image to be recorded in the reference information in association with each other, the determined character, and the information related to imaging corresponding to the target image.
  • the external device that has received the update request records the characters included in the update request and information related to imaging in the reference information.
  • processing in the external device that receives the update request is not limited to the above.
• The external device that has received the update request may record the character included in the update request and the information related to imaging in the reference information in association with each other only when the combination of the character and the information related to imaging has been obtained a set number of times or more.
• In the process (4) (character recognition process), the information processing apparatus according to the present embodiment recognizes characters from the target image. As described above, the information processing apparatus according to the present embodiment recognizes characters from the target image, for example, by performing the processing of the character area detection processing unit 12 and the processing of the character recognition processing unit 14 illustrated in FIG. 1.
• The information processing apparatus according to the present embodiment selectively corrects the character recognized by the character recognition process according to the present embodiment in the process (2) (character determination process), and determines the character to be recognized from the target image.
• The information processing apparatus according to the present embodiment performs, for example, one of the following combinations as the processes related to the information processing method according to the present embodiment: (I) "the process (1) (character candidate extraction control process) and the process (2) (character determination process)", (II) "the process (1) (character candidate extraction control process), the process (2) (character determination process), and the process (3) (recording control process)", or (III) "the process (1) (character candidate extraction control process), the process (2) (character determination process), and the process (4) (character recognition process)".
• By the process (1) (character candidate extraction control process) and the process (2) (character determination process), the information processing apparatus according to the present embodiment can selectively correct the character recognized based on the character candidates and determine the character to be recognized from the target image.
• Note that each of the process (1) (character candidate extraction control process), the process (2) (character determination process), the process (3) (recording control process), and the process (4) (character recognition process) is obtained by dividing the processing related to the information processing method according to the present embodiment for convenience. Therefore, in the processing related to the information processing method according to the present embodiment, these processes can be regarded as one process, or can be regarded as two or more processes (depending on an arbitrary way of division).
  • FIG. 3 is a flowchart showing an example of processing related to the information processing method according to this embodiment.
  • the processes of steps S100 and S102 correspond to the process (4) (character recognition process).
  • the processes in steps S104 and S112 correspond to the process (1) (character candidate extraction control process).
  • the process of step S108 and the processes of steps S114 to S120 correspond to the process (2) (character determination process).
  • the information processing apparatus detects a character area from the target image (S100).
• The information processing apparatus according to the present embodiment detects the character area from the target image, for example, by performing the same processing as the character area detection processing unit 12 illustrated in FIG. 1.
• The information processing apparatus according to the present embodiment acquires the target image, for example, by receiving a captured image transmitted from an imaging device via a communication unit (described later) or the like.
  • the information processing apparatus may acquire the target image by reading out the data from a recording medium in which data indicating the captured image is stored, for example.
  • the information processing apparatus recognizes characters from the target image (S102).
• The information processing apparatus according to the present embodiment recognizes characters from the target image, for example, by performing processing similar to that of the character recognition processing unit 14 illustrated in FIG. 1.
  • the information processing apparatus acquires information related to imaging corresponding to the target image (S104).
• The information processing apparatus according to the present embodiment acquires the information related to imaging corresponding to the target image, for example, by receiving the information related to imaging transmitted from an imaging device via a communication unit (described later) or the like.
• The information processing apparatus according to the present embodiment may also acquire the information related to imaging, for example, by reading the information related to imaging corresponding to the target image from a recording medium in which data indicating the captured image and the information related to imaging are stored in association with each other.
  • the information processing apparatus determines whether or not information related to imaging corresponding to the target image has been acquired (S106).
• If it is not determined in step S106 that the information related to imaging corresponding to the target image has been acquired, the information processing apparatus according to the present embodiment corrects the recognized character using a word dictionary, as in the character determination processing unit 16 illustrated in FIG. 1 (S108). Then, the information processing apparatus according to the present embodiment determines the corrected character as the character to be recognized from the target image.
• If it is determined in step S106 that the information related to imaging corresponding to the target image has been acquired, the information processing apparatus according to the present embodiment extracts characters from the reference information based on the acquired information related to imaging (S110).
• The information processing apparatus according to the present embodiment extracts characters from the reference information, for example, by searching the reference information itself or by causing an external device to search the reference information.
  • the information processing apparatus determines whether a character candidate has been obtained (S112).
• The processing in step S112 can be said to be processing for determining whether or not there is a character that has been recognized in the past in the imageable area corresponding to the information related to imaging.
• If it is not determined in step S112 that a character candidate has been obtained, the information processing apparatus according to the present embodiment performs the process of step S108 and determines the corrected character as the character to be recognized from the target image.
  • the information processing apparatus calculates the similarity between the recognized character and the character candidate (S114).
  • the information processing apparatus calculates, for example, a Jaro-Winkler distance between a recognized character and a character candidate, and sets the value obtained as a result of the calculation as a similarity.
• The information processing apparatus according to the present embodiment determines whether or not the degree of similarity indicates that the recognized character and the character candidate completely match (S116). For example, when the degree of similarity is represented by a value of 0 ≤ similarity ≤ 1, the information processing apparatus according to the present embodiment determines that the recognized character and the character candidate completely match when the similarity is "1".
• When it is determined in step S116 that the recognized character and the character candidate completely match, the information processing apparatus according to the present embodiment does not correct the recognized character, and determines the recognized character as the character to be recognized from the target image.
• When it is not determined in step S116 that the recognized character and the character candidate completely match, the information processing apparatus according to the present embodiment determines whether or not the degree of similarity is equal to or greater than a predetermined threshold value (S118). Note that the information processing apparatus according to the present embodiment may instead determine in step S118 whether or not the degree of similarity is greater than the predetermined threshold value.
• If it is determined in step S118 that the degree of similarity is equal to or greater than the predetermined threshold value, the information processing apparatus according to the present embodiment corrects the recognized character using the character candidate (S120). Then, the information processing apparatus according to the present embodiment determines the corrected character as the character to be recognized from the target image.
• If it is not determined in step S118 that the degree of similarity is equal to or greater than the predetermined threshold value, the information processing apparatus according to the present embodiment performs the process of step S108 and determines the corrected character as the character to be recognized from the target image.
  • the information processing apparatus determines a character to be recognized from the target image, for example, by performing the process shown in FIG. 3 as the process related to the information processing method according to the present embodiment.
  • FIG. 4 is an explanatory diagram illustrating an example of character recognition by processing according to the information processing method according to the present embodiment.
• A shown in FIG. 4 shows an example of the target image.
• B shown in FIG. 4 shows an example of the character recognized from the target image.
• C shown in FIG. 4 shows an example of the character recognized from the target image, which is determined by the processing related to the information processing method according to the present embodiment.
• In the example shown in FIG. 4, the recognized character is "Sakuramotocho", and the acquired information related to imaging indicates the following.
• Latitude: 35.450918 (an example of information indicating the position)
• Longitude: 139.631073 (an example of information indicating the position)
• Direction: 115 [°] (an example of the additional information)
• Altitude: N/A (an example of the additional information)
• Based on the information indicating the position and the additional information included in the information related to imaging, the information processing apparatus according to the present embodiment specifies, as the imageable area, for example, the range within a radius of X [km] from the position and within "115 − t [°]" to "115 + t [°]" clockwise with the north direction as the reference direction (where t is a threshold value related to direction, and X is a threshold value that correlates with the resolution of the imaging device).
  • the information processing apparatus extracts characters corresponding to an imageable area specified based on information related to imaging from the reference information illustrated in FIG. 2A to obtain character candidates. For example, from the reference information shown in FIG. 2A, “Sakuragicho”, “Noge”, “Yokohama World Porters” and “Yokohama Museum of Art” are obtained as character candidates.
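One way the imageable area could be realized is as a radius-and-bearing filter over entries of the reference information, as in the following sketch; the haversine distance, the entry layout, and the sample thresholds are all assumptions of this sketch.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Bearing from point 1 to point 2, clockwise from north, in [0, 360)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlmb = math.radians(lon2 - lon1)
    y = math.sin(dlmb) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def extract_candidates(reference_info, lat, lon, direction, x_km=2.0, t_deg=30.0):
    """Characters recorded within radius X and within direction +/- t."""
    out = []
    for ch, elat, elon in reference_info:  # entries: (character, latitude, longitude)
        if haversine_km(lat, lon, elat, elon) > x_km:
            continue
        diff = abs((bearing_deg(lat, lon, elat, elon) - direction + 180.0) % 360.0 - 180.0)
        if diff <= t_deg:
            out.append(ch)
    return out
```
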
  • the information processing apparatus calculates a similarity for each obtained character candidate.
• In the example shown in FIG. 4, the information processing apparatus according to the present embodiment selects "Sakuragicho", which has the highest degree of similarity to the recognized character "Sakuramotocho", and corrects the recognized character "Sakuramotocho" to "Sakuragicho". Then, the information processing apparatus according to the present embodiment determines the corrected character "Sakuragicho" as the character to be recognized from the target image.
• As described above, in the information processing method according to the present embodiment, the character recognized from the captured image is selectively corrected using so-called collective intelligence. Therefore, by using the information processing method according to the present embodiment, even if the captured image includes characters that are easily misrecognized, such as "rn" (r and n) and "m", the character recognized from the captured image can be corrected to the correct character.
  • the information processing apparatus performs the process (3) (recording control process) to update the reference information, and as a result, the collective intelligence is updated.
  • the reference information update method according to the present embodiment is not limited to the method in which the information processing apparatus according to the present embodiment performs the process (3) (recording control process).
  • the information processing apparatus according to the present embodiment may transmit the determined character and information regarding imaging corresponding to the target image to the external apparatus, and the external apparatus may update the reference information.
  • An example of the configuration of the external device capable of updating the reference information according to the present embodiment will be described using a management device according to the present embodiment described later as an example.
• The character determined by the processing related to the information processing method according to the present embodiment can be used, for example, to present to the user a translation of a character written on a guide board, a sign, a sightseeing spot guide, a restaurant menu, a route map of public transportation, a map around a station, or the like.
  • FIG. 5 is an explanatory diagram showing a usage example of characters determined by the processing according to the information processing method according to the present embodiment.
  • FIG. 5 shows an example in which a character obtained by translating the character written on the sign is displayed on the display screen of the device owned by the user.
• As another example, the character determined by the processing related to the information processing method according to the present embodiment can be recorded on a recording medium as metadata of the captured image serving as the target image, in association with the captured image.
  • the metadata stored in the recording medium is used for image classification, image search, image filtering, and the like.
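As a small illustration of such use of the metadata, captured images can be filtered by a recognized character recorded in their metadata; the metadata layout is an assumption of this sketch.

```python
def filter_images_by_character(images: list[dict], character: str) -> list[str]:
    """Return file names of images whose metadata records the character."""
    return [img["file"] for img in images
            if character in img.get("recognized_characters", [])]
```
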
  • FIG. 6 is a block diagram illustrating an example of the configuration of the information processing apparatus 100 according to the present embodiment.
  • the information processing apparatus 100 includes, for example, a character recognition unit 102, a character candidate extraction control unit 104, and a character determination processing unit 106.
• The information processing apparatus 100 may also include, for example, a control unit (not shown), a ROM (Read Only Memory, not shown), a RAM (Random Access Memory, not shown), a communication unit (not shown) for communicating with an external device, a storage unit (not shown), and the like.
  • the control unit (not shown) is configured by, for example, a processor configured by an arithmetic circuit such as an MPU, various circuits, and the like, and controls the information processing apparatus 100 as a whole.
• The control unit (not shown) may serve as one or more of the character recognition unit 102, the character candidate extraction control unit 104, and the character determination processing unit 106.
• Needless to say, the character recognition unit 102, the character candidate extraction control unit 104, and the character determination processing unit 106 may each be configured by a dedicated (or general-purpose) circuit capable of realizing the processing of the corresponding unit.
  • ROM (not shown) stores control data such as programs and calculation parameters used by a control unit (not shown).
  • a RAM (not shown) temporarily stores programs executed by a control unit (not shown).
• The communication unit (not shown) is communication means included in the information processing apparatus 100, and serves to perform communication with an external device wirelessly or by wire via a network (or directly).
• Examples of the communication unit (not shown) include a communication antenna and an RF (Radio Frequency) circuit (wireless communication), an IEEE 802.15.1 port and a transmission/reception circuit (wireless communication), an IEEE 802.11 port and a transmission/reception circuit (wireless communication), and a LAN (Local Area Network) terminal and a transmission/reception circuit (wired communication).
  • the storage unit (not shown) is a storage unit included in the information processing apparatus 100, and stores various data such as applications.
• The storage unit (not shown) may also store data related to the processing of the information processing method according to the present embodiment, such as image data indicating captured images, information related to imaging corresponding to the image data, and the reference information.
  • examples of the storage unit include a magnetic recording medium such as a hard disk, a non-volatile memory such as a flash memory, and the like. Further, the storage unit (not shown) may be detachable from the information processing apparatus 100.
• The character recognition unit 102 takes a leading role in performing the process (4) (character recognition process), and recognizes characters from the captured image serving as the target image.
  • the character recognition unit 102 includes, for example, a character region detection processing unit 108 having the same function as the character region detection processing unit 12 shown in FIG. 1 and a character recognition processing unit 110 having the same function as the character recognition processing unit 14. In the same manner as the image processing apparatus 10 shown in FIG. 1, characters are recognized from the captured image.
• The character candidate extraction control unit 104 takes a leading role in performing the process (1) (character candidate extraction control process), and extracts, from the reference information, characters corresponding to the information related to imaging as character candidates, based on the information related to imaging corresponding to the target image.
  • FIG. 6 shows an example in which information related to imaging includes information related to position and additional information.
  • the character candidate extraction control unit 104 does not extract the character candidates when the information related to the imaging corresponding to the target image is not acquired.
• The character determination processing unit 106 takes a leading role in performing the process (2) (character determination process), and selectively corrects the character recognized from the target image based on the character candidates obtained by the character candidate extraction control unit 104, thereby determining the character to be recognized from the target image.
  • the character determination processing unit 106 performs, for example, any one of the processing according to the first example shown in (2-1) to the processing according to the fourth example shown in (2-4) above. The character to be recognized from the target image is determined.
• FIG. 6 shows an example in which the character determination processing unit 106 includes a first character determination processing unit 112 and a second character determination processing unit 114 and performs the process according to the fourth example shown in (2-4) above.
  • the first character determination processing unit 112 performs, for example, the process of step S106 in FIG. 3 and the processes of steps S110 to S120.
  • the second character determination processing unit 114 performs, for example, the process of step S108 in FIG.
• With the configuration shown in FIG. 6, for example, the information processing apparatus 100 performs the processes related to the information processing method according to the present embodiment (for example, the process (1) (character candidate extraction control process), the process (2) (character determination process), and the process (4) (character recognition process)).
  • the information processing apparatus 100 can improve the recognition accuracy of characters recognized from the captured image, for example, with the configuration shown in FIG.
  • the information processing apparatus 100 can exhibit the effects exhibited by performing the processing related to the information processing method according to the present embodiment as described above, for example.
• For example, the information processing apparatus according to the present embodiment can be configured without the character recognition unit 102 shown in FIG. 6.
• Even with a configuration without the character recognition unit 102, the information processing apparatus according to the present embodiment can perform the process (2) (character determination process) using the character candidates obtained in the process (1) (character candidate extraction control process) and a character recognized by an external device from the captured image serving as the target image. Therefore, even with a configuration without the character recognition unit 102, the information processing apparatus according to the present embodiment can improve the recognition accuracy of characters recognized from the captured image.
  • the information processing apparatus according to the present embodiment may further include a recording control unit (not shown) that performs the process (3) (recording control process).
  • in the case of further including a recording control unit (not shown), the reference information can be updated in the manner of so-called collective intelligence. Therefore, since the possibility that an erroneously recognized character can be corrected by the character candidates is increased, the recognition accuracy of characters recognized from the captured image can be improved.
  • the information processing apparatus may further include a processing unit that performs each process related to the usage example of the character determined by the process according to the information processing method according to the above-described embodiment.
  • the process (1) character candidate extraction control process
  • the process (2) character determination process
  • the process (3) record control process
  • the process (4) character recognition process
  • the configuration for realizing the processing related to the information processing method according to the present embodiment is not limited to the configuration shown in FIG. 12 or the configurations according to the above modifications; it is possible to adopt a configuration according to the way of dividing up the processing related to the information processing method according to the present embodiment.
  • the reference information according to the present embodiment can be updated with the character determined by the processing according to the information processing method according to the present embodiment and the information related to the imaging corresponding to the target image.
  • by updating the reference information, it becomes more likely that an erroneously recognized character can be corrected by the character candidates, so that the recognition accuracy of characters recognized from the captured image can be improved.
  • note that the process related to the update of the reference information may be performed in an external device of the information processing apparatus according to the present embodiment.
  • a management apparatus according to the present embodiment will be described as an apparatus capable of updating reference information.
  • FIG. 7 is a block diagram illustrating an example of the configuration of the management apparatus 200 according to the present embodiment. FIG. 7 further shows information processing apparatuses 100, ... connected to the management apparatus 200 via the network 300. That is, FIG. 7 illustrates an information processing system 1000 in which the management apparatus 200 and the information processing apparatuses 100, ... are connected via the network 300.
  • “one device and another device are connected” according to the present embodiment means, for example, a state in which one device and another device can communicate with each other.
  • the management apparatus 200 includes a communication unit 202, a storage unit 204, and a control unit 206, for example.
  • the management apparatus 200 may further include, for example, a ROM (not shown), a RAM (not shown), a storage unit (not shown), an operation unit (not shown) that can be operated by the user, and a display unit (not shown) that displays various screens on a display screen.
  • the management apparatus 200 connects the above-described components via, for example, a bus serving as a data transmission path.
  • a ROM (not shown) stores control data such as programs and calculation parameters used by the control unit 206.
  • a RAM (not shown) temporarily stores programs executed by the control unit 206.
  • examples of the operation unit (not shown) include an operation input device described later.
  • examples of the display unit (not shown) include a display device described later.
  • FIG. 8 is an explanatory diagram illustrating an example of a hardware configuration of the management apparatus 200 according to the present embodiment.
  • the management apparatus 200 includes, for example, an MPU 250, a ROM 252, a RAM 254, a recording medium 256, an input / output interface 258, an operation input device 260, a display device 262, and a communication interface 264.
  • the management device 200 connects each component with a bus 266 as a data transmission path, for example.
  • the MPU 250 includes, for example, one or more processors configured by an arithmetic circuit such as an MPU, various processing circuits, and the like, and functions as the control unit 206 that controls the entire management apparatus 200. Further, in the management apparatus 200, the MPU 250 serves as, for example, the processing unit 210 described later.
  • the ROM 252 stores programs used by the MPU 250 and control data such as calculation parameters.
  • the RAM 254 temporarily stores a program executed by the MPU 250, for example.
  • the recording medium 256 functions as the storage unit 204, and stores various data such as data related to the information processing method according to the present embodiment such as reference information and various applications.
  • examples of the recording medium 256 include a magnetic recording medium such as a hard disk and a non-volatile memory such as a flash memory.
  • the recording medium 256 may be detachable from the management apparatus 200.
  • the input / output interface 258 connects the operation input device 260 and the display device 262, for example.
  • the operation input device 260 functions as an operation unit (not shown), and the display device 262 functions as a display unit (not shown).
  • examples of the input / output interface 258 include a USB (Universal Serial Bus) terminal, a DVI (Digital Visual Interface) terminal, an HDMI (High-Definition Multimedia Interface) (registered trademark) terminal, and various processing circuits.
  • the operation input device 260 is provided on the management apparatus 200, for example, and is connected to the input / output interface 258 inside the management apparatus 200.
  • Examples of the operation input device 260 include buttons, direction keys, a rotary selector such as a jog dial, or a combination thereof.
  • the display device 262 is provided on the management apparatus 200, for example, and is connected to the input / output interface 258 inside the management apparatus 200.
  • Examples of the display device 262 include a liquid crystal display (LCD) and an organic EL display (Organic Electro-Luminescence Display; also called an OLED display (Organic Light Emitting Diode Display)).
  • the input / output interface 258 can also be connected to external devices of the management apparatus 200, such as an external operation input device (for example, a keyboard or a mouse) or an external display device.
  • the display device 262 may be a device capable of display and user operation, such as a touch screen.
  • the communication interface 264 is a communication unit included in the management apparatus 200, and functions as the communication unit 202 for performing wireless or wired communication with an external apparatus such as the information processing apparatus 100 via a network (or directly).
  • examples of the communication interface 264 include a communication antenna and an RF circuit, an IEEE 802.15.1 port and a transmission / reception circuit, an IEEE 802.11 port and a transmission / reception circuit, or a LAN terminal and a transmission / reception circuit.
  • the management apparatus 200 performs the process of updating the reference information with, for example, the hardware configuration described above. Note that the hardware configuration of the management apparatus 200 according to the present embodiment is not limited to this configuration.
  • the management apparatus 200 may not include the communication interface 264 when communicating with an external apparatus such as the information processing apparatus 100 via the connected external communication device. Further, the management apparatus 200 can be configured not to include the recording medium 256, the operation input device 260, and the display device 262.
  • the communication unit 202 is a communication unit included in the management apparatus 200 and communicates with an external apparatus such as the information processing apparatus 100 wirelessly or by wire via a network (or directly). Further, the communication of the communication unit 202 is controlled by the control unit 206, for example.
  • examples of the communication unit 202 include a communication antenna and an RF circuit, a LAN terminal, and a transmission / reception circuit, but the configuration of the communication unit 202 is not limited to the above.
  • the communication unit 202 can have a configuration corresponding to an arbitrary standard capable of performing communication, such as a USB terminal and a transmission / reception circuit, or an arbitrary configuration capable of communicating with an external device via a network.
  • the storage unit 204 is a storage unit included in the management apparatus 200, and stores various data such as data related to the information processing method according to the present embodiment such as reference information and various applications.
  • FIG. 7 shows an example in which the reference information 220 is stored in the storage unit 204.
  • examples of the storage unit 204 include a magnetic recording medium such as a hard disk and a nonvolatile memory such as a flash memory.
  • the storage unit 204 may be detachable from the management device 200.
  • the control unit 206 is configured by, for example, an MPU and plays a role of controlling the entire management apparatus 200.
  • the control unit 206 includes a processing unit 210, for example.
  • the processing unit 210 performs the process of updating the reference information. For example, when the character recognized from the captured image and the information related to the imaging corresponding to the captured image are acquired, the processing unit 210 records the acquired character and the information related to the imaging in association with each other in the reference information.
  • for example, when the processing unit 210 obtains a combination of the acquired character and the information related to the imaging corresponding to the target image more than a set number of times (or the set number of times or more), the processing unit 210 may record the acquired character and the information related to the imaging corresponding to the target image in association with each other in the reference information.
  • by recording a combination only after it has been obtained repeatedly, the processing unit 210 can store more accurate characters in the reference information, and the possibility that arbitrary characters are recorded in the reference information with malicious intent can be reduced.
  • the processing unit 210 updates the reference information by performing the processing as described above, for example.
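As a hedged sketch (the class name, the data layout, and the value of the set number of times are assumptions for illustration, not part of the disclosure), the update process described above, in which a combination of a character and imaging information is committed to the reference information only after it has been obtained a set number of times, might look like this:

```python
from collections import Counter

# Assumed value; the disclosure only says "a set number of times".
SET_COUNT = 3

class ReferenceInfo:
    """Toy reference-information store keyed by imaging information."""

    def __init__(self) -> None:
        self._observations = Counter()  # (character, imaging_info) -> count
        self._records = {}              # imaging_info -> character

    def report(self, character: str, imaging_info: tuple) -> None:
        """Register one observation of a (character, imaging info) combination."""
        key = (character, imaging_info)
        self._observations[key] += 1
        if self._observations[key] >= SET_COUNT:
            # Combination obtained the set number of times (or more):
            # commit it to the reference information.
            self._records[imaging_info] = character

    def lookup(self, imaging_info: tuple):
        """Return the recorded character for this imaging info, or None."""
        return self._records.get(imaging_info)
```

The count threshold is what filters out one-off recognition errors and maliciously submitted strings: a single report is never enough to alter the shared reference information.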
  • the processing performed by the processing unit 210 is not limited to the process of updating the reference information.
  • the processing unit 210 may perform a process of extracting characters that are character candidates from the reference information.
  • for example, when a character candidate transmission request is received, the processing unit 210 searches the reference information for characters corresponding to the information related to imaging included in the character candidate transmission request, thereby extracting characters that serve as character candidates from the reference information. Then, the processing unit 210 causes the communication unit 202 to transmit the extracted character candidates to the information processing apparatus 100 that transmitted the character candidate transmission request.
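A minimal sketch of the character-candidate extraction described above, assuming that the position information is a latitude/longitude pair and using an arbitrary matching tolerance (both assumptions; the disclosure does not specify the matching rule):

```python
# Hypothetical reference records: (latitude, longitude, character).
REFERENCE = [
    (35.630, 139.740, "Shinagawa"),
    (35.659, 139.700, "Shibuya"),
]

# Assumed matching tolerance in degrees.
RADIUS_DEG = 0.01

def extract_candidates(lat: float, lon: float) -> list:
    """Return characters recorded near the position in a transmission request."""
    return [
        ch
        for rec_lat, rec_lon, ch in REFERENCE
        if abs(rec_lat - lat) <= RADIUS_DEG and abs(rec_lon - lon) <= RADIUS_DEG
    ]
```

A request whose imaging position falls near a recorded position receives only the characters recorded for that vicinity, which keeps the candidate set small and location-relevant.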
  • the management apparatus 200 updates the reference information with, for example, the configuration described above.
  • the management apparatus according to the present embodiment can include the processing unit 210 illustrated in FIG. 7 separately from the control unit 206 (for example, the processing unit 210 can be realized by a separate processing circuit).
  • when the management apparatus according to the present embodiment has a function of extracting characters that serve as character candidates from the reference information, an extraction processing unit (not shown) that performs the process of extracting characters that serve as character candidates from the reference information may be realized by a processing circuit separate from the processing unit 210.
  • when communicating with an external device such as the information processing apparatus 100 via an external communication device having the same function and configuration as the communication unit 202, the management apparatus according to the present embodiment can also be configured without the communication unit 202.
  • the present embodiment includes, for example, communication devices such as mobile phones and smartphones, tablet devices, computers such as PCs (Personal Computers) and servers, display devices, video / music playback devices (or video / music recording / playback devices), It can be applied to various devices such as game machines.
  • the present embodiment can be applied to, for example, a processing IC (Integrated Circuit) that can be incorporated in the above devices.
  • the information processing apparatus according to the present embodiment may be realized by a system including a plurality of apparatuses premised on connection to a network (or communication between apparatuses), such as cloud computing. That is, the information processing apparatus according to the present embodiment described above can also be realized as a system including a plurality of apparatuses.
  • the present embodiment is not limited to such a form.
  • the present embodiment can be applied to various devices that can update reference information, such as computers such as PCs and servers.
  • the present embodiment can be applied to a processing IC that can be incorporated in the above-described device, for example.
  • [I] Program for causing a computer to function as the information processing apparatus according to the present embodiment
  • By a program for causing a computer to function as the information processing apparatus according to the present embodiment (for example, a program capable of executing the processing related to the information processing method according to the present embodiment, such as the process (I), the process (II), the process (III), and the process (IV) described above) being executed by a processor or the like in the computer, the recognition accuracy of characters recognized from images can be improved.
  • In addition, by the program for causing a computer to function as the information processing apparatus according to the present embodiment being executed by a processor or the like in the computer, the effects produced by the processing related to the information processing method according to the present embodiment described above can be achieved.
  • [II] Program for causing a computer to function as the management apparatus according to the present embodiment
  • By a program for causing a computer to function as the management apparatus according to the present embodiment (for example, a program capable of executing the process of updating the reference information and the process of extracting character candidates from the reference information described above) being executed by a processor or the like in the computer, the reference information can be updated.
  • an information processing system capable of improving the recognition accuracy of characters recognized from captured images can be realized by a program for causing a computer to function as a management apparatus according to the present embodiment.
  • although the above description shows that a program for causing a computer to function as the information processing apparatus according to the present embodiment or the management apparatus according to the present embodiment is provided, a recording medium in which the program is stored can also be provided together.
  • the following configurations also belong to the technical scope of the present disclosure.
  • a character candidate extraction control unit that extracts a character corresponding to information related to the imaging as a character candidate that is a candidate for a character to be recognized from the target image;
  • a character determination processing unit that selectively corrects characters recognized from the target image based on the extracted character candidates and determines characters to be recognized from the target image;
  • the character stored in the reference information includes a character based on a character recognized from a captured image in the past.
  • (4) The character determination processing unit does not correct the recognized character if the similarity indicates that the recognized character matches the character candidate, and determines the recognized character as the character to be recognized from the target image. The information processing apparatus according to (2) or (3).
  • (5) The character determination processing unit corrects the recognized character without using the character candidate if the similarity is equal to or below a predetermined threshold, or if the similarity is less than the threshold, and determines the corrected character as the character to be recognized from the target image. The information processing apparatus according to any one of (2) to (4).
  • (6) The character determination processing unit corrects the recognized character without using the character candidate when the character candidate is not extracted, and determines the corrected character as the character to be recognized from the target image. The information processing apparatus according to any one of (1) to (5).
  • (8) The information related to the imaging further includes one or more of information relating to an imaging device, information relating to an area to be imaged, information relating to an imaging situation, and information relating to an imaging time. The information processing apparatus according to any one of (1) to (7).
  • (9) The information processing apparatus according to any one of (1) to (8), further including a recording control unit that records the determined character in the reference information in association with the information related to the imaging corresponding to the target image.
  • (10) The character determination processing unit selectively corrects a character recognized by the character recognition unit to determine the character to be recognized from the target image. The information processing apparatus according to any one of (1) to (9).
  • (11) A management device comprising a processing unit that, when a character recognized from a captured image and information related to imaging including information indicating the captured position corresponding to the captured image are acquired, records the acquired character and the information related to the imaging in association with each other.
  • From the reference information stored in association with the information related to the imaging and the character based on the information related to the imaging including the information indicating the captured position corresponding to the target image that is the captured image of the target for which the character is recognized.

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Character Discrimination (AREA)

Abstract

The invention concerns an information processing device comprising: a character-candidate extraction controller that, on the basis of information related to imaging, including information indicating an imaging position, corresponding to a target image, which is a captured image of a target for which characters are to be recognized, extracts a character corresponding to the information related to the imaging as a character candidate, that is, a candidate for a character to be recognized from the target image, the extraction being performed from reference information in which the information related to the imaging and characters are recorded in association with each other; and a character-determination processing unit that, on the basis of an extracted character candidate, selectively corrects a character recognized from the target image and determines the character to be recognized from the target image. The characters recorded in the reference information include characters based on characters recognized from previously captured images.
PCT/JP2015/057462 2014-05-20 2015-03-13 Dispositif de traitement d'informations, dispositif de gestion, procédé de traitement d'informations, et programme WO2015178073A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014104402A JP2015219821A (ja) 2014-05-20 2014-05-20 情報処理装置、管理装置、情報処理方法、およびプログラム
JP2014-104402 2014-05-20

Publications (1)

Publication Number Publication Date
WO2015178073A1 true WO2015178073A1 (fr) 2015-11-26

Family

ID=54553750

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/057462 WO2015178073A1 (fr) 2014-05-20 2015-03-13 Dispositif de traitement d'informations, dispositif de gestion, procédé de traitement d'informations, et programme

Country Status (2)

Country Link
JP (1) JP2015219821A (fr)
WO (1) WO2015178073A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01311390A (ja) * 1988-06-10 1989-12-15 Toshiba Corp 文字置換制御方式
JP2000348142A (ja) * 1999-06-08 2000-12-15 Nippon Telegr & Teleph Corp <Ntt> 文字認識装置,文字認識方法,および文字認識方法を実行するプログラムを記録した記録媒体
WO2005066882A1 (fr) * 2004-01-08 2005-07-21 Nec Corporation Dispositif de reconnaissance de caracteres, systeme de communications mobiles, dispositif terminal mobile, dispositif de station fixe, procede de reconnaissance de caracteres, et programme de reconnaissance de caracteres
JP2014063300A (ja) * 2012-09-20 2014-04-10 Casio Comput Co Ltd 文字認識装置及び文字認識処理方法並びにプログラム

Also Published As

Publication number Publication date
JP2015219821A (ja) 2015-12-07

Similar Documents

Publication Publication Date Title
US9602728B2 (en) Image capturing parameter adjustment in preview mode
CN105318881B (zh) 地图导航方法、装置及系统
US10677596B2 (en) Image processing device, image processing method, and program
CN105517679B (zh) 用户地理位置的确定
JP4591353B2 (ja) 文字認識装置、移動通信システム、移動端末装置、固定局装置、文字認識方法および文字認識プログラム
JP6614335B2 (ja) 画像表示システム、端末、方法およびプログラム
US10606824B1 (en) Update service in a distributed environment
KR101790655B1 (ko) 버스 정보 조회, 피드백 방법, 모바일 단말 및 서버
US11373410B2 (en) Method, apparatus, and storage medium for obtaining object information
KR20110126180A (ko) 로컬 맵들 및 위치-측정 주석추가된 데이터를 제공하기 위한 인간-보조 기술
US20150186426A1 (en) Searching information using smart glasses
CN107766403B (zh) 一种相册处理方法、移动终端以及计算机可读存储介质
WO2019105457A1 (fr) Procédé de traitement d&#39;image, dispositif informatique et support d&#39;informations lisible par ordinateur
TWI749532B (zh) 一種定位方法及定位裝置、電子設備和電腦可讀儲存媒介
CN111832579B (zh) 地图兴趣点数据处理方法、装置、电子设备以及可读介质
US20130339271A1 (en) Evaluation system, evaluation method, and storage medium
WO2015178073A1 (fr) Dispositif de traitement d&#39;informations, dispositif de gestion, procédé de traitement d&#39;informations, et programme
US10922838B2 (en) Image display system, terminal, method, and program for displaying image associated with position and orientation
JP6115673B2 (ja) 装置、及びプログラム
CN102968611A (zh) 信息处理器和信息处理方法
Vo et al. WhereAmI: Energy efficient positioning using partial textual signatures
US10952023B2 (en) Information processing apparatus and non-transitory computer readable medium
JP5920448B2 (ja) 撮像装置、プログラム
KR20170123846A (ko) 영상 기반 실내 측위 장치 및 방법
JP5655916B2 (ja) 画像検索システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15795852

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15795852

Country of ref document: EP

Kind code of ref document: A1