CN115131799A - Character recognition device, character recognition method, and recording medium

Character recognition device, character recognition method, and recording medium

Info

Publication number
CN115131799A
Authority
CN
China
Prior art keywords
character
input
unit
user
candidate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210249269.8A
Other languages
Chinese (zh)
Inventor
小泽健夫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Casio Computer Co Ltd
Original Assignee
Casio Computer Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Casio Computer Co Ltd filed Critical Casio Computer Co Ltd
Publication of CN115131799A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03545 Pens or stylus
    • G06F 3/03547 Touch pads, in which fingers can move on a surface
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on GUIs using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Character Discrimination (AREA)

Abstract

The invention relates to a character recognition device, a character recognition method, and a recording medium. The character recognition device includes: a touch panel display unit (17) that accepts a character image based on an input operation in handwritten form; and a control unit (21) that performs recognition processing on the accepted character image to derive a 1st character from it, and causes the touch panel display unit (17) to display the derived 1st character as a candidate for the character desired by the user. The touch panel display unit (17) can accept a designation operation indicating that the 1st character is not the character desired by the user but is a character similar to it. When the designation operation is accepted, the control unit (21) further derives a 2nd character having a predetermined relationship with the 1st character and causes the derived 2nd character to be displayed on the touch panel display unit (17) as an additional candidate for the character desired by the user.

Description

Character recognition device, character recognition method, and recording medium
Technical Field
The invention relates to a character recognition device, a character recognition method and a recording medium.
Background
A technique has been proposed in which, for a series of character strings recognized from handwriting input, the string can easily be corrected to the desired recognition candidate characters without displaying recognition candidates and performing a selection operation one character at a time (see, e.g., Japanese Patent Laid-Open No. 2008-299431).
However, with such techniques, including the one described in Japanese Patent Laid-Open No. 2008-299431, when the desired character is not among the plural correction candidates, the character must be input by handwriting and recognized all over again.
Disclosure of Invention
The present invention has been made in view of the above circumstances, and an object thereof is to provide a character recognition device, a character recognition method, and a recording medium capable of receiving selection of a desired character from a wide range of candidates while effectively utilizing contents of handwriting input.
An aspect of the present invention includes: an input unit that accepts a character image based on an input operation in handwritten form; a character derivation unit that performs recognition processing on the character image accepted by the input unit and derives a 1st character from the character image; and a display control unit that causes the 1st character derived by the character derivation unit to be displayed as a candidate for the character desired by a user. The input unit is capable of accepting a designation operation indicating that the 1st character is not the character desired by the user but is a character similar to it. When the designation operation is accepted by the input unit, the character derivation unit further derives a 2nd character having a predetermined relationship with the 1st character, and the display control unit causes the derived 2nd character to be displayed as an additional candidate for the character desired by the user.
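For orientation, the interaction just described can be pictured with the following minimal Python sketch. Every class, method, and attribute name here (CharacterRecognitionDevice, recognize(), related_candidates(), and so on) is a hypothetical illustration, not part of the disclosed embodiment.

```python
# Hypothetical model of the claimed flow; all names are illustrative only.
class CharacterRecognitionDevice:
    def __init__(self, recognizer, display):
        self.recognizer = recognizer   # plays the role of the character derivation unit
        self.display = display         # plays the role of the display control unit

    def on_handwriting_input(self, char_image):
        # Derive the 1st character from the handwritten image and display it
        # as a candidate for the character the user wants.
        first = self.recognizer.recognize(char_image)
        self.display.show_candidates([first])
        return first

    def on_designation(self, first, char_image):
        # The user has indicated: "not the desired character, but similar to it".
        # Derive 2nd characters having a predetermined relationship with the 1st
        # and display them as additional candidates.
        additional = self.recognizer.related_candidates(first, char_image)
        self.display.show_candidates(additional)
        return additional
```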
Advantageous Effects of Invention
According to the present invention, selection of a desired character can be accepted from a wide range of candidates by effectively utilizing the contents of handwriting input.
Drawings
Fig. 1 is a diagram showing an external appearance structure of an electronic dictionary according to an embodiment of the present invention.
Fig. 2 is a block diagram showing a functional configuration of an electronic circuit of the electronic dictionary according to the embodiment.
Fig. 3 is a flowchart showing the processing contents of the character input by handwriting according to this embodiment.
Fig. 4 is a flowchart showing the subroutine detailing the stroke data correction processing of fig. 3 according to this embodiment.
Fig. 5 is a diagram illustrating a screen of handwritten character input and recognition results thereof according to this embodiment.
Fig. 6 is a diagram showing a procedure of deriving a plurality of corrected stroke data according to this embodiment.
Fig. 7 is a diagram showing a transition of a screen of the touch panel display unit according to the embodiment.
Fig. 8 is a diagram showing a transition of a screen of the touch panel display unit according to the embodiment.
Fig. 9 is a diagram illustrating a part of the radical table stored in various table storage areas according to the embodiment.
Fig. 10 is a diagram illustrating the process of deriving radical-type candidates from stroke data input by the user according to this embodiment.
Fig. 11 is a diagram showing an example in which components of the stroke data input by the user are replaced with components of the closest character shape according to this embodiment.
Detailed Description
An embodiment of a case where the present invention is applied to an electronic dictionary will be described below with reference to the drawings.
[ Structure ]
Fig. 1 is a front view showing an external configuration of an electronic dictionary 10 according to this embodiment. The present embodiment can be configured not only as the electronic dictionary 10 described below, but also as a tablet PC (Personal Computer) having a dictionary function, a smart phone, an electronic book, a portable game machine, a server on a communication network, and the like.
The electronic dictionary 10 has a foldable case in which a main body case 11 and a cover case 12 can be opened and closed via a hinge portion 13. The surface of the main body case 11 is provided with a key input unit 14 (hardware keys), a sound output unit 15 (including a speaker), and a sound input unit 16 (including a microphone). The key input unit 14 includes a [main menu] key 14a, function designation keys 14b, character input keys 14c, a [decision] key 14d, a [return] key 14e, a [BOX] key 14f, cursor movement keys 14g, and a [switch] key 14h.
Further, a touch panel display unit 17 is provided on the surface of the cover case 12. The touch panel display unit 17 is a structure in which a touch position detection device for detecting a position touched by a pen or a finger of the user of the electronic dictionary 10 and a display device are integrated, and is configured by laminating a transparent touch panel on a liquid crystal screen with a backlight. That is, the touch panel display unit 17 functions as an input unit that receives a character image based on an input operation in a handwriting form.
The [main menu] key 14a of the key input unit 14 is a key for displaying a main menu screen. Although not specifically shown, the main menu screen displays a plurality of icons registered according to the initial settings of the electronic dictionary 10 or to user operations. Each icon is a drawing or symbol for invoking the function corresponding to its mark. The icons include, for example, application icons that directly activate functions (applications) using dictionary contents or learning contents, and group icons that display a list screen of the icons of the plural functions belonging to one category.
The function designation keys 14b of the key input unit 14 are keys for directly designating the dictionary contents or the like marked on each key. The function designation keys 14b include a [multi-dictionary] key for designating a category of dictionary contents, keys for designating individual categories (a [Japanese] key, an [ancient] key, a [Chinese-Japanese] key, an [English-Japanese] key, etc.), a [contents list] key for displaying a list of dictionary contents, and a [learning book] key for one category serving as a tool.
Further, when a key of the key input unit 14 is operated immediately after the [switch] key 14h, it functions not as the key marked on its key top but as the key marked in the frame on its key top. For example, when the [delete] key is operated after the [switch] key 14h, it functions not as the [delete] key but as the [set] key.
Fig. 1 shows a state in which [Japanese dictionary] is selected and one of the kana input tab (あいう) and the alphabet input tab (ABC) is selected in the search string input unit 41 located above the touch panel display unit 17.
The character string [daylight] has already been input in the search string input unit 41. The candidate display unit 42 below the search string input unit 41 displays a list of, for example, 3 entries corresponding to the string [daylight]. The handwritten character input unit 43 below the candidate display unit 42, which has input areas for 2 characters, is shown blank, waiting for handwritten input of the character that follows the string [daylight].
FIG. 2 is a block diagram showing the functional configuration of the electronic circuits of the electronic dictionary 10.
The electronic circuit of the electronic dictionary 10 includes a control unit (CPU) 21 as a computer. The control unit 21 controls the operation of each circuit unit in accordance with a control program stored in advance in a storage unit 22 such as a flash ROM. There may be one or more processors such as CPUs.
At least one of the control program and the dictionary data described later may be read from an external recording medium 23 such as a memory card by a recording medium reading unit 24 and stored in the storage unit 22, or may be downloaded via a communication unit 25 from a Web server or the like on an external network (not shown) and stored in the storage unit 22.
The control unit 21 is connected, via data and control buses, to the key input unit 14, the sound output unit 15, the sound input unit 16, and the touch panel display unit 17, in addition to the storage unit 22, the recording medium reading unit 24, and the communication unit 25.
The storage unit 22 stores a system program responsible for the overall operation of the electronic dictionary 10 and a communication program for communication connection with external devices via the communication unit 25. It also secures storage areas for the programs and data with which the electronic dictionary 10 executes its various functions, such as a search processing program storage area 22a, a dictionary data storage area 22b, a search history storage area 22c, and a work data storage area 22d.
The search processing program storage area 22a stores a control program for searching for an entry desired by the user, based on the various dictionary contents (English-Japanese dictionary / Japanese-English dictionary / English-English dictionary / Japanese-Japanese dictionary / encyclopedia / ...) stored as dictionary data in the dictionary data storage area 22b, and for displaying explanatory information corresponding to the entry, such as translations, word meanings, usage examples, and comments. It also holds the various table storage areas 22a1 and the like that are used when a search is performed from input other than character input of an entry.
The various table storage areas 22a1 store a plurality of tables, including radical tables that define in advance how the character data of each kanji can be divided into components.
When an entry and its explanatory information are displayed as a search result in response to a user operation, the search history storage area 22c stores the searched entry together with information on the number of times it has been searched, as a search history.
In the work data storage area 22d, data input in response to user operations and various data acquired or generated by the control unit 21 are temporarily stored (held) as necessary under control of the operations of each circuit unit in accordance with the control program executed by the control unit 21.
The communication unit 25 connects wirelessly to nearby external devices, such as other electronic dictionaries, based on, for example, the BLE (Bluetooth (registered trademark) Low Energy) standard, one of the wireless PAN (Personal Area Network) technologies, and also connects wirelessly, via a wireless router or the like, to a Web server or the like on an external network (not shown) to download control programs and the like.
[ Operation ]
Next, the operation of the present embodiment will be described.
In the present embodiment, the correction processing performed when the result of character image recognition of a handwritten input is not the kanji intended by the user will be described, with the Japanese dictionary function selected.
Here, for example, the following case is described: a character image of the character string [daylight] is input by handwriting as a search string and recognized as desired; then a character image of the character [bath] is likewise input by handwriting, but the desired character [bath] is not among the character candidates in the recognition result for that input image.
Fig. 5 (A) shows the screen after the user of the electronic dictionary 10 has written a character image of the character [bath] with the stylus pen P in the handwritten character input unit 43, starting from the display state of the touch panel display unit 17 shown in fig. 1.
As shown in the figure, the handwritten character input unit 43 has input areas for 2 characters. After handwriting in the left area, execution of recognition processing for the input character is instructed, for example, on the right side using the stylus pen P.
In response to the instruction to execute recognition processing given with the stylus pen P, the control unit 21 executes character recognition processing and displays the resulting character candidate list in descending order of evaluated similarity. That is, the control unit 21 functions as a character derivation unit that performs recognition processing on the character image accepted by the input unit and derives the 1st character from it, and as a display control unit that causes the 1st character derived by the character derivation unit to be displayed as a candidate for the character desired by the user.
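As a rough illustration of this ranking step, consider the Python sketch below; the recognizer object and its score() method are assumptions standing in for the undisclosed recognition engine.

```python
# Illustrative only: rank recognition results by an assumed similarity score.
def derive_candidates(char_image, recognizer, top_n=8):
    scores = recognizer.score(char_image)           # assumed: {kanji: similarity}
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [kanji for kanji, _ in ranked[:top_n]]   # list for the candidate display unit 44
```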
Fig. 5 (B) shows the candidate character display unit 44 listing the kanji candidates evaluated as having high similarity. Since the candidate character display unit 44 does not contain the desired kanji [bath], the user needs to perform an operation for correcting the character input by handwriting.
Fig. 3 is a flowchart showing the content of the character processing for the handwriting input executed by the control unit 21.
At the start of the processing, the control unit 21 acquires information on the candidate characters matching the recognition result (step S101) and displays a list of these candidate characters on the candidate character display unit 44 as shown in fig. 5 (B).
With the kanji candidates displayed in a list, the control unit 21 waits, by repeating the determinations, for one of the following: a long-press operation on the candidate considered closest (step S103), selection of one candidate (step S107), or cancellation of the list display (step S109).
If it is determined that an operation to cancel the list display has been performed (yes in step S109), the control unit 21 ends the processing of fig. 3 at that point.
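Taken together, the fig. 3 main routine behaves roughly like the sketch below. The UI primitives (get_event(), show_candidate_list(), commit_character()) and the helper correct_stroke_data(), sketched later in this description, are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical sketch of the fig. 3 main routine; all UI primitives are assumed.
def handwriting_input_main(stroke_data, ui, recognizer):
    candidates = derive_candidates(stroke_data, recognizer)     # step S101
    ui.show_candidate_list(candidates)                          # list display, fig. 5 (B)
    while True:
        event = ui.get_event()
        if event.kind == "long_press":                          # step S103
            corrected = correct_stroke_data(stroke_data, event.candidate_strokes)  # step S104
            candidates = [c for strokes in corrected
                          for c in derive_candidates(strokes, recognizer)]         # step S105
            ui.show_candidate_list(candidates)                  # step S106, fig. 7 (B)
        elif event.kind == "select":                            # step S107
            ui.commit_character(event.candidate)                # step S108
            return
        elif event.kind == "cancel":                            # step S109
            return
```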
Fig. 7 (A) shows a state where the user has long-pressed the kanji candidate [consistent] with the stylus pen P on the candidate character display unit 44, starting from the screen of the touch panel display unit 17 shown in fig. 5 (B).
When it is determined in step S103 that a long-press operation has been performed on one candidate, the control unit 21 executes stroke data correction processing, deriving plural sets of corrected stroke data from the originally input handwriting stroke data and the character shape of the kanji candidate selected as closest (step S104).
Fig. 6 illustrates the process of deriving plural sets of corrected stroke data. As shown in fig. 6 (B), the stroke data of the handwritten character shown in fig. 6 (A) is combined with the shape of the character [consistent] that the user selected as closest, and the control unit 21 derives the corrected stroke data 1 shown in fig. 6 (C) and the corrected stroke data 2 shown in fig. 6 (D).
Fig. 4 is a flowchart showing the detailed subroutine of the stroke data correction processing executed by the control unit 21 in step S104. First, the user's input stroke data is scanned to derive glyph analysis data including an outline box, blank information, and density information (step S201).
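Step S201 can be pictured as follows, under the assumption that stroke data is a list of strokes, each a list of (x, y) points, and that blank and density information are tabulated on a coarse grid; the grid size and all names are illustrative.

```python
# Hypothetical sketch of glyph analysis (step S201): strokes -> outline box,
# blank information, and density information on a coarse grid.
def analyze_glyph(strokes, grid=8):
    points = [p for stroke in strokes for p in stroke]
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    outline_box = (min(xs), min(ys), max(xs), max(ys))       # OF in fig. 10 (B)

    x0, y0, x1, y1 = outline_box
    w, h = max(x1 - x0, 1), max(y1 - y0, 1)
    density = [[0] * grid for _ in range(grid)]              # CI: ink counts per cell
    for x, y in points:
        cx = min(int((x - x0) * grid / w), grid - 1)
        cy = min(int((y - y0) * grid / h), grid - 1)
        density[cy][cx] += 1

    blank = [[count == 0 for count in row] for row in density]   # SI: cells with no ink
    return outline_box, blank, density
```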
Based on the derived glyph analysis data, the control unit 21 refers to the radical tables stored in the various table storage areas 22a1 and derives radical-type candidates for the stroke data input by the user (step S202).
Fig. 9 illustrates a part of the radical-type table stored in the various table storage areas 22a1. As shown in the figure, the radical structure of a kanji can be classified into a structure composed of a "partial" (left element) and a "side" (right element) as shown in fig. 9 (A), a structure composed of a "crown" (top element) and a "foot" (bottom element) as shown in fig. 9 (B), a structure including a "wrap" (enclosing element) as shown in fig. 9 (C), a structure including a "droop" (hanging element) as shown in fig. 9 (D), other composite "structures" as shown in fig. 9 (E), and structures that are not divided into radicals as shown in fig. 9 (F).
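The radical-type table can be pictured as a mapping from layout type to the regions each component occupies; the representation below, and the crude layout matcher after it, are assumptions for illustration, not the actual format stored in the areas 22a1.

```python
# Assumed representation of the radical-type table (fig. 9): layout type ->
# named component regions, as fractions of the outline box (x0, y0, x1, y1).
RADICAL_TABLE = {
    "partial_side": {"partial": (0.0, 0.0, 0.4, 1.0),    # left/right split, fig. 9 (A)
                     "side":    (0.4, 0.0, 1.0, 1.0)},
    "crown_foot":   {"crown":   (0.0, 0.0, 1.0, 0.4),    # top/bottom split, fig. 9 (B)
                     "foot":    (0.0, 0.4, 1.0, 1.0)},
    # wrap, droop, and other structures of fig. 9 (C)-(E) omitted for brevity
    "undivided":    {"whole":   (0.0, 0.0, 1.0, 1.0)},   # fig. 9 (F)
}

def radical_candidates(blank, density):
    # Crude matcher: a vertical blank band suggests a partial/side layout,
    # a horizontal blank band suggests crown/foot; otherwise undivided.
    grid = len(density)
    mid = grid // 2
    if all(blank[row][mid] for row in range(grid)):
        return ["partial_side"]
    if all(blank[mid][col] for col in range(grid)):
        return ["crown_foot"]
    return ["undivided"]
```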
Fig. 10 explains the process of deriving radical-type candidates from the stroke data input by the user. As shown in fig. 10 (A), the stroke data input by the user, shown in fig. 5 (A) and fig. 6 (A), is scanned sequentially in one direction, vertically and then horizontally.
Fig. 10 (B) shows the outline box OF, the blank information SI, and the density information CI derived from the scanning result based on the density of the stroke data of the character image. From these derived results, as shown in fig. 10 (C), a structure of 2 radicals, the partial Rix1 and the side Rix2 shown in fig. 9 (A), is recognized, and radical-type candidates with high similarity are derived, also taking into account the kanji candidate selected by the long-press operation.
In fig. 10 (C), the stroke data input by the user is recognized as a structure composed of a "partial" and a "side" as shown in fig. 9 (A); the kanji candidate [consistent] long-pressed by the user with the stylus pen P in fig. 7 (A) likewise has a structure composed of a "partial" and a "side" as shown in fig. 9 (A). As described later, when the radical type determined from the handwritten character image matches the radical type of the character selected as closest in shape, the number of radical-type candidates is limited to 1; when they do not match, each radical type is derived as a candidate. The control unit 21, having derived the radical-type candidates, selects one not-yet-selected candidate from among them (step S203). When one unselected candidate is selected, for example the "partial"/"side" structure shown in fig. 9 (A), the control unit 21 derives [2], from [partial] and [side], as the number of components of the stroke data for the selected radical-type candidate (step S204).
Next, the control unit 21 selects one of the components of the stroke data input by the user, for example the [partial] (step S205). The control unit 21 then selects one of the components of the closest character [consistent] selected by the user, for example the [water of three points] (step S206).
The control unit 21 further determines whether replacement and the other processing described later can be omitted for the selected component, for example the [water of three points] (step S207). This reduces the number of processing steps by imposing restrictions, such as not replacing a component of the user's stroke data with a component of the closest character that does not occupy the corresponding position, e.g., one that is not in the "partial" position.
In step S207, for example, when the component selected from the stroke data input by the user is the "partial" while the component selected from the closest character [consistent] is the "side", it is determined that the processing can be omitted (yes in step S207); the control unit 21 regards processing for the component selected at this point as omitted, returns to step S206, and selects the next component of the closest character shape.
If it is determined in step S207 that processing for the selected component cannot be omitted (no in step S207), the control unit 21 replaces the [partial] component selected from the stroke data input by the user with the corresponding component of the closest character [consistent], for example the [water of three points] (step S208). The kanji candidate shape based on the replaced component is then stored as corrected stroke data (step S209).
The control unit 21 then determines whether selection of all components of the closest character has been completed, that is, whether any components of the closest character remain unselected (step S210).
If it is determined in step S210 that the selection of all the nearest components has not been completed (no in step S210), the process returns to step S206 and the process corresponding to the other components of the nearest character is executed.
In the same way, the [side] component of the stroke data input by the user is replaced with the side component [sum] of the closest character [consistent] and processed as corrected stroke data.
If it is determined in step S210 that selection of all components of the closest character is complete and none remain unselected (yes in step S210), the control unit 21 determines whether selection of all components of the stroke data input by the user has been completed, that is, whether any of those components remain unselected (step S211).
In step S211, if it is determined that the selection of all the components of the stroke data input by the user is not completed and there is an unselected component (no in step S211), the control unit 21 returns to the processing from step S205 and performs the same processing on the other components of the stroke data input by the user.
Further, when it is determined in step S211 that selection of all components of the stroke data input by the user is complete and none remain unselected (yes in step S211), the control unit 21 determines whether selection of all radical-type candidates has been completed, that is, whether any radical-type candidates remain unselected (step S212).
If it is determined in step S212 that the selection of all the radical type candidates is not completed (no in step S212), the control unit 21 returns to the processing from step S203 and executes the same processing based on the unselected radical type candidates.
If it is determined in step S212 that the selection of all the radical candidates has been completed (yes in step S212), the control unit 21 ends the subroutine related to the correction processing of the stroke data in fig. 4 and returns to the main routine in fig. 3.
As described above, when the radical type determined from the handwritten character image matches the radical type of the character selected as closest in shape, the number of radical-type candidates is limited to 1.
On the other hand, when the radical types do not match, 2 radical-type candidates are derived, and the processing from step S203 onward is therefore executed repeatedly.
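Putting steps S201 to S212 together, the subroutine behaves roughly like the sketch below, which reuses the helpers sketched above; split_components() is an additional assumed helper that divides a glyph into the parts of a given radical type by stroke centroid.

```python
# Assumed helper: assign each stroke to the RADICAL_TABLE region containing
# its centroid, normalized to the outline box.
def split_components(strokes, rtype):
    (x0, y0, x1, y1), _, _ = analyze_glyph(strokes)
    w, h = max(x1 - x0, 1), max(y1 - y0, 1)
    parts = {name: [] for name in RADICAL_TABLE[rtype]}
    for stroke in strokes:
        cx = sum(p[0] for p in stroke) / len(stroke)
        cy = sum(p[1] for p in stroke) / len(stroke)
        nx, ny = (cx - x0) / w, (cy - y0) / h
        for name, (rx0, ry0, rx1, ry1) in RADICAL_TABLE[rtype].items():
            if rx0 <= nx <= rx1 and ry0 <= ny <= ry1:
                parts[name].append(stroke)
                break
    return parts

# Hypothetical sketch of the fig. 4 subroutine (steps S201-S212).
def correct_stroke_data(user_strokes, closest_char_strokes):
    _, blank, density = analyze_glyph(user_strokes)              # step S201
    corrected = []
    for rtype in radical_candidates(blank, density):             # S202, S203 / S212
        user_parts = split_components(user_strokes, rtype)       # S204: e.g. partial and side
        char_parts = split_components(closest_char_strokes, rtype)
        for u_name, u_part in user_parts.items():                # S205 / S211 loop
            for c_name, c_part in char_parts.items():            # S206 / S210 loop
                if u_name != c_name:                             # S207: positions differ, omit
                    continue
                variant = dict(user_parts)
                variant[u_name] = c_part                         # S208: swap in the clean component
                corrected.append([s for part in variant.values()
                                  for s in part])                # S209: store corrected stroke data
    return corrected
```

For a partial/side glyph this yields exactly 2 sets of corrected stroke data, matching fig. 6 (C) and (D).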
Fig. 11 shows an example of corrected stroke data in which components of the stroke data input by the user are replaced with components of the closest character [consistent], the character long-pressed by the user on the candidate character display unit 44 of the touch panel display unit 17, according to the corresponding radical types. Fig. 11 (A) shows the kanji candidate of the closest shape [consistent] selected for the user's stroke data. Replacing the part corresponding to the partial Rix1, i.e., the [three dots of water], yields the corrected stroke data 2 shown in fig. 11 (C); likewise, replacing the part corresponding to the side Rix2, i.e., the [close], yields the corrected stroke data 1 shown in fig. 11 (B). In fig. 11, the final number of sets of corrected stroke data is thus 2, and the character recognition processing described later is performed on these data.
In fig. 3, after the stroke data correction processing is executed in step S104, handwriting recognition processing is executed sequentially on the plural sets of corrected data, and corrected kanji candidates are derived as the recognition results for those sets (step S105).
The control unit 21 re-displays the list of the derived corrected kanji candidates on the touch panel display unit 17 (step S106), and returns to the processing from step S103 in order to wait for an operation on the displayed kanji candidates.
Fig. 7 (B) illustrates the state in which kanji candidates derived by the above series of processing referring to the handwritten stroke data are displayed, after the kanji [consistent] was selected by a long-press operation with the stylus pen P in fig. 7 (A). In fig. 7 (B), the selected kanji [consistent] is placed at the center of the corrected kanji candidate display unit 45; 2 kanji candidates [ha] and [fit] are listed on its left side 45A, and similarly 2 kanji candidates [bath] and [edge] are listed on its right side 45B, with the kanji [consistent] between them. The 2 kanji candidates [ha] and [fit] on the left side 45A are derived by applying the handwriting recognition processing of step S105 to the corrected stroke data 1 shown in fig. 6 (C), while the 2 kanji candidates [bath] and [edge] on the right side 45B are derived by applying it to the corrected stroke data 2 shown in fig. 6 (D).
In this case, kanji of the same radical type, for example [ha] and [fit], or [bath] and [edge], can be shown as belonging together by surrounding each pair with frames of the same color.
Further, by differentiating the display for each identical radical type not only by color but also by the type of outline, the display position, and the like, so that the grouping can be recognized visually, the user can more easily understand the character recognition result.
When the user of the electronic dictionary 10 finds a desired kanji among the kanji candidates displayed in the list, the user selects the desired kanji by a touch operation.
As described above with reference to fig. 3, the control unit 21 waits for input by repeatedly determining whether the candidate considered closest has been long-pressed (step S103), whether one candidate has been selected (step S107), and whether the list display has been cancelled (step S109).
Therefore, when a character desired by the user is selected by a touch operation, the control unit 21 determines in step S107 that one candidate has been selected (yes in step S107), confirms the selected kanji candidate as input following the character string that has already been input (step S108), and ends the processing of fig. 3.
Fig. 8 (A) shows the state in which the kanji candidate [bath] on the right side 45B of the corrected kanji candidate display unit 45 has been touched, starting from the display state of fig. 7 (B). In response to the touch operation, the control unit 21 confirms the kanji candidate [bath] as the character input following the character string [daylight] in the search string input unit 41.
Accordingly, as shown in fig. 8 (B), in the search string input unit 41 the temporarily input string [sunlight letter], which contained the erroneously recognized character [letter], is replaced with the result [sunbathing] confirmed through the correction processing. At the same time, once the kanji is confirmed, the handwritten character input unit 43 clears its display contents in preparation for the next handwritten input.
As described above, even when a desired kanji is not obtained by handwriting input recognition, performing an operation different from a normal touch operation, for example a long press, on another character considered closest to the recognition result causes the stroke data input by the user to be decomposed radical by radical, corrected stroke data to be obtained, and character analysis to be performed anew. Therefore, even when part of a handwritten kanji is input in a somewhat inaccurate form, it can be corrected, raising the possibility of obtaining the desired kanji and contributing to efficient handwriting input.
In addition, when a dictionary function such as the Japanese dictionary is selected, an excessive number of candidates can be prevented from being listed by treating the character strings already input before and after the handwritten character in the user's search string as parts of an entry and excluding from the candidates any character that cannot form such an entry.
In addition, on the display screens of the touch panel display unit 17 shown in fig. 7 (B) and fig. 8 (A), when the desired kanji is displayed in neither candidate area 45A nor 45B of the corrected kanji candidate display unit 45, the character whose shape is considered closest can be selected again by a long-press operation, whereby the stroke data correction processing shown in fig. 4 is executed repeatedly.
Further, when the desired kanji is still not displayed as a candidate even after kanji whose shapes are considered close to it have been selected several times by long-press operations, it is also conceivable that the user redoes the handwriting input itself in the handwritten character input unit 43. In such a case, recognition accuracy can be improved by performing selection processing that reduces the occurrence rate of characters that were displayed as candidates before the re-input but were not selected as the desired character.
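One conceivable form of this selection processing is to discount the similarity scores of candidates that were already displayed and passed over, as in the following sketch; the penalty factor is an assumption.

```python
# Illustrative only: demote candidates the user already saw and rejected
# before redoing the handwriting input.
def rerank_with_rejections(scores, rejected, penalty=0.5):
    # scores: {kanji: similarity}; rejected: kanji shown earlier but not selected.
    adjusted = {k: s * (penalty if k in rejected else 1.0) for k, s in scores.items()}
    return sorted(adjusted, key=adjusted.get, reverse=True)
```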
[ Effect of the embodiment ]
According to the present embodiment as described in detail above, selection of a desired character can be accepted from a wide range of candidates by effectively utilizing the contents of handwriting input.
In particular, in the present embodiment, corrected stroke data is obtained by deriving radical-type candidates for the stroke data handwritten by the user, based on the result of analyzing how it can be divided into the components of each radical type by referring to the radical tables stored in the various table storage areas 22a1. Compared with, for example, image processing that applies learning-based weighting in character recognition processing or that processes machine learning data all at once, such a procedure can reduce the circuit processing load on the control unit 21 and obtain the recognition result more quickly.
In addition, although not described in the present embodiment, when the user finally selects a candidate for the desired character, the selected character and the character input by handwriting may each be divided according to their radical types, and the correspondence between the divided parts may be updated and stored as learning data in the various table storage areas 22a1. In that case, the various table storage areas 22a1 function as a storage unit that stores information indicating the association between the parts into which the handwritten character is divided and the corresponding parts of the confirmed character, as information used when the control unit 21 performs character recognition anew. When similar handwriting appears in character recognition of subsequent handwritten input, automatically correcting it according to the learned content makes it possible to learn more of the user's handwriting habits and to further improve the recognition rate.
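The learning data described here might be stored as in the following sketch, which records, per radical part, which confirmed component the user's handwritten shape corresponded to; the table format and the shape_signature() helper are assumptions, reusing the helpers sketched earlier.

```python
# Assumed format for the learning data: (part name, coarse shape key) -> the
# component of the finally confirmed character that the handwriting meant.
learning_table = {}

def shape_signature(strokes, grid=4):
    # Assumed coarse key for a component's ink pattern (illustrative only).
    _, blank, _ = analyze_glyph(strokes, grid)
    return tuple(tuple(row) for row in blank)

def record_selection(selected_char_strokes, user_strokes, rtype):
    user_parts = split_components(user_strokes, rtype)
    char_parts = split_components(selected_char_strokes, rtype)
    for name, shape in user_parts.items():
        learning_table[(name, shape_signature(shape))] = char_parts[name]

def lookup_habit(part_name, shape):
    # Consulted on later handwriting input to auto-correct habitual shapes.
    return learning_table.get((part_name, shape_signature(shape)))
```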
The present invention is not limited to the above-described embodiments, and various modifications can be made in the implementation stage without departing from the gist thereof. In addition, the respective embodiments can be combined as appropriate as possible, and in this case, the combined effect can be obtained. Further, the embodiments described above include inventions at various stages, and various inventions can be extracted by appropriate combinations of a plurality of disclosed constituent elements. For example, in the case where the problems described in the section of the problems to be solved by the invention can be solved even if some of the constituent elements are deleted from the whole constituent elements shown in the embodiments, and the effects described in the section of the effects of the invention can be obtained, the configuration in which the constituent elements are deleted can be extracted as the invention.
The various programs described above can be stored and distributed as programs executable by a computer on a non-transitory computer-readable recording medium such as a memory card (ROM card, RAM card, or the like), a magnetic disk (Floppy (registered trademark) disk, hard disk, or the like), an optical disk (CD-ROM, DVD, or the like), or a semiconductor memory. A control unit (CPU) of the electronic device reads the program recorded in the recording medium into the storage device, and controls the operation using the read program, thereby realizing the various functions described in the above-described embodiments.

Claims (8)

1. A character recognition device comprising:
an input unit that accepts a character image based on an input operation in handwritten form;
a character derivation unit that performs recognition processing on the character image accepted by the input unit and derives a 1st character from the character image; and
a display control unit that displays the 1st character derived by the character derivation unit as a candidate for a character desired by a user,
wherein the input unit is capable of accepting a designation operation indicating that the 1st character is not the character desired by the user but is a character similar to the character desired by the user,
the character derivation unit further derives a 2nd character having a predetermined relationship with the 1st character when the designation operation is accepted by the input unit, and
the display control unit causes the 2nd character derived by the character derivation unit to be displayed as an additional candidate for the character desired by the user.
2. The character recognition device according to claim 1, wherein
the character derivation unit derives the 2nd character also using a relationship with the character image accepted by the input unit.
3. The character recognition device according to claim 1 or 2, wherein
the character derivation unit derives the 2nd character from a predetermined relationship with the corresponding part of the 1st character, for each part divided according to the density of the character image.
4. The character recognition device according to any one of claims 1 to 3, wherein
the character derivation unit derives a plurality of 2nd characters having the predetermined relationship with the 1st character in correspondence with the parts divided according to the density of the character image, and
the display control unit displays the derived 2nd characters differentiated in correspondence with the parts divided according to the density of the character image.
5. The character recognition device according to any one of claims 1 to 4, wherein
the character derivation unit derives the 2nd character based on a relationship with an already input character string when the character image accepted by the input unit is input following that character string.
6. The character recognition device according to any one of claims 1 to 5, wherein
the input unit is capable of accepting an operation specifying the 2nd character displayed as an additional candidate, and
the character recognition device further comprises
a storage unit that stores information indicating a relationship between the parts divided according to the density of the character image and the corresponding parts of the 2nd character subjected to the specifying operation, as information used when the character derivation unit derives the 2nd character.
7. A character recognition method for a device including:
an input unit that accepts a character image based on an input operation in handwritten form;
a character derivation unit that performs recognition processing on the character image accepted by the input unit and derives a 1st character from the character image; and
a display control unit that displays the 1st character derived by the character derivation unit as a candidate for a character desired by a user,
wherein, in the character recognition method, the input unit accepts a designation operation indicating that the 1st character is not the character desired by the user but is a character similar to the character desired by the user,
the character derivation unit further derives a 2nd character having a predetermined relationship with the 1st character when the designation operation is accepted by the input unit, and
the display control unit causes the 2nd character derived by the character derivation unit to be displayed as an additional candidate for the character desired by the user.
8. A non-transitory computer-readable recording medium recording a program for a computer incorporated in an apparatus including:
an input unit that accepts a character image based on an input operation in handwritten form;
a character derivation unit that performs recognition processing on the character image accepted by the input unit and derives a 1st character from the character image; and
a display control unit that displays the 1st character derived by the character derivation unit as a candidate for a character desired by a user,
wherein the program causes the computer to function such that:
the input unit is capable of accepting a designation operation indicating that the 1st character is not the character desired by the user but is a character similar to the character desired by the user,
the character derivation unit further derives a 2nd character having a predetermined relationship with the 1st character when the designation operation is accepted by the input unit, and
the display control unit causes the 2nd character derived by the character derivation unit to be displayed as an additional candidate for the character desired by the user.
CN202210249269.8A 2021-03-24 2022-03-14 Character recognition device, character recognition method, and recording medium Pending CN115131799A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021050754A 2021-03-24 Character recognition apparatus, character recognition method, and program (published as JP2022148901A)
JP2021-050754 2021-03-24

Publications (1)

Publication Number Publication Date
CN115131799A (en) 2022-09-30

Family

ID=83376683

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210249269.8A Character recognition device, character recognition method, and recording medium 2021-03-24 2022-03-14 (status: Pending; published as CN115131799A)

Country Status (2)

Country Link
JP (1) JP2022148901A (en)
CN (1) CN115131799A (en)

Also Published As

Publication number Publication date
JP2022148901A (en) 2022-10-06

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination