AU2020233775A1 - Systems and methods for digital representation and recognition of online cursive writing - Google Patents

Systems and methods for digital representation and recognition of online cursive writing

Info

Publication number
AU2020233775A1
Authority
AU
Australia
Prior art keywords
velocity
valley
input
peak
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
AU2020233775A
Inventor
Gopal Krishna Gupta
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2019903491A external-priority patent/AU2019903491A0/en
Application filed by Individual filed Critical Individual
Publication of AU2020233775A1 publication Critical patent/AU2020233775A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/04162Control or interface arrangements specially adapted for digitisers for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545Pens or stylus

Abstract

Embodiments generally relate to a computer implemented method for digital representation of handwritten cursive text. The method comprises receiving at least one input relating to a piece of cursive written text, the input comprising at least two coordinate values and a time value; based on the at least one input and at least one previously received input, determining whether a local peak or valley occurred; and, if a local peak or valley is determined to have occurred, outputting a corresponding symbol.

Description

"Systems and methods for digital representation and recognition of online cursive writing"
Technical Field
Embodiments generally relate to methods, devices and systems for digital representation and recognition of online cursive writing.
Background
Despite the prevalence of digital text input methods, cursive writing remains a common way to communicate and to store information. This can include writing produced in hardcopy, such as handwritten cursive notes and cards, addresses on envelopes, and completed fields on hardcopies of forms and documents, as well as digital cursive writing that a user may enter using a computing device such as a tablet. One disadvantage of such cursive writing is that it can be difficult for digital devices to decipher, often requiring a human operator to read and enter the information when it is to be entered in a digital system, even when the cursive text was produced using a digital device. This can produce delays, and may result in errors in the data entry.
It is desired to address or ameliorate one or more shortcomings or disadvantages associated with prior systems for digital representation of cursive writing and recognition of the digital representation.
Throughout this specification the word "comprise", or variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated element, integer or step, or group of elements, integers or steps, but not the exclusion of any other element, integer or step, or group of elements, integers or steps.
Any discussion of documents, acts, materials, devices, articles or the like which has been included in the present specification is not to be taken as an admission that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present disclosure as it existed before the priority date of each of the appended claims.
Summary
Some embodiments relate to a computer implemented method for digital representation of cursive writing, the method comprising: receiving at least one input relating to a piece of handwritten cursive text, the input comprising at least two coordinate values and a time value; based on the at least one input and at least one previously received input, determining whether a local peak or valley has occurred; and if a local peak or valley is determined to have occurred, outputting a corresponding symbol.
According to some embodiments, the local peak or valley may be a local velocity peak or valley.
According to some embodiments, the at least one input further comprises a pen-up value, the method further comprising determining whether a pen-up event occurred, and if a pen-up event occurred, outputting a corresponding pen-up symbol.
In some embodiments, the at least two coordinates comprise an X coordinate and a Y coordinate, and wherein determining whether a local peak or valley occurred comprises determining whether a local X peak or X valley occurred and determining whether a local Y peak or Y valley occurred.
According to some embodiments, determining whether a local X peak or X valley occurred comprises generating an X profile comprising the X value and the time value, and comparing the X profile with a previously generated X profile.
Some embodiments further comprise: determining an X velocity, being the difference between the X value at a time T and the X value at a time T-1; determining an X peak if the X velocity is larger than the X velocity at T-1 and the X velocity at T+1 by more than a threshold value; and determining an X valley if the X velocity is smaller than the X velocity at T-1 and the X velocity at T+1 by more than a threshold value.
According to some embodiments, determining whether a local Y peak or Y valley occurred comprises generating a Y profile comprising the Y value and the time value, and comparing the Y profile with a previously generated Y profile.
Some embodiments further comprise: determining a Y velocity, being the difference between the Y value at a time T and the Y value at a time T-1; determining a Y peak if the Y velocity is larger than the Y velocity at T-1 and the Y velocity at T+1 by more than a threshold value; and determining a Y valley if the Y velocity is smaller than the Y velocity at T-1 and the Y velocity at T+1 by more than a threshold value.
In some embodiments, the output symbols are output to a string corresponding to a digital representation of the cursive text.
Some embodiments further comprise comparing at least a portion of the string with a database of strings to determine at least one character corresponding to the handwritten cursive text.
Some embodiments relate to a computing device comprising memory storing program code and a processor, wherein the memory is accessible by the processor and wherein, when the processor executes the program code, the processor is configured to perform a method comprising: receiving at least one input relating to a piece of cursive text, the input comprising at least two coordinate values and a time value; based on the at least one input and at least one previously received input, determining whether a local peak or valley occurred; and if a local peak or valley is determined to have occurred, outputting a corresponding symbol.
Some embodiments further comprise a communications module, wherein the at least one input is received via the communications module.
According to some embodiments, the computing device is a server, and the at least one input is received from a remote computing device.
In some embodiments, at least part of the at least one input is generated by a touchscreen device. In some embodiments, at least part of the at least one input is generated by a stylus device.
Brief Description of Drawings
Embodiments are described in further detail below, by way of example and with reference to the accompanying drawings, in which:
Figure 1 shows a block diagram of a system for the digital representation and recognition of online cursive text;
Figure 2 shows a method performed by the system of Figure 1;
Figure 3 shows a method performed by the processing program of Figure 1;
Figure 4 shows an example input into the system of Figure 1; and
Figure 5 shows a diagram indicating the example stroke directions that correspond to example symbol outputs of the processing system of Figure 1.
Detailed Description
Embodiments generally relate to methods, devices and systems for digital representation and recognition of online cursive text.
Specifically, embodiments relate to a system and method for the digital representation and recognition of text written in cursive handwriting, particularly to cursive text written on an electronic device such as a tablet or smart phone.
Digital recognition and processing of cursive writing is difficult to perform accurately, as no two online cursive writings are precisely the same, even when written by the same individual. Among other properties, the scale and orientation of handwritten characters will vary every time they are written.
Known cursive text recognition methods may generally be classified into one of two distinct families of techniques. The first of these is often called "formal structural and rule based methods", and the second is labelled "statistical classification methods". Described embodiments fall outside both of these known types of approaches.
Described embodiments specifically relate to methods, devices and systems for the digital representation and recognition of cursive text that is produced via a digital medium, such as using a computing device having a touch screen to enter text using a stylus. Figure 1 shows an example system 100 for cursive writing recognition according to some embodiments.
System 100 includes a computing device 110 and a stylus 120. Computing device 110 may be a mobile phone, tablet, laptop or other computing device capable of receiving handwritten cursive text via a user interface. Computing device 110 includes a processor 111 and memory 112 accessible to processor 111. Memory 112 stores a handwriting capture program 113 executable by processor 111 to cause computing device 110 to capture cursive writing entered by a user.
Computing device 110 further includes user input and output 114, which may include a touch-screen display 115 via which a user is able to input cursive writing. A user may use a designated input device such as stylus 120, or may interact with touch-screen display 115 using their finger or hand in some embodiments. According to some embodiments, stylus 120 may be a stylus pen or digital pen configured to allow for the digital capture of strokes drawn or written by a user on touch-screen display 115 or on another surface.
Some embodiments may include a stylus 120 that has no digital components, and is merely a physical tool used to interact with touch-screen 115, with touch-screen 115 recording movement of stylus 120 based on the location at which stylus 120 exerts pressure on touch-screen 115. In some embodiments, stylus 120 may include digital components to allow a user to control at least some aspects of their strokes, such as a colour of the produced stroke, for example, using user input components on stylus 120.
In some embodiments, such as in the illustrated embodiment, stylus 120 may include digital components that assist with tracking the position of stylus 120 with respect to a surface such as touch-screen display 115. For example, stylus 120 as illustrated includes a sensor 122 to detect the position and pen-up status of stylus 120. According to some embodiments, sensor 122 may be a pressure sensor. Stylus 120 further includes a communications module 124 for communicating the sensor readings from sensor 122 to communications module 116 of computing device 110, for processing by processor 111 executing cursive writing capture program 113. According to some embodiments, sensor 122 may record a sensor reading around 200 times per second. Communications module 124 may communicate the sensor readings generated by sensor 122 in real-time or periodically.
In some embodiments, stylus 120 may be configured to track its own position with respect to a reference point, such that stylus 120 can be used to digitally capture strokes drawn or written by a user on a non-touch-screen surface, such as on a tabletop or sheet of paper.
Computing device 110 further includes a communications module 116 for facilitating communication between computing device 110 and external devices. For example, system 100 includes a server 130 and a network 140. Network 140 may be the internet in some embodiments. Communications module 116 is configured to facilitate communication between computing device 110 and server 130 via network 140. Communications module 116 may facilitate communication via a wired communication protocol, such as USB or Ethernet, or via a wireless communication protocol, such as Wi-Fi, Bluetooth or NFC, for example.
Server 130 may be a single server, a server system, a cloud-based server or server system, or other computing device providing centralised services to computing devices such as computing device 110. Server 130 comprises a processor 131, and memory 132 accessible to processor 131. Memory 132 stores a writing processing program 133, which is executable by processor 131 to cause server 130 to process writing input received via network 140, such as from computing device 110. Server 130 includes a communications module 146 to facilitate communications between server 130 and other devices such as computing device 110 via network 140. Communications module 146 may facilitate communication via a wired communication protocol, such as USB or Ethernet, or via a wireless communication protocol, such as Wi-Fi, Bluetooth or NFC, for example.
System 100 may be configured to perform digital processing and recognition of handwritten cursive text entered by a user, as described in further detail below. For example, a user may use stylus 120 to handwrite on touch-screen display 115. Processor 111 executing handwriting capture program 113 causes the input signals as detected by touch-screen display 115 to be recorded and communicated to server 130 via network 140 by communications module 116. Server 130 receives the input signals at communications module 146. Processor 131 executes cursive writing processing program 133 to process the input signals, and translate them to digital text. The digital text is then sent back from communications module 146 of server 130 to communications module 116 of computing device 110 via network 140.
Figure 2 shows a method 200 of cursive writing capture and processing as performed by system 100. Method 200 begins at step 210, with a user using stylus 120 to write on touchscreen display 115 of computing device 110. At step 220, movement data generated by the relative movement between stylus 120 and touchscreen display 115 is captured by computing device 110. Specifically, touchscreen display 115 receives touch input, which is sent to handwriting capture program 113. Alternatively or additionally, sensor 122 receives touch input, which is communicated by communications module 124 of stylus 120 to handwriting capture program 113 via communications module 116 of computing device 110.
According to some embodiments, the data generated is in the form of a sequence of data points, with each data point including the values T, X, Y and P. T may represent a time, X may represent an x-coordinate or horizontal position of stylus 120 with respect to touchscreen display 115, Y may represent a y-coordinate or vertical position of stylus 120 with respect to touchscreen display 115, and P may represent a pen-up state. The number of data points or data quadruplets in a data sequence may depend on the time taken by the writer in writing the text, as well as on the sample rate of sensor 122 and touchscreen display 115. For example, a text written in 5 seconds with sampling at 200 samples a second may generate a sequence of 1000 data samples.
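The (T, X, Y, P) quadruplet described above can be modelled as a simple record, as sketched below in Python. The field names and the record type are illustrative assumptions, not taken from the specification; the sample-count arithmetic follows the 5-second, 200-samples-per-second example.

```python
from typing import NamedTuple

class Sample(NamedTuple):
    """One captured data point: time, position and pen state (illustrative)."""
    t: int  # T: sample time index
    x: int  # X: horizontal stylus position on touchscreen display 115
    y: int  # Y: vertical stylus position on touchscreen display 115
    p: int  # P: pen state (per Table 1, 0 indicates pen-up)

# A text written in 5 seconds, sampled at 200 samples per second,
# generates a sequence of 5 * 200 = 1000 data samples.
DURATION_S = 5
SAMPLE_RATE_HZ = 200
n_samples = DURATION_S * SAMPLE_RATE_HZ
print(n_samples)  # 1000
```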
At step 230, the captured movement data received by writing capture program 113 is sent to server 130 via network 140. Processor 111 executing writing capture program 113 may cause the captured movement data to be sent via communications module 116. According to some embodiments, the captured data may be sent to server 130 in real time. According to some embodiments, the data may be sent on a periodic basis. In some embodiments, the data may be sent after the occurrence of a trigger event, such as a pen-up event, or a threshold amount of time passing without new data being received.
At step 240, having received the movement data via communications module 146, server 130 processes the data to derive digital text data from the movement data. This step is performed by processor 131 executing cursive writing processing program 133, and is described in further detail below with respect to Figure 3. Having processed the data, at step 250 server 130 sends the digital text data back to computing device 110 via communications module 146 for further processing and/or display.
Figure 3 shows a method 300 performed by processor 131 executing writing processing program 133. Method 300 outputs a digital representation of handwritten text, as received in the form of digital data generated by stylus 120 and/or touchscreen display 115. While described embodiments show method 300 being executed entirely on server 130, according to some embodiments, some or all of the steps of method 300 may in fact be performed by processor 111 of computing device 110.
Method 300 starts at step 305, where data generated by touchscreen display 115 and/or sensor 122 is received by writing processing program 133, having been captured by cursive writing capture program 113 and transmitted to server 130 via network 140. At step 310 processor 131 determines whether all of the received data has been processed. If it has, at step 315 processor 131 stops processing the data. At step 317, the digital text sequence represented by the processed data is determined, as described in further detail below.
If processor 131 determines at step 310 that there is further unprocessed data, processor 131 instead proceeds to execute step 320. At step 320, processor 131 receives the next data point in the received data. The data point comprises four values, represented as T, X, Y and P. T represents a time, X represents an x-coordinate or horizontal position of stylus 120 with respect to touchscreen display 115, Y represents a y-coordinate or vertical position of stylus 120 with respect to touchscreen display 115, and P represents a pen-up state.
At step 325, processor 131 determines X and Y profiles for the data point. The X profile for the data point comprises the T, X and P values, while the Y profile comprises the T, Y and P values.
At step 330, processor 131 processes the X profile. This involves processor 131 estimating the relative X-velocity between stylus 120 and touchscreen display 115 at the time T, by determining the difference between the X value at T-1, and the X value at T+1.
At step 335, processor 131 then determines whether the current X profile corresponds to a pen-up, X-peak or X-valley. A pen-up status is determined by processor 131 analysing the P value for the current X-profile. According to some embodiments, a P value of 0 indicates a pen-up status, while a non-zero P value indicates a pen-down status.
Processor 131 may determine whether an X-peak or X-valley exists by comparing the current X-velocity with the velocity at T-1 and the velocity at T+1. If the current X velocity is larger than the velocity at T-1 and the velocity at T+1 by more than a threshold value, processor 131 determines that an X-peak exists. If the current X velocity is smaller than the velocity at T-1 and the velocity at T+1 by more than a threshold value, processor 131 determines that an X-valley exists. The threshold value may be 1, 2, 3, 4 or 5 according to some embodiments. Use of a threshold value allows processor 131 to reject small localised peaks and valleys as noise.
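The velocity and extremum tests described above can be sketched as follows. This is an illustrative Python rendering under stated assumptions, not the patented implementation: the central-difference velocity follows the worked example accompanying Table 1, and the same functions serve equally for the Y profile.

```python
def velocity(values, t):
    """Central-difference velocity at index t: v(t) = value(t+1) - value(t-1)."""
    return values[t + 1] - values[t - 1]

def classify_extremum(v_prev, v_curr, v_next, threshold=2):
    """Return 'peak', 'valley' or None for the middle of three velocities.

    A peak or valley must exceed both neighbours by more than `threshold`,
    which rejects small localised peaks and valleys as noise. The threshold
    of 2 is one of the example values (1 to 5) given in the text.
    """
    if v_curr > v_prev + threshold and v_curr > v_next + threshold:
        return "peak"
    if v_curr < v_prev - threshold and v_curr < v_next - threshold:
        return "valley"
    return None
```

Applying this to the Table 1 X-velocities around T=163 (-308, -345, -293) yields a valley, matching the B symbol recorded there, and the velocities around T=171 (194, 232, 205) yield a peak, matching the A symbol.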
If at step 335 processor 131 determines that a pen-up, X-peak or X-valley exists, then at step 340 processor 131 outputs a symbol corresponding to a pen-up, X-peak or X-valley condition. The output may be stored as a string in memory 132. According to some embodiments, the symbol corresponding to a pen-up condition may be a P. The symbol corresponding to an X-peak may be an A. The symbol corresponding to an X-valley condition may be a B. According to some embodiments, a different set of symbols may be used.
Whether or not a pen-up, X-valley or X-peak condition is determined at step 335, processor 131 then executes step 345 by processing the Y profile. This involves processor 131 estimating the relative Y-velocity between stylus 120 and touchscreen display 115 at the time T, by determining the difference between the Y value at T-1, and the Y value at T+1.
At step 350, processor 131 then determines whether the current Y profile corresponds to a pen-up, Y-peak or Y-valley. A pen-up status is determined by processor 131 analysing the P value for the current Y-profile. According to some embodiments, a P value of 0 indicates a pen-up status, while a non-zero P value indicates a pen-down status.
Processor 131 may determine whether a Y-peak or Y-valley exists by comparing the current Y-velocity with the velocity at T-1 and the velocity at T+1. If the current Y velocity is larger than the velocity at T-1 and the velocity at T+1 by more than a threshold value, processor 131 determines that a Y-peak exists. If the current Y velocity is smaller than the velocity at T-1 and the velocity at T+1 by more than a threshold value, processor 131 determines that a Y-valley exists. The threshold value may be 1, 2, 3, 4 or 5 according to some embodiments. Use of a threshold value allows processor 131 to reject small localised peaks and valleys as noise.
If at step 350 processor 131 determines that a pen-up, Y-peak or Y-valley exists, processor 131 outputs a symbol corresponding to the pen-up, Y-peak or Y-valley condition. The output may be stored as a string in memory 132, adding the output symbols to any symbols already stored. According to some embodiments, the symbol corresponding to a pen-up condition may be a P. The symbol corresponding to a Y-peak may be a C. The symbol corresponding to a Y-valley condition may be a D. According to some embodiments, a different set of symbols may be used.
Whether or not a pen-up, Y-valley or Y-peak condition is determined at step 350, processor 131 then re-executes step 310, by determining whether further unprocessed data exists. If there is further unprocessed data, processor 131 takes the next (T, X, Y, P) input, and continues the method starting at step 320. Once all of the data points have been processed, processor 131 determines that no further data exists, and stops processing at step 315.
The digital text sequence that has been created and output by processor 131 is determined at step 317, as a series of symbols, which according to the described embodiments would be a series containing the letters P, A, B, C and D. This series is then processed as described in further detail below. The text sequence or string is a digital representation of the handwriting represented by the processed data, and can be further processed to interpret the particular letters and numbers that have been handwritten.
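The scan performed by method 300 can be condensed into the sketch below. This is a hedged Python reconstruction, not the patented implementation: the pen-up encoding (P == 0 means pen-up, as in Table 1), the threshold value, the once-per-lift pen-up symbol, and the treatment of the first and last samples are assumptions.

```python
def to_symbol_string(samples, threshold=2):
    """Sketch of method 300: scan (t, x, y, p) samples in time order and
    emit A/B for X-peaks/valleys, C/D for Y-peaks/valleys, P for pen-up."""
    xs = [s[1] for s in samples]
    ys = [s[2] for s in samples]
    ps = [s[3] for s in samples]

    def vel(vals, t):   # central difference: v(t) = vals[t+1] - vals[t-1]
        return vals[t + 1] - vals[t - 1]

    def kind(vals, t):  # 'peak', 'valley' or None at index t
        v0, v1, v2 = vel(vals, t - 1), vel(vals, t), vel(vals, t + 1)
        if v1 > v0 + threshold and v1 > v2 + threshold:
            return "peak"
        if v1 < v0 - threshold and v1 < v2 - threshold:
            return "valley"
        return None

    out = []
    pen_was_up = False
    for t in range(2, len(samples) - 2):
        if ps[t] == 0:              # pen-up: counted once per lift
            if not pen_was_up:
                out.append("P")
                pen_was_up = True
            continue
        pen_was_up = False
        kx, ky = kind(xs, t), kind(ys, t)
        if kx:
            out.append("A" if kx == "peak" else "B")
        if ky:
            out.append("C" if ky == "peak" else "D")
    return "".join(out)

# Synthetic stroke: a burst of rightward motion, then the pen lifts.
xs = [0, 0, 0, 10, 0, 0, 0, 0, 0]
ps = [1023] * 6 + [0] * 3
samples = [(t, x, 0, p) for t, (x, p) in enumerate(zip(xs, ps))]
print(to_symbol_string(samples))  # ABP
```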
An example set of data points processed by processor 131 is reproduced below in Table 1. Column 1 shows the time value T, column 2 shows the X-value, column 3 shows the Y-value, and column 4 shows the P state. Column 5 shows XV, being the X velocity value, calculated by subtracting the X-value at T-1 from the X-value at T+1. For example, the X-value at T=160 is 5663, and the X-value at T=162 is 5429. The XV value for T=161 can therefore be found by:
XV at T=161 = (X-value at T+1) - (X-value at T-1)
            = (X-value at T=162) - (X-value at T=160)
            = 5429 - 5663
            = -234
Similarly, column 6 shows YV, being the Y velocity value, calculated by subtracting the Y-value at T-1 from the Y-value at T+1. Column 7 shows the extrema value output by processor 131 executing handwriting processing program 133 and performing method 300. If processor 131 were to process each of the data points in Table 1, the output string would therefore be DBACBPAC.
TIME  X Value  Y Value  P (Pen Contact)  XV (X Velocity)  YV (Y Velocity)  Extrema Symbol
160   5663     6729     1023             NA               NA
161   5572     6691     1023             -234             -38
162   5429     6635     1023             -308             -56              D
163   5264     6582     1023             -345             -53              B
164   5084     6530     1023             -293             -52
165   4971     6484     1023             -155             -46
166   4929     6463     1023             -36              -21
167   4935     6454     1023             20               -9
168   4949     6446     1023             67               -8
169   5002     6438     1023             132              -8
170   5081     6438     1023             194              0
171   5196     6438     1023             232              0                A
172   5313     6448     1023             205              10               C
173   5401     6455     1023             157              7
174   5470     6461     1023             120              6
175   5521     6469     1023             76               8
176   5546     6469     1023             33               0
177   5554     6474     1023             8                5                B
178   5554     6474     869              0                0
179   5554     6474     594              0                0
180   5554     6474     44               0                0
181   5554     6474     0                0                0                P
182   5571     6487     0                0                0
183   5598     6499     0                0                0
184   5634     6499     0                0                0
185   5678     6508     0                0                0
186   5723     6508     0                0                0
187   5758     6514     0                0                0
188   5758     6514     345              39               0
189   5797     6534     694              39               20
190   5797     6555     759              0                21
191   5797     6605     879              25               50
192   5822     6674     976              72               69
193   5869     6774     1023             108              100
194   5930     6913     1023             142              139
195   6011     7079     1023             162              166              AC
196   6092     7245     1023             125              166
197   6136     7412     1023             64               167
198   6156     7554     1023             20               142
199   6156     7679     1023             NA               NA
Table 1: example data stream processed by processor 131 performing method 300 by executing cursive writing processing program 133
According to some embodiments, processor 131 will never output the same symbol twice in a row, as no peaks or valleys should occur twice in a row, and a pen-up status is only counted once regardless of how long the pen-up status lasts.
Figure 4 shows an example input 400 that might be drawn by a user, demonstrating a symbol output string that may be generated by processor 131 when processing data generated by a stylus 120 drawing input 400.
The user may put their pen down at point 405, and begin by moving the pen diagonally to the left and upwards. Processor 131 would therefore generate symbols A and D when processing point 405. Between points 405 and 410, no pen-ups, peaks or valleys are detected. At point 410, processor 131 determines a Y-peak to occur, outputting a symbol C. Between points 410 and 415, no pen-ups, peaks or valleys are detected, until processor 131 determines an X-valley to occur at point 415, outputting a symbol B. This continues with processor 131 determining a Y-valley and outputting a symbol D at point 420, determining an X-peak and outputting a symbol A at point 425, determining a Y-peak and outputting a symbol C at point 430, determining an X-valley and outputting a symbol B at point 435, determining an X-peak and outputting a symbol A at point 440, determining a Y-valley and outputting a symbol D at point 445, determining an X-valley and outputting a symbol B at point 450, and finally determining an X-peak and a Y-peak and outputting symbols A and C at point 455. At that point, processor 131 may also determine a pen-up status, and output a symbol P.
Having processed each of the data points generated by stylus 120 and/or touchscreen 115 resulting from a user drawing the input 400 on touchscreen 115 using stylus 120 by executing method 300, the string output by processor 131 may therefore be: ADCBDACBADBACP.
Figure 5 shows a symbolic representation 500 of the directions of movement made by a stylus 120 with respect to touchscreen display 115 and the corresponding output symbols when the output symbols are interpreted as pairs. For example, the symbols ABC can be interpreted as pairs of symbols A-B and B-C, indicating movement from an X-peak to an X-valley and from an X-valley to a Y-peak. The symbols A, B, C and D combine to give 12 unique pairs, being: AB, AC, AD, BA, BC, BD, CA, CB, CD, DA, DB and DC.
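The twelve pairs listed above are simply the ordered two-symbol combinations of distinct symbols drawn from A, B, C and D (a symbol never repeats consecutively, as noted earlier). A quick illustrative check:

```python
from itertools import permutations

# Ordered pairs of distinct symbols drawn from {A, B, C, D}.
pairs = ["".join(p) for p in permutations("ABCD", 2)]
print(len(pairs))  # 12
print(pairs)
# ['AB', 'AC', 'AD', 'BA', 'BC', 'BD', 'CA', 'CB', 'CD', 'DA', 'DB', 'DC']
```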
Starting at segment 505, pairs D-A or D-C may indicate movement in a generally upward or Northward direction. Segment 510 indicates that pair B-C indicates movement in a generally upward and rightward, or North-East direction. Segments 515 and 520 indicate that pair B-A indicates movement in a generally rightward, or Eastward direction.
Segment 525 indicates that pair B-D indicates movement in a generally downward and rightward, or South-East direction. Segment 530 indicates that pairs C-A and C-D indicate movement in a generally downward, or Southward direction. Segment 535 indicates that pairs C-B and C-D also indicate movement in a generally downward, or Southward direction.
Segment 540 indicates that pair A-D indicates movement in a generally downward and leftward, or South-West direction. Segments 545 and 550 indicate that pair A-B indicates movement in a generally leftward, or Westward direction.
Segment 555 indicates that pair A-C indicates movement in a generally upward and leftward, or North-West direction. Segment 560 indicates that pairs D-A and D-C indicate movement in a generally upward, or Northward direction.
While representation 500 shows 12 different directions, it is noted that these directions do not necessarily correspond exactly to every possible movement that could produce the indicated symbol pair. However, the motion usually lies within the quadrant in which the symbols are shown. If the direction of motion were represented too precisely, the system might cope poorly with natural variation in handwritten characters.
The symbol pairings and relative directions are summarised below in Table 2.
Pair  Direction              Stylus motion
AB    x peak to x valley     Stylus goes W or NW or SW
AC    x peak to y peak       Stylus goes NW
AD    x peak to y valley     Stylus goes SW
BA    x valley to x peak     Stylus goes E or NE or SE
BC    x valley to y peak     Stylus goes NE
BD    x valley to y valley   Stylus goes SE
CA    y peak to x peak       Stylus goes SE
CB    y peak to x valley     Stylus goes SW
CD    y peak to y valley     Stylus goes S or SE or SW
DA    y valley to x peak     Stylus goes NE
DB    y valley to x valley   Stylus goes NW
DC    y valley to y peak     Stylus goes N, NE or NW
Table 2: summary of directions corresponding to symbol pairings
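Table 2 amounts to a lookup from symbol pairs to approximate compass directions. A minimal sketch of that lookup (the names `PAIR_DIRECTION` and `directions` are illustrative assumptions; where Table 2 lists several directions for a pair, only the central one is kept):

```python
# Approximate compass direction for each symbol pair (per Table 2)
PAIR_DIRECTION = {
    "AB": "W",  "AC": "NW", "AD": "SW",
    "BA": "E",  "BC": "NE", "BD": "SE",
    "CA": "SE", "CB": "SW", "CD": "S",
    "DA": "NE", "DB": "NW", "DC": "N",
}

def directions(symbols):
    """Map consecutive symbol pairs in an output string to approximate
    compass directions, ignoring the pen-up symbol 'P'."""
    s = symbols.replace("P", "")
    return [PAIR_DIRECTION[s[i:i + 2]] for i in range(len(s) - 1)]
```

For instance, the string "ADCP" contains the pairs A-D and D-C, giving the direction sequence South-West then North.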
Having created an output string, being a digital representation of the input handwriting, the output string can be compared to a database of common strings to determine the particular letters written by the user using device 110 and stylus 120. For example, the string "ADCBDACBADBACP" may correspond to a lower case letter "G", as described above with reference to Figure 4. In some embodiments, machine learning may be used to determine which sequence or sequences correspond to which particular letter, number, or symbol. In some embodiments, training data may be supplied to the system, to allow a training database of characters and their digital representations to be compiled. The database can then be searched to determine what character a particular digital string correlates to.
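The database lookup described above can be sketched as a simple mapping from digital strings to characters. The database contents and names here are hypothetical, shown only to illustrate the matching step; a real system would populate the database from training data as described.

```python
# Hypothetical training database of digital representations;
# in practice this would be compiled from training data
TRAINING_DB = {
    "ADCBDACBADBACP": "g",   # the lower-case 'g' example of Figure 4
}

def recognise(symbols, db=TRAINING_DB):
    """Return the character matching a digital string, or None if the
    string is not present in the database."""
    return db.get(symbols)
```

A string produced by the encoding step is then recognised with a single lookup, e.g. `recognise("ADCBDACBADBACP")` returns `"g"`.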
Using the process described above, the method can be applied to any symbols, including letters of languages other than English. Furthermore, because the method analyses only the local peaks and valleys of a piece of handwritten text, it is invariant to the position and size of that text.
It will be appreciated by persons skilled in the art that numerous variations and/or modifications may be made to the above-described embodiments, without departing from the broad general scope of the present disclosure. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive.

Claims (1)

  1. A computer implemented method for digital representation of handwritten cursive text, the method comprising: receiving at least one input relating to a piece of cursive written text, the input comprising at least two coordinate values and a time value; based on the at least one input and at least one previously received input, determining whether a local peak or valley occurred; and if a local peak or valley is determined to have occurred, outputting a corresponding symbol.
    2. The method of claim 1, wherein the at least one input further comprises a pen-up value, the method further comprising determining whether a pen-up event occurred, and if a pen-up event occurred, outputting a corresponding pen-up symbol.
    3. The method of claim 1 or claim 2, wherein the at least two coordinates comprise an X coordinate and a Y coordinate, and wherein determining whether a local peak or valley occurred comprises determining whether a local X peak or X valley occurred and determining whether a local Y peak or Y valley occurred.
    4. The method of claim 3, wherein determining whether a local X peak or X valley occurred comprises generating an X profile comprising the X value and the time value, and comparing the X profile with a previously generated X profile.
    5. The method of claim 4, further comprising: determining an X velocity, being the difference between the X value at a time T and the X value at a time T-1; determining an X peak if the X velocity is larger than the X velocity at T-1 and the X velocity at T+1 by more than a threshold value; and determining an X valley if the X velocity is smaller than the X velocity at T-1 and the X velocity at T+1 by more than a threshold value.
    6. The method of any one of claims 3 to 5, wherein determining whether a local Y peak or Y valley occurred comprises generating a Y profile comprising the Y value and the time value, and comparing the Y profile with a previously generated Y profile.
    7. The method of claim 6, further comprising: determining a Y velocity, being the difference between the Y value at a time T and the Y value at a time T-1; determining a Y peak if the Y velocity is larger than the Y velocity at T-1 and the Y velocity at T+1 by more than a threshold value; and determining a Y valley if the Y velocity is smaller than the Y velocity at T-1 and the Y velocity at T+1 by more than a threshold value.
    8. The method of any one of claims 1 to 7, wherein the output symbols are output to a string corresponding to a digital representation of the handwritten text.
    9. The method of claim 8, further comprising comparing at least a portion of the string with a database of strings to determine at least one character corresponding to the handwritten text.
    10. A computing device comprising memory storing program code and a processor, wherein the memory is accessible by the processor and wherein, when the processor executes the program code, the processor is configured to perform a method comprising: receiving at least one input relating to a piece of handwritten text, the input comprising at least two coordinate values and a time value; based on the at least one input and at least one previously received input, determining whether a local peak or valley occurred; and if a local peak or valley is determined to have occurred, outputting a corresponding symbol.
    11. The computing device of claim 10, further comprising a communications module, wherein the at least one input is received via the communications module.
    12. The computing device of claim 11, wherein the computing device is a server, and the at least one input is received from a remote computing device.
    13. The computing device of any one of claims 10 to 12, wherein at least part of the at least one input is generated by a touchscreen device.
    14. The computing device of any one of claims 10 to 13, wherein at least part of the at least one input is generated by a stylus device.
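The velocity-based peak and valley tests recited in claims 5 and 7 can be sketched as follows. This is an editorial illustration under stated assumptions, not part of the claims; the function names are hypothetical.

```python
def velocities(values):
    """Velocity per claims 5 and 7: the difference between successive
    coordinate values, i.e. value at time T minus value at time T-1."""
    return [b - a for a, b in zip(values, values[1:])]

def is_velocity_peak(v_prev, v, v_next, threshold):
    """A velocity v is a peak when it exceeds both its neighbouring
    velocities (at T-1 and T+1) by more than the threshold."""
    return v - v_prev > threshold and v - v_next > threshold

def is_velocity_valley(v_prev, v, v_next, threshold):
    """A velocity v is a valley when it is below both neighbouring
    velocities by more than the threshold."""
    return v_prev - v > threshold and v_next - v > threshold
```

The threshold suppresses spurious extrema caused by small fluctuations in the sampled coordinates: a velocity that exceeds its neighbours by less than the threshold is not reported as a peak.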
Figure 1 (sheet 1/5) depicts computing device 110, comprising processor 111, memory 112, handwriting capture program 113, user I/O 114, touch-screen display 115 and communications module 116; stylus 120; network 140; and server 130, comprising processor 131, memory 132, handwriting processing program 133 and communications module 146.
AU2020233775A 2019-09-19 2020-09-18 Systems and methods for digital representation and recognition of online cursive writing Pending AU2020233775A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2019903491A AU2019903491A0 (en) 2019-09-19 Systems and methods for digital recognition and representation of handwritten text
AU2019903491 2019-09-19

Publications (1)

Publication Number Publication Date
AU2020233775A1 true AU2020233775A1 (en) 2021-04-08

Family

ID=75280484

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2020233775A Pending AU2020233775A1 (en) 2019-09-19 2020-09-18 Systems and methods for digital representation and recognition of online cursive writing

Country Status (1)

Country Link
AU (1) AU2020233775A1 (en)
