WO2004029866A1 - Method and system for three-dimensional handwriting recognition - Google Patents

Method and system for three-dimensional handwriting recognition

Info

Publication number
WO2004029866A1
Authority
WO
WIPO (PCT)
Prior art keywords
motion
motion data
images
handwriting recognition
deriving
Prior art date
Application number
PCT/IB2003/004102
Other languages
French (fr)
Inventor
Yonggang Du
Jiawen Tu
Lei Feng
Xiaoling Shao
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Priority to AU2003260877A priority Critical patent/AU2003260877A1/en
Priority to JP2004539329A priority patent/JP2006500680A/en
Priority to US10/528,938 priority patent/US8150162B2/en
Priority to EP03798309.5A priority patent/EP1546993B1/en
Publication of WO2004029866A1 publication Critical patent/WO2004029866A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/22Character recognition characterised by the type of writing
    • G06V30/228Character recognition characterised by the type of writing of three-dimensional handwriting, e.g. writing in the air
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/14Image acquisition
    • G06V30/142Image acquisition using hand-held instruments; Constructional details of the instruments
    • G06V30/1423Image acquisition using hand-held instruments; Constructional details of the instruments the instrument generating sequences of position coordinates corresponding to handwriting

Definitions

  • the invention relates generally to handwriting recognition technologies, and more particularly to three-dimensional (3D) handwriting recognition methods and systems.
  • Handwriting recognition is a technique by which an intelligent system can recognize characters and other symbols written by hand. This technique has been very popular since it frees users from the keyboard, allowing them to write and draw in a more natural way. With increasing demands from users, more and more devices now incorporate handwriting recognition systems to give users a natural input experience. Handwriting recognition is particularly popular in handheld devices that provide handwriting recognition of words with complex structures, e.g., Chinese characters, which are very time consuming to input using a conventional keyboard.
  • a handwriting input device provides a user with a friendly way of inputting information.
  • the minimum requirement for an inputting device is a mouse.
  • the user needs to press and hold down the primary mouse button, and then move the mouse pointer to form strokes of a word or character for generating the final word or character.
  • Popular handwriting input devices, such as a pen stylus and tablet, are used on conventional handheld devices such as PDAs, or are connected to a computer through a serial or USB port. Handheld devices often use a pen stylus and a touch screen as a pad to allow users to perform recognition functions. Most handheld devices, such as PDAs, are equipped with this kind of input device.
  • Another kind of handwriting input device includes a pen that allows users to transfer data into a receiving unit, such as a cellular phone, a PDA or a PC by simply writing or drawing in their own natural handwriting on the comfort and space of a regular piece of paper.
  • the present invention gives users more flexibility and a more enjoyable writing experience by allowing them to freely write words or characters in a 3D space in a touchless way, without requiring any physical medium such as a pad or a tablet.
  • a handwriting recognition system that comprises an input device and a recognition device in communication with the input device.
  • the input device includes a three-dimensional (3D) motion detection sensor configured to generate 3D motion data in response to a 3D motion.
  • the motion detection sensor measures acceleration of the 3D motion in X, Y and Z axial directions to generate the 3D motion data.
  • the recognition device is configured to receive (e.g., by wireless means) the 3D motion data from the input device and derive corresponding two-dimensional (2D) images for handwriting recognition, based on the 3D motion data.
  • the recognition device calculates corresponding 3D coordinates based on the 3D motion data, constructs corresponding 3D tracks based on the 3D coordinates, and derives the corresponding 2D images from the 3D tracks by mapping the 3D tracks onto a 2D plane for handwriting recognition.
  • FIG. 1 shows a three-dimensional handwriting recognition system according to one embodiment of the invention
  • FIG. 2 is a flow chart diagram illustrating a recognition process according to one embodiment of the invention.
  • FIG. 3A shows a 2D image of a Chinese character derived by mapping 3D tracks onto a 2D projection plane
  • FIG. 3B shows a final result of handwriting recognition process based on the 2D image in FIG. 3A;
  • FIG. 4 shows an external design of a 3D handwriting input device according to one embodiment of the invention.
  • FIG. 5 illustrates how the input device may be mounted.
  • FIG. 1 shows a three-dimensional handwriting recognition system 10 according to one embodiment of the invention.
  • system 10 comprises a handwriting input device 20, a recognition device 30 and an output device 40.
  • Input device 20 includes a 3D motion detection sensor 22, a control circuit 26 and a communication interface 28.
  • Recognition device 30 includes a processor 32, a memory 34, a storage device 36, and a communication interface 38.
  • a user moves input device 20 to freely write words or characters in a 3D space, e.g., in the air.
  • Motion detection sensor 22 detects the 3D motion and communicates the 3D motion data and a sampling rate to recognition device 30 for handwriting recognition via a communication interface 28, such as Bluetooth, Zigbee, IEEE 802.11, infrared, or a USB port.
  • the sampling rate may be a predetermined value set by an end user or a manufacturer based on factors such as the processing capability of the system.
  • the sampling rate may be dynamically determined and adjusted based on, for example, the speed of the motion. This can be done, for example, by first determining the speed of the initial motion associated with the handwriting.
  • the recognition device can dynamically adjust the sampling rate based on the speed of the motion at the last sampling point. The higher the speed is, the higher the sampling rate will be, and vice versa.
  • the accuracy of the handwriting recognition can be improved since only the optimal number of sampling points will be used for constructing the word or character. Further, power consumption is reduced.
  • Based on the motion data and the sampling rate received from input device 20, processor 32 calculates the corresponding 3D coordinates on the X, Y and Z axes and stores these coordinates in storage device 36. Using the calculated coordinates, processor 32 constructs the corresponding 3D tracks. The 3D tracks will then be projected onto a 2D plane to form 2D images which will be recognized using conventional handwriting recognition software. The final result is displayed on output device 40.
  • control circuit 26 of input device 20 provides a control signal to recognition device 30 via interface 28 to indicate separation of individual words or characters upon receiving a user provided external input. For example, the user may press a control button to cause control circuit 26 to generate the control signal after completion of writing a word or a character.
  • motion detection sensor 22 detects the 3D motion by measuring the acceleration of the movement along the X, Y and Z axes.
  • the piezoresistive-type tri-axial accelerating sensor commercially available from Hitachi Metals, Ltd., Tokyo, Japan, may be used as motion detection sensor 22.
  • This accelerating sensor in the form of an IC chip has the ability to simultaneously detect acceleration in the three axial directions (X, Y and Z).
  • the sensor is highly sensitive and shock resistant and is a very small and thin semiconductor-type tri-axial accelerating sensor. More information about this accelerating sensor is available on the following website http://www.hitachi-metals.co.jp/e/prod/prod06/p06_10.html, which is hereby incorporated by reference.
  • FIG. 2 is a flow chart diagram illustrating a recognition process 100 performed by recognition device 30, according to one embodiment of the invention.
  • recognition device 30 receives the 3D motion data (e.g., the acceleration data of the movement in the X, Y and Z directions) and the sampling rate from input device 20 (step 102).
  • processor 32 calculates the corresponding 3D coordinates on the X, Y and Z axes for each sampling point using the starting point of the movement as the origin (step 106). Each sampling point is also used as a reference point for calculating the coordinates of the following sampling point.
  • Calculation of the 3D coordinates is continuously performed based on the incoming 3D motion data until processor 32 detects receipt of a control signal (step 112).
  • the control signal indicates completion of writing a word or a character.
  • the corresponding 3D tracks are constructed using the 3D coordinates (step 116) and are then mapped onto a 2D plane (step 122). Thereafter, conventional 2D handwriting recognition is performed (step 126).
  • to map the 3D tracks onto a 2D plane at step 122 it is necessary to first find a proper 2D projection plane.
  • a proper 2D projection plane is separately derived for each word or character.
  • a proper 2D projection plane is the plane for which the sum of the squared distances from the sampling points is minimal.
  • the coordinates of n sampling points are known as follows: (x1, y1, z1), (x2, y2, z2), ..., (xn, yn, zn), and the equation of the plane is Ax + By + Cz + D = 0.
  • A, B, C and D can be determined using the Lagrange multiplication method.
  • G(A, B, C, D) = F'(A, B, C, D) + λ(A² + B² + C² - 1), where λ is a Lagrange multiplier.
  • the following corresponding 2D coordinates are obtained: {(0.0001, 0), (0.49, -1), (1, -2.0001), (0, 0), (2, -0.13), (3.5001, -0.14), (3.7601, -0.31), (2.7401, -1.26), (1.3801, 2.24), (2.5001, -2), (1.746, 1), (2, -1.5001), (1.876, -4.5001)}.
  • the corresponding 2D image is projected onto the 2D projection plane.
  • the result of this Chinese character is shown in FIG. 3A.
  • the final result is generated by performing a conventional 2D handwriting recognition process, as shown in FIG. 3B.
  • the Lagrange multiplication method is used to obtain the values of A, B, C and D.
  • Other methods, such as a linear regression method, may also be used.
  • FIG. 4 shows an external design of a 3D handwriting input device 200 according to one embodiment of the invention.
  • input device 200 includes a housing 210 that contains the electronics parts of the device (such as a 3D motion detection sensor IC chip), a control button 212 for allowing a user to input a control signal to indicate completion of writing a word or character, and a band 220 for mounting input device 200 on the user's finger.
  • FIG. 5 illustrates how input device 200 may be mounted.
  • device 200 is mounted on a finger 232 of a user's hand 230. By mounting it on the finger, the user can simply move the finger to write any word or character in a 3D space.
  • the 3D motion data will be wirelessly transmitted to a recognition device for handwriting recognition.
  • the input device and the recognition device can be integrated together as a single unit (e.g., a pen) that operates in the same manner as described above.
  • the final recognition results will be transmitted to an intelligent apparatus such as a PDA, a laptop computer, a PC, etc.
  • the invention can also be used on a 2D plane if the user so chooses. In such a case, the coordinates calculated will be for a 2D plane and the step of mapping the 3D tracks onto a 2D plane is omitted.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Character Discrimination (AREA)
  • Storage Device Security (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention provides a 3D handwriting recognition system that allows users to freely write words or characters in a 3D space in a touchless manner, without requiring any physical medium such as a pad or a tablet. The users' handwriting input in a 3D space will be tracked by an input device of the system that generates corresponding 3D motion data and wirelessly transfers the 3D motion data to a recognition device of the system. The 3D motion data will be converted and then mapped onto a 2D plane to generate corresponding 2D images for handwriting recognition. In this way, the users' input will never be limited to any screen, pad or plane, and the users will have more flexibility and a more enjoyable writing experience.

Description

METHOD AND SYSTEM FOR THREE-DIMENSIONAL HANDWRITING RECOGNITION
BACKGROUND OF THE INVENTION
The invention relates generally to handwriting recognition technologies, and more particularly to three-dimensional (3D) handwriting recognition methods and systems.
Handwriting recognition is a technique by which an intelligent system can recognize characters and other symbols written by hand. This technique has been very popular since it frees users from the keyboard, allowing them to write and draw in a more natural way. With increasing demands from users, more and more devices now incorporate handwriting recognition systems to give users a natural input experience. Handwriting recognition is particularly popular in handheld devices that provide handwriting recognition of words with complex structures, e.g., Chinese characters, which are very time consuming to input using a conventional keyboard.
A handwriting input device provides a user with a friendly way of inputting information. At present, the minimum requirement for an inputting device is a mouse. To write with the mouse, the user needs to press and hold down the primary mouse button, and then move the mouse pointer to form strokes of a word or character for generating the final word or character. Popular handwriting input devices, such as a pen stylus and tablet, are used on conventional handheld devices such as PDAs, or are connected to a computer through a serial or USB port. Handheld devices often use a pen stylus and a touch screen as a pad to allow users to perform recognition functions. Most handheld devices, such as PDAs, are equipped with this kind of input device.
Another kind of handwriting input device includes a pen that allows users to transfer data into a receiving unit, such as a cellular phone, a PDA or a PC by simply writing or drawing in their own natural handwriting on the comfort and space of a regular piece of paper.
At present, all conventional handwriting input devices adopt a two-dimensional input method. Users have to write on a physical medium, such as a tablet, a touch screen, or a paper pad. This limits the choices available for users. For example, if a user wants to write some comments during a presentation or a lecture, he or she would have to first find a physical medium, e.g., a table, a paper pad, etc. This can cause much inconvenience for the user while standing in the room, giving the presentation or lecture. Also, in a mobile environment, such as in a car, a bus or a subway, it would be very inconvenient to "write" on a physical medium using a stylus. Therefore, there is a need to provide an improved handwriting recognition system that gives users more flexibility and convenience and frees the users from the physical medium required for two-dimensional handwriting recognition.
SUMMARY OF THE INVENTION
The present invention gives users more flexibility and a more enjoyable writing experience by allowing them to freely write words or characters in a 3D space in a touchless way, without requiring any physical medium such as a pad or a tablet.
According to the invention, there is provided a handwriting recognition system that comprises an input device and a recognition device in communication with the input device. The input device includes a three-dimensional (3D) motion detection sensor configured to generate 3D motion data in response to a 3D motion. In one embodiment, the motion detection sensor measures acceleration of the 3D motion in X, Y and Z axial directions to generate the 3D motion data. The recognition device is configured to receive (e.g., by wireless means) the 3D motion data from the input device and derive corresponding two-dimensional (2D) images for handwriting recognition, based on the 3D motion data.
According to one embodiment of the invention, the recognition device calculates corresponding 3D coordinates based on the 3D motion data, constructs corresponding 3D tracks based on the 3D coordinates, and derives the corresponding 2D images from the 3D tracks by mapping the 3D tracks onto a 2D plane for handwriting recognition.
Other objects and attainments together with a fuller understanding of the invention will become apparent and appreciated by referring to the following description and claims taken in conjunction with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention is explained in further detail, and by way of example, with reference to the accompanying drawings wherein:
FIG. 1 shows a three-dimensional handwriting recognition system according to one embodiment of the invention;
FIG. 2 is a flow chart diagram illustrating a recognition process according to one embodiment of the invention;
FIG. 3A shows a 2D image of a Chinese character derived by mapping 3D tracks onto a 2D projection plane;
FIG. 3B shows a final result of handwriting recognition process based on the 2D image in FIG. 3A;
FIG. 4 shows an external design of a 3D handwriting input device according to one embodiment of the invention; and
FIG. 5 illustrates how the input device may be mounted.
Throughout the drawings, the same reference numerals indicate similar or corresponding features or functions.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
FIG. 1 shows a three-dimensional handwriting recognition system 10 according to one embodiment of the invention. As illustrated, system 10 comprises a handwriting input device 20, a recognition device 30 and an output device 40. Input device 20 includes a 3D motion detection sensor 22, a control circuit 26 and a communication interface 28. Recognition device 30 includes a processor 32, a memory 34, a storage device 36, and a communication interface 38. For simplicity, other conventional elements are not shown in FIG. 1.
In operation, a user moves input device 20 to freely write words or characters in a 3D space, e.g., in the air. Motion detection sensor 22 detects the 3D motion and communicates the 3D motion data and a sampling rate to recognition device 30 for handwriting recognition via a communication interface 28, such as Bluetooth, Zigbee, IEEE 802.11, infrared, or a USB port. The sampling rate may be a predetermined value set by an end user or a manufacturer based on factors such as the processing capability of the system. Alternatively, the sampling rate may be dynamically determined and adjusted based on, for example, the speed of the motion. This can be done, for example, by first determining the speed of the initial motion associated with the handwriting. Then the recognition device can dynamically adjust the sampling rate based on the speed of the motion at the last sampling point. The higher the speed is, the higher the sampling rate will be, and vice versa. By dynamically adjusting the sampling rate, the accuracy of the handwriting recognition can be improved since only the optimal number of sampling points will be used for constructing the word or character. Further, power consumption is reduced.
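The patent does not prescribe a concrete adjustment rule beyond "the higher the speed, the higher the sampling rate"; the following Python sketch shows one minimal way such a rule could be implemented. The rate limits, gain and smoothing factor are purely illustrative assumptions.

```python
def adjust_sampling_rate(current_rate_hz: float, speed: float,
                         min_rate_hz: float = 20.0,
                         max_rate_hz: float = 200.0,
                         hz_per_unit_speed: float = 50.0) -> float:
    """Return a new sampling rate based on the motion speed at the last sampling point.

    Faster motion -> more sampling points (finer stroke detail); slower motion ->
    fewer points and lower power consumption. All constants are illustrative.
    """
    target_rate = min_rate_hz + hz_per_unit_speed * speed
    # Blend toward the target so the rate does not jump abruptly between samples.
    new_rate = 0.5 * current_rate_hz + 0.5 * target_rate
    return max(min_rate_hz, min(max_rate_hz, new_rate))
```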
Based on the motion data and the sampling rate received from input device 20, processor 32 calculates the corresponding 3D coordinates on the X, Y and Z axes and stores these coordinates in storage device 36. Using the calculated coordinates, processor 32 constructs the corresponding 3D tracks. The 3D tracks will then be projected onto a 2D plane to form 2D images which will be recognized using conventional handwriting recognition software. The final result is displayed on output device 40.
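As a rough illustration of this step, the sketch below double-integrates the per-sample accelerations to obtain 3D coordinates, taking the starting point of the movement as the origin and using each sampling point as the reference for the next (as described for step 106 below). The constant-acceleration update is an assumption, since the patent does not specify an integration scheme, and a practical implementation would also need gravity compensation and drift correction.

```python
def accelerations_to_coordinates(accels, sampling_rate_hz):
    """Convert a sequence of (ax, ay, az) acceleration samples into 3D coordinates.

    The starting point of the movement is the origin; position and velocity are
    propagated over each sampling interval of length 1 / sampling_rate_hz.
    """
    dt = 1.0 / sampling_rate_hz
    pos = [0.0, 0.0, 0.0]
    vel = [0.0, 0.0, 0.0]
    coords = [tuple(pos)]
    for ax, ay, az in accels:
        for axis, a in enumerate((ax, ay, az)):
            # Constant acceleration over one interval: x += v*dt + a*dt^2/2, then v += a*dt.
            pos[axis] += vel[axis] * dt + 0.5 * a * dt * dt
            vel[axis] += a * dt
        coords.append(tuple(pos))
    return coords
```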
Since 3D writing is a continuous process, control circuit 26 of input device 20 provides a control signal to recognition device 30 via interface 28 to indicate separation of individual words or characters upon receiving a user provided external input. For example, the user may press a control button to cause control circuit 26 to generate the control signal after completion of writing a word or a character.
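To make the data flow between the two devices concrete, the following sketch models one possible form of what input device 20 sends to recognition device 30: the acceleration samples, the sampling rate, and the control signal just described. The class and field names (MotionSample, MotionPacket, end_of_character) are illustrative assumptions, not terminology from the patent.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MotionSample:
    """One reading from 3D motion detection sensor 22: acceleration along X, Y and Z."""
    ax: float
    ay: float
    az: float

@dataclass
class MotionPacket:
    """A unit of data transmitted (e.g., wirelessly) from the input device."""
    sampling_rate_hz: float                                    # rate at which the samples were taken
    samples: List[MotionSample] = field(default_factory=list)  # acceleration readings
    end_of_character: bool = False                             # set when the control button is pressed
```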
According to a specific embodiment of the invention, motion detection sensor 22 detects the 3D motion by measuring the acceleration of the movement along the X, Y and Z axes. As an example, the piezoresistive-type tri-axial accelerating sensor commercially available from Hitachi Metals, Ltd., Tokyo, Japan, may be used as motion detection sensor 22. This accelerating sensor, in the form of an IC chip, can simultaneously detect acceleration in the three axial directions (X, Y and Z). The sensor is highly sensitive and shock resistant and is a very small and thin semiconductor-type tri-axial accelerating sensor. More information about this accelerating sensor is available on the following website http://www.hitachi-metals.co.jp/e/prod/prod06/p06_10.html, which is hereby incorporated by reference.
FIG. 2 is a flow chart diagram illustrating a recognition process 100 performed by recognition device 30, according to one embodiment of the invention. In FIG. 2, recognition device 30 receives the 3D motion data (e.g., the acceleration data of the movement in the X, Y and Z directions) and the sampling rate from input device 20 (step 102). Based on the information received, processor 32 calculates the corresponding 3D coordinates on the X, Y and Z axes for each sampling point using the starting point of the movement as the origin (step 106). Each sampling point is also used as a reference point for calculating the coordinates of the following sampling point.
Calculation of the 3D coordinates is continuously performed based on the incoming 3D motion data until processor 32 detects receipt of a control signal (step 112). The control signal indicates completion of writing a word or a character. The corresponding 3D tracks are constructed using the 3D coordinates (step 116) and are then mapped onto a 2D plane (step 122). Thereafter, conventional 2D handwriting recognition is performed (step 126). In the above recognition process 100, to map the 3D tracks onto a 2D plane at step 122, it is necessary to first find a proper 2D projection plane. In a preferred embodiment of the invention, a proper 2D projection plane is separately derived for each word or character.
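Putting steps 102 through 126 together, a character-by-character driver loop might look like the sketch below. It reuses the hypothetical MotionPacket and accelerations_to_coordinates helpers from the sketches above; fit_projection_plane and project_to_plane are sketched after the worked example below, and recognize_2d stands in for any conventional 2D handwriting recognizer.

```python
def recognize_characters(packets, recognize_2d):
    """Accumulate 3D motion data until a control signal marks the end of a character
    (step 112), then run steps 106-126 for that character."""
    results, pending = [], []
    for packet in packets:
        pending.extend((s.ax, s.ay, s.az) for s in packet.samples)
        if packet.end_of_character:                      # control signal received (step 112)
            coords = accelerations_to_coordinates(pending, packet.sampling_rate_hz)  # step 106
            plane = fit_projection_plane(coords)         # find the proper 2D projection plane
            coords_2d = project_to_plane(coords, plane)  # map the 3D track onto the plane (step 122)
            results.append(recognize_2d(coords_2d))      # conventional 2D recognition (step 126)
            pending = []                                 # start accumulating the next character
    return results
```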
According to geometric principles, a proper 2D projection plane is the plane for which the sum of the squared distances from the sampling points is minimal. Assume the coordinates of n sampling points are known as follows: (x1, y1, z1), (x2, y2, z2), ..., (xn, yn, zn), and the equation of the plane is Ax + By + Cz + D = 0, where A² + B² + C² ≠ 0. Now it is necessary to determine the values of A, B, C and D. The distance from one point (xi, yi, zi) to the plane is given as di = |Axi + Byi + Czi + D| / √(A² + B² + C²), so the sum of the squared distances, represented by F(A, B, C, D), is expressed as:
F(A, B, C, D) = d1² + d2² + ... + dn² = [(Ax1 + By1 + Cz1 + D)² + (Ax2 + By2 + Cz2 + D)² + ... + (Axn + Byn + Czn + D)²] / (A² + B² + C²).
The values of A, B, C and D can be determined using the Lagrange multiplication method as follows, which is described in Mathematics Analysis by Ouyang Guangzhong, published by Fudan University Press in 1999 in China, which is hereby incorporated by reference. Under the constraint A² + B² + C² = 1, F(A, B, C, D) = F'(A, B, C, D) = (Ax1 + By1 + Cz1 + D)² + (Ax2 + By2 + Cz2 + D)² + ... + (Axn + Byn + Czn + D)². From this equation, the following function is derived:
G(A, B, C, D) = F'(A, B, C, D) + λ(A² + B² + C² - 1),
where λ is a Lagrange multiplier, which is a constant. The partial derivatives of G(A, B, C, D) with respect to A, B, C and D are now set to zero:
∂G(A, B, C, D)/∂A = 0, ∂G(A, B, C, D)/∂B = 0, ∂G(A, B, C, D)/∂C = 0, ∂G(A, B, C, D)/∂D = 0.
From the above four equations, the following equations are obtained:
Σ xi (Axi + Byi + Czi + D) + λA = 0    (1)
Σ yi (Axi + Byi + Czi + D) + λB = 0    (2)
Σ zi (Axi + Byi + Czi + D) + λC = 0    (3)
A Σ xi + B Σ yi + C Σ zi + nD = 0    (4)
A² + B² + C² = 1    (5)
where the sums are taken over i = 1, ..., n, and equation (4) can be rewritten as:
D = -(1/n)(A Σ xi + B Σ yi + C Σ zi).    (6)
By incorporating equation (6) into equations (1), (2) and (3), a homogeneous linear system in A, B and C is obtained.
Thus, from the above equations, the values of A, B, C and D can be obtained.
As an example, the following 3D coordinates for a total of 13 sampling points are obtained for a Chinese character:
{(0, 0, 0), (0.49, -1, 0.02), (1, 0, 0.03), (0, 0, 0.02), (2, -0.13, 0.01), (3.5, -0.14, 0), (3.76, -0.31, -0.01), (2.74, -1.26, 0.01), (1.38, -2.24, 0), (2.5, -2, 0.01), (1.746, 1, 0.02), (2, -1.5, 0.03), (1.876, -4.5, 0.02)}.
By using the Lagrange multiplication method described above, the following are obtained:
[  53.407    -23.6725    0.1911 ]
[ -23.6725    36.2195   -0.2084 ]
[   0.1911    -0.2084    0.0035 ]
A² + B² + C² = 1, and
D = -(1/n)(A Σ xi + B Σ yi + C Σ zi).
From the above equations, the values of A, B, C and D are determined as follows:
A = 0.0045, B = 0.0023, C = 0.9999, D = -0.01777.
Thus, the equation of the 2D projection plane is 0.0045x + 0.0023y + 0.9999z - 0.01777 = 0. From the equation of the projection plane, Ax + By + Cz + D = 0, and the equation of the line through a sampling point (xi, yi, zi) perpendicular to the projection plane,
(x - xi)/A = (y - yi)/B = (z - zi)/C,
the following equation (with corresponding expressions for yi' and zi') is derived for the foot of the perpendicular:
xi' = [(B² + C²)xi - A(Byi + Czi + D)] / (A² + B² + C²),
which can be used to obtain the corresponding 2D coordinates for each 3D sampling point. In this example, the following corresponding 2D coordinates are obtained: {(0.0001, 0), (0.49, -1), (1, -2.0001), (0, 0), (2, -0.13), (3.5001, -0.14), (3.7601, -0.31), (2.7401, -1.26), (1.3801, 2.24), (2.5001, -2), (1.746, 1), (2, -1.5001), (1.876, -4.5001)}. Based on these 2D coordinates, the corresponding 2D image is projected onto the 2D projection plane. The result of this Chinese character is shown in FIG. 3A. The final result is generated by performing a conventional 2D handwriting recognition process, as shown in FIG. 3B.
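As a numerical counterpart to the Lagrange derivation above, the following NumPy sketch fits the projection plane and projects the sampling points onto it. Taking (A, B, C) as the eigenvector of the centered scatter matrix with the smallest eigenvalue is equivalent to solving the Lagrange conditions, with D then fixed by equation (6). The in-plane basis used to read off the 2D coordinates is an arbitrary choice of this sketch, so its output will generally differ from the example coordinates above by a rotation of the 2D image; for the 13 example points, whose Z values are all close to zero, the fitted normal should be dominated by its Z component, consistent with the coefficients determined above.

```python
import numpy as np

def fit_projection_plane(points):
    """Fit the plane Ax + By + Cz + D = 0 minimizing the sum of squared distances."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    centered = pts - centroid
    eigvals, eigvecs = np.linalg.eigh(centered.T @ centered)
    a, b, c = eigvecs[:, 0]                        # eigenvector of the smallest eigenvalue
    d = -(a * centroid[0] + b * centroid[1] + c * centroid[2])   # equation (6)
    return float(a), float(b), float(c), float(d)

def project_to_plane(points, plane):
    """Project each 3D sampling point onto the plane and return 2D coordinates."""
    a, b, c, d = plane
    n = np.array([a, b, c], dtype=float)
    scale = np.linalg.norm(n)
    n, d = n / scale, d / scale                    # normalize so that |n| = 1
    # Build an (arbitrary) orthonormal basis of the plane for the 2D coordinates.
    helper = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(n, helper)
    u /= np.linalg.norm(u)
    v = np.cross(n, u)
    pts = np.asarray(points, dtype=float)
    feet = pts - np.outer(pts @ n + d, n)          # feet of the perpendiculars on the plane
    return np.column_stack((feet @ u, feet @ v))
```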
In the above, the Lagrange multiplication method is used to obtain the values of A, B, C and D. Other methods, such as a linear regression method, may also be used.
FIG. 4 shows an external design of a 3D handwriting input device 200 according to one embodiment of the invention. As shown in FIG. 4, input device 200 includes a housing 210 that contains the electronics parts of the device (such as a 3D motion detection sensor IC chip), a control button 212 for allowing a user to input a control signal to indicate completion of writing a word or character, and a band 220 for mounting input device 200 on the user's finger.
FIG. 5 illustrates how input device 200 may be mounted. In FIG. 5, device 200 is mounted on a finger 232 of a user's hand 230. By mounting it on the finger, the user can simply move the finger to write any word or character in a 3D space. The 3D motion data will be wirelessly transmitted to a recognition device for handwriting recognition.
According to the invention, the input device and the recognition device can be integrated together as a single unit (e.g., a pen) that operates in the same manner as described above. The final recognition results will be transmitted to an intelligent apparatus such as a PDA, a laptop computer, a PC, etc.
The invention can also be used on a 2D plane if the user so chooses. In such a case, the coordinates calculated will be for a 2D plane and the step of mapping the 3D tracks onto a 2D plane is omitted.
While the invention has been described in conjunction with specific embodiments, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art in light of the foregoing description. Accordingly, it is intended to embrace all such alternatives, modifications and variations as fall within the spirit and scope of the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A handwriting recognition system, comprising:
an input device including a three-dimensional (3D) motion detection sensor that is configured to generate 3D motion data in response to a 3D motion; and
a recognition device, in communication with the input device, that is configured to receive the 3D motion data and derive corresponding two-dimensional (2D) images for handwriting recognition, based on the 3D motion data.
2. The system of claim 1, wherein the recognition device includes means for performing 2D handwriting recognition based on the 2D images.
3. The system of claim 1, wherein the recognition device includes:
means for calculating corresponding 3D coordinates based on the 3D motion data;
means for constructing corresponding 3D tracks based on the 3D coordinates; and
means for deriving the corresponding 2D images from the 3D tracks.
4. The system of claim 3, wherein the deriving means includes means for mapping the 3D tracks onto a 2D plane for deriving the 2D images for handwriting recognition.
5. The system of claim 3, wherein the recognition device includes means for performing 2D handwriting recognition based on the 2D images.
6. The system of claim 4, wherein the calculating means calculates the corresponding 3D coordinates of each sampling point based on the 3D motion data and a selected sampling rate.
7. The system of claim 6, wherein the recognition device further includes means for dynamically adjusting the sampling rate based on the speed of the motion.
8. The system of claim 6, wherein the deriving means includes means for deriving the 2D plane as a plane for which the sum of the squared distances from the sampling points is minimal.
9. The system of claim 3, wherein the input device further includes a control circuit, responsive to a user's command, that is configured to generate a control signal for transmitting to the recognition device to indicate completion of writing a word or a character.
10. The system of claim 3, wherein the motion detection sensor measures acceleration of the 3D motion in X, Y and Z axial directions to generate the 3D motion data.
11. The system of claim 5, further comprising an output device for displaying final results of the handwriting recognition.
12. The system of claim 1, wherein the input device further includes a control circuit, responsive to a user's command, that is configured to generate a control signal for transmitting to the recognition device to indicate completion of writing a word or a character.
13. The system of claim 1, wherein the motion detection sensor measures acceleration of the 3D motion in X, Y and Z axial directions to generate the 3D motion data.
14. The system of claim 1, wherein the input device wirelessly transmits the 3D motion data to the recognition device.
15. The system of claim 1, wherein the recognition device includes means for performing 2D handwriting recognition based on the 2D images.
16. A computing system, comprising:
a memory;
an input device including a three-dimensional (3D) motion detection sensor that is configured to generate 3D motion data in response to a 3D motion; and a recognition device, operably coupled to the memory and in communication with the input device, that is configured to receive the 3D motion data and derive corresponding two-dimensional (2D) images for handwriting recognition, based on the 3D motion data.
17. The system of claim 16, wherein the recognition device includes means for performing 2D handwriting recognition based on the 2D images.
18. The system of claim 16, wherein the recognition device includes:
means for calculating corresponding 3D coordinates based on the 3D motion data;
means for constructing corresponding 3D tracks based on the 3D coordinates; and
means for deriving the corresponding 2D images from the 3D tracks.
19. The system of claim 18, wherein the deriving means includes means for mapping the 3D tracks onto a 2D plane for deriving the 2D images for handwriting recognition.
20. A handwriting recognition method, comprising the steps of:
generating 3D motion data in response to a 3D motion; and deriving corresponding two-dimensional (2D) images for handwriting recognition based on the 3D motion data.
21. The method of claim 20, further comprising the step of performing 2D handwriting recognition based on the 2D images.
22. The method of claim 20, further comprising the steps of:
calculating corresponding 3D coordinates based on the 3D motion data;
constructing corresponding 3D tracks based on the 3D coordinates; and
deriving the corresponding 2D images from the 3D tracks.
23. The method of claim 22, wherein the step of deriving includes mapping the 3D tracks onto a 2D plane for deriving the 2D images for handwriting recognition.
24. The method of claim 22, further comprising a step of performing a 2D handwriting recognition based on the 2D images.
25. The method of claim 23, wherein the corresponding 3D coordinates of each sampling point are calculated based on the 3D motion data and a selected sampling rate.
26. The method of claim 25, further comprising a step of dynamically adjusting the sampling rate based on the speed of the motion.
27. The method of claim 25, wherein the step of deriving further includes the step of deriving the 2D plane as a plane for which the sum of the squared distances from the sampling points is minimal.
28. The method of claim 22, further comprising the step of generating an indication, based on a user's command, to indicate completion of writing a word or a character.
29. The method of claim 22, further comprising the step of measuring acceleration of the 3D motion in X, Y and Z axial directions and wherein the 3D motion data are generated based on the acceleration of the 3D motion in the X, Y and Z axial directions.
30. The method of claim 24, further comprising a step of displaying final results of the handwriting recognition.
31. The method of claim 20, further comprising the steps of wirelessly transmitting the 3D motion data and wirelessly receiving the 3D motion data for calculating the 3D coordinates.
32. The method of claim 20, further comprising the step of measuring acceleration of the 3D motion in X, Y and Z axial directions and wherein the 3D motion data are generated based on the acceleration of the 3D motion in the X, Y and Z axial directions.
33. The method of claim 20, further comprising the steps of wirelessly transmitting the 3D motion data and wirelessly receiving the 3D motion data for calculating the 3D coordinates.
34. The method of claim 20, further comprising the step of performing 2D handwriting recognition based on the 2D images.
PCT/IB2003/004102 2002-09-28 2003-09-18 Method and system for three-dimensional handwriting recognition WO2004029866A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
AU2003260877A AU2003260877A1 (en) 2002-09-28 2003-09-18 Method and system for three-dimensional handwriting recognition
JP2004539329A JP2006500680A (en) 2002-09-28 2003-09-18 Method and system for 3D handwriting recognition
US10/528,938 US8150162B2 (en) 2002-09-28 2003-09-18 Method and system for three-dimensional handwriting recognition
EP03798309.5A EP1546993B1 (en) 2002-09-28 2003-09-18 Method and system for three-dimensional handwriting recognition

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN02144248.7 2002-09-28
CNB021442487A CN100377043C (en) 2002-09-28 2002-09-28 Three-dimensional hand-written identification process and system thereof

Publications (1)

Publication Number Publication Date
WO2004029866A1 true WO2004029866A1 (en) 2004-04-08

Family

ID=32034737

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2003/004102 WO2004029866A1 (en) 2002-09-28 2003-09-18 Method and system for three-dimensional handwriting recognition

Country Status (6)

Country Link
US (1) US8150162B2 (en)
EP (1) EP1546993B1 (en)
JP (1) JP2006500680A (en)
CN (1) CN100377043C (en)
AU (1) AU2003260877A1 (en)
WO (1) WO2004029866A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004059569A1 (en) * 2002-12-26 2004-07-15 Koninklijke Philips Electronics N.V. Method and system for three-dimentional handwriting recognition
EP1460577A2 (en) * 2003-03-17 2004-09-22 Samsung Electronics Co., Ltd. Motion detection for handwriting recognition
WO2005034023A1 (en) * 2003-09-26 2005-04-14 Ostecs, Inc. Spatial chirographic sign reader
DE102004008253A1 (en) * 2004-02-19 2005-09-15 Siemens Ag Method for estimating a virtual writing level
US7317450B2 (en) 2003-09-26 2008-01-08 Khomo Malome T Spatial chirographic sign reader
CN100405278C (en) * 2005-09-14 2008-07-23 株式会社东芝 Character reader, character reading method, and character reading program
US8036465B2 (en) 2003-09-26 2011-10-11 Khomo Malome T Method of text interaction using chirographic techniques
JP2012208602A (en) * 2011-03-29 2012-10-25 Kyocera Corp Information recognition device
CN104571603A (en) * 2013-10-10 2015-04-29 北京壹人壹本信息科技有限公司 Midair handwriting system and handwriting pen

Families Citing this family (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4274997B2 (en) * 2004-05-06 2009-06-10 アルパイン株式会社 Operation input device and operation input method
KR100631616B1 (en) * 2005-01-08 2006-10-11 엘지전자 주식회사 3D location information input device
US20090115744A1 (en) * 2007-11-06 2009-05-07 Innovative Material Solutions, Inc. Electronic freeboard writing system
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US8166421B2 (en) * 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US8634645B2 (en) * 2008-03-28 2014-01-21 Smart Technologies Ulc Method and tool for recognizing a hand-drawn table
US20090300671A1 (en) * 2008-05-30 2009-12-03 At&T Knowledge Ventures, L.P. Controlling Access to Multimedia Content
CN101751200B (en) * 2008-12-09 2012-01-11 北京三星通信技术研究有限公司 Space input method for mobile terminal and implementation device thereof
US8378967B2 (en) * 2009-02-27 2013-02-19 Denso Corporation Wearable electrical apparatus
JP4683148B2 (en) * 2009-02-27 2011-05-11 株式会社デンソー Electrical equipment
CN102163119A (en) * 2010-02-23 2011-08-24 中兴通讯股份有限公司 Single-hand inputting method and device
US8787663B2 (en) * 2010-03-01 2014-07-22 Primesense Ltd. Tracking body parts by combined color image and depth processing
JP5791131B2 (en) 2010-07-20 2015-10-07 アップル インコーポレイテッド Interactive reality extension for natural interactions
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
JP4877621B1 (en) * 2010-07-30 2012-02-15 カシオ計算機株式会社 Character recognition device and program
CN102346859B (en) * 2010-07-26 2013-10-09 卡西欧计算机株式会社 Character recognition device
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
CN102004623B (en) * 2010-11-29 2013-02-27 深圳市九洲电器有限公司 Three-dimensional image display device and method
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US9285874B2 (en) 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
CN102650905A (en) * 2011-02-23 2012-08-29 西安龙飞软件有限公司 Method utilizing gesture operation in three-dimensional space to realize word input of mobile phone
CN102135821B (en) * 2011-03-08 2013-03-06 中国科学技术大学 Handwriting pen and graphic restoration system
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
CN102915171A (en) * 2011-08-04 2013-02-06 王振兴 Moving trajectory generation method
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
CN102306060B (en) * 2011-08-26 2014-07-09 深圳市优利麦克科技开发有限公司 Input method and system of mobile equipment
CN102262732A (en) * 2011-08-26 2011-11-30 信源通科技(深圳)有限公司 Character recognition method and system
CN102253718B (en) * 2011-08-31 2016-07-06 由田信息技术(上海)有限公司 Three-dimensional hand-written inputting method
CN103076896B (en) * 2011-10-26 2016-08-17 联想(北京)有限公司 A kind of input equipment and terminal and method thereof and a kind of system
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
CN102662465A (en) * 2012-03-26 2012-09-12 北京国铁华晨通信信息技术有限公司 Method and system for inputting visual character based on dynamic track
CN104246682B (en) 2012-03-26 2017-08-25 苹果公司 Enhanced virtual touchpad and touch-screen
CN102810008B (en) * 2012-05-16 2016-01-13 北京捷通华声语音技术有限公司 A kind of air input, method and input collecting device in the air
US9898186B2 (en) * 2012-07-13 2018-02-20 Samsung Electronics Co., Ltd. Portable terminal using touch pen and handwriting input method using the same
CN104714650B (en) * 2015-04-02 2017-11-24 三星电子(中国)研发中心 A kind of data inputting method and device
CN107179839A (en) * 2017-05-23 2017-09-19 三星电子(中国)研发中心 Information output method, device and equipment for terminal
CN107992792A (en) * 2017-10-16 2018-05-04 华南理工大学 A kind of aerial handwritten Chinese character recognition system and method based on acceleration transducer
US10878563B2 (en) * 2018-03-05 2020-12-29 Rion Co., Ltd. Three-dimensional shape data production method and three-dimensional shape data production system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0628096B2 (en) 1984-07-02 1994-04-13 東レ株式会社 Polyester film for magnetic recording media
WO1995021436A1 (en) * 1994-02-04 1995-08-10 Baron Motion Communications, Inc. Improved information input apparatus
US6081261A (en) * 1995-11-01 2000-06-27 Ricoh Corporation Manual entry interactive paper and electronic document handling and processing system

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2691101B2 (en) * 1992-03-05 1997-12-17 インターナショナル・ビジネス・マシーンズ・コーポレイション Handwriting input method and input device
JP3450355B2 (en) * 1992-07-07 2003-09-22 株式会社東芝 Spatial handwritten character figure input device and method
JP3456735B2 (en) * 1994-02-21 2003-10-14 沖電気工業株式会社 Information input device and presentation system using the same
EP0697679A3 (en) * 1994-08-12 1998-07-01 Dassault Systemes of America Computerized drawing method
JPH08320756A (en) * 1995-05-24 1996-12-03 Sharp Corp Electronic equipment with data input means
JP3006545B2 (en) * 1997-06-09 2000-02-07 日本電気株式会社 Online character recognition device
US6212296B1 (en) * 1997-12-23 2001-04-03 Ricoh Company, Ltd. Method and apparatus for transforming sensor signals into graphical images
US6456749B1 (en) * 1998-02-27 2002-09-24 Carnegie Mellon University Handheld apparatus for recognition of writing, for remote communication, and for user defined input templates
JP3627791B2 (en) * 1998-08-10 2005-03-09 富士通株式会社 Other terminal operation device
US7707082B1 (en) * 1999-05-25 2010-04-27 Silverbrook Research Pty Ltd Method and system for bill management
US7158675B2 (en) * 2002-05-14 2007-01-02 Microsoft Corporation Interfacing with ink
US7302488B2 (en) * 2002-06-28 2007-11-27 Microsoft Corporation Parental controls customization and notification

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0628096B2 (en) 1984-07-02 1994-04-13 東レ株式会社 Polyester film for magnetic recording media
WO1995021436A1 (en) * 1994-02-04 1995-08-10 Baron Motion Communications, Inc. Improved information input apparatus
US6081261A (en) * 1995-11-01 2000-06-27 Ricoh Corporation Manual entry interactive paper and electronic document handling and processing system

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004059569A1 (en) * 2002-12-26 2004-07-15 Koninklijke Philips Electronics N.V. Method and system for three-dimentional handwriting recognition
EP1460577A2 (en) * 2003-03-17 2004-09-22 Samsung Electronics Co., Ltd. Motion detection for handwriting recognition
EP1460577A3 (en) * 2003-03-17 2005-12-07 Samsung Electronics Co., Ltd. Motion detection for handwriting recognition
US7580572B2 (en) 2003-03-17 2009-08-25 Samsung Electronics Co., Ltd. Spatial motion recognition system and method using a virtual handwriting plane
WO2005034023A1 (en) * 2003-09-26 2005-04-14 Ostecs, Inc. Spatial chirographic sign reader
US7317450B2 (en) 2003-09-26 2008-01-08 Khomo Malome T Spatial chirographic sign reader
US8036465B2 (en) 2003-09-26 2011-10-11 Khomo Malome T Method of text interaction using chirographic techniques
DE102004008253A1 (en) * 2004-02-19 2005-09-15 Siemens Ag Method for estimating a virtual writing level
CN100405278C (en) * 2005-09-14 2008-07-23 株式会社东芝 Character reader, character reading method, and character reading program
JP2012208602A (en) * 2011-03-29 2012-10-25 Kyocera Corp Information recognition device
CN104571603A (en) * 2013-10-10 2015-04-29 北京壹人壹本信息科技有限公司 Midair handwriting system and handwriting pen
CN104571603B (en) * 2013-10-10 2018-02-27 北京壹人壹本信息科技有限公司 A kind of aerial hand writing system and writing pencil

Also Published As

Publication number Publication date
EP1546993B1 (en) 2015-07-15
AU2003260877A1 (en) 2004-04-19
US8150162B2 (en) 2012-04-03
EP1546993A1 (en) 2005-06-29
CN100377043C (en) 2008-03-26
CN1485711A (en) 2004-03-31
US20060149737A1 (en) 2006-07-06
JP2006500680A (en) 2006-01-05

Similar Documents

Publication Publication Date Title
EP1546993B1 (en) Method and system for three-dimensional handwriting recognition
US20230384867A1 (en) Motion detecting system having multiple sensors
KR100465241B1 (en) Motion recognition system using a imaginary writing plane and method thereof
KR20050085897A (en) Method and system for three-dimentional handwriting recognition
KR100518824B1 (en) Motion recognition system capable of distinguishment a stroke for writing motion and method thereof
KR100630806B1 (en) Command input method using motion recognition device
US20060125789A1 (en) Contactless input device
US10042438B2 (en) Systems and methods for text entry
US20060028457A1 (en) Stylus-Based Computer Input System
EP2703978B1 (en) Apparatus for measuring coordinates and control method thereof
US10082885B2 (en) Information input and output apparatus and information input and output method
KR102051585B1 (en) An electronic device and method having a function of hand writing using multi-touch
CN110378318B (en) Character recognition method and device, computer equipment and storage medium
TWI510966B (en) Input system and related method for an electronic device
CN112241491A (en) Recommended word display method and terminal equipment
KR20100075282A (en) Wireless apparatus and method for space touch sensing and screen apparatus using depth sensor
KR100736249B1 (en) Data inputting, transmitting and receiving device
CN110795016A (en) Display method and electronic equipment
JP6523509B1 (en) Game program, method, and information processing apparatus
JP2009151631A (en) Information processor, information processing method, and program
CN116246335A (en) Method of tracking augmented reality input gestures and system using the same
CN204731780U (en) Based on retouching the mobile translation device drawn in the air
KR20230105601A (en) Electronic apparatus and control method thereof
KR101564498B1 (en) External support system for mobile devices
JP2012155480A (en) Input device for portable information apparatus

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2003798309

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2004539329

Country of ref document: JP

WWP Wipo information: published in national office

Ref document number: 2003798309

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2006149737

Country of ref document: US

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 10528938

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 10528938

Country of ref document: US