CN102722240A - Text information input system, handwriting input device and text information input method - Google Patents


Info

Publication number
CN102722240A
CN102722240A (application CN2012101553224A)
Authority
CN
China
Prior art keywords
signal
action
mems sensor
information
writing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012101553224A
Other languages
Chinese (zh)
Inventor
程如中
王新安
闫桂珍
魏益群
赵勇
徐军帅
郭中洋
申凌
Current Assignee
PKU-HKUST Shenzhen-Hongkong Institution
Peking University Shenzhen Graduate School
Original Assignee
Peking University Shenzhen Graduate School
Priority date
Filing date
Publication date
Application filed by Peking University Shenzhen Graduate School filed Critical Peking University Shenzhen Graduate School
Priority to CN2012101553224A
Publication of CN102722240A
Legal status: Pending

Links

Images

Landscapes

  • Character Discrimination (AREA)

Abstract

The invention discloses a text information input system, a handwriting input device and a text information input method. The system includes: a micro-electromechanical system (MEMS) sensor that senses the motion of a user writing text and outputs corresponding signals according to the motion information, the signals including original space vector group data that reflect the motion, comprising the spatial position, direction and writing-time information of each action; and a recognition processor that receives and analyzes the signals of the MEMS sensor and maps them into input text. By sensing hand motion through the sensor, the text information input system, handwriting input device and text information input method of the invention can not only replace other input methods within a certain range, but also make handheld devices and computers more human-friendly and promote innovation in new modes of human-computer interaction.

Description

Text information input system, handwriting input device and method
Technical field
The present application relates to an input device, and more particularly to a text information input system and method.
Background technology
Traditional text information input systems, such as computers and handheld terminals (e.g. mobile phones), normally use electronic components such as a keyboard or touch screen to receive the user's input, compile it into corresponding signals, and display the result on the screen of the computer or handheld terminal, thereby realizing handwritten or keyboard input of characters, numerals and commands.
Whether keyboard or touch screen, these components are usually integrated with a host computer or handheld terminal, and during input the user's eyes must constantly watch the keyboard or the screen and cannot be freed from them. At the same time, such traditional text input technology inevitably confines the user to the display screen of the computer or handheld terminal.
For this reason, researchers have been working on systems that allow information to be input freely in various settings and spaces, but most of these are based on vision-oriented image processing and require complex algorithms.
Summary of the invention
The present application provides a text information input system and method that can realize screen-free text input in any setting and enrich the user's writing experience.
According to one aspect of the application, a text information input system is provided, comprising:
a MEMS (micro-electromechanical system) sensor for sensing the motion of a user writing text and outputting corresponding signals according to the motion information, the signals including original space vector group data reflecting the motion, the original space vector group data including the spatial position, direction and writing-time information of each action;
a recognition processor for receiving and analyzing the signals of the MEMS sensor and mapping the signals into input text.
The recognition processor comprises: a receiving module for receiving the signals of the MEMS sensor; a vector generation module for converting the received signals into recognizable vector character graphics; and a matching module for matching the vector character graphics and converting them into text information.
The MEMS sensor is installed in a device wearable on the user's finger or in a handheld terminal. The input text may include Chinese, Japanese, English, numerals and/or symbols.
According to another aspect of the application, a text information input method is provided, comprising:
using a MEMS sensor to sense the motion of a user writing text and output corresponding signals according to the motion information, the signals including original space vector group data reflecting the motion, the original space vector group data including the spatial position, direction and writing-time information of each action;
analyzing the signals and mapping them into input text.
The method may further comprise: realizing control commands related to the writing by defining different action sequences within different time limits.
Outputting corresponding signals according to the motion information comprises: the MEMS sensor initializes a coordinate system, measures the acceleration and angular velocity of the user's writing motion according to the motion information, and computes the original space vector group data of the writing; the original space vector group data include the spatial position, direction and writing-time information of each action, and may further include the force of the writing action.
Analyzing the signals and mapping them into input text comprises: analyzing the original space vector group data, generating recognizable vector character graphics, matching the vector character graphics, and converting them into text information.
The input text may include Chinese, Japanese, English, numerals and/or symbols.
Also disclosed herein is a handwriting input device comprising a carrier and a MEMS sensor attached to the carrier. The MEMS sensor senses the motion of a user writing text and outputs corresponding signals according to the motion information; the signals include original space vector group data reflecting the motion, comprising the spatial position, direction and writing-time information of each action.
In one embodiment, the carrier is a finger ring wearable on the user's finger or a handheld terminal, and the MEMS sensor is installed inside or on the surface of the finger ring, a specially shaped stylus, or the handheld terminal.
The beneficial effect of the application is that, by sensing hand motion through the sensor, it can not only replace other input methods within a certain range, but also make handheld devices and computers more human-friendly and promote innovation in new modes of human-computer interaction.
Description of the Drawings
Fig. 1 is a structural schematic of the text information input system of one embodiment of the application;
Fig. 2 is a schematic diagram of the use of the text information input system of one embodiment of the application;
Fig. 3 is a schematic diagram of the use of the text information input system of another embodiment of the application;
Fig. 4 is a schematic diagram of the basic gesture elements of the text information input system of one embodiment when strokes are written to form characters;
Fig. 5 is a schematic diagram of how the text information input system of one embodiment activates or issues control commands;
Fig. 6a is a flow chart of the text information input method of one embodiment of the application;
Fig. 6b is a flow chart of matching characters from the original space vector group data in one embodiment of the application;
Fig. 7 is a flow chart of the text information input method of one embodiment of the application;
Fig. 8 is a flow chart of the processor side in the text information input method of one embodiment of the application.
Detailed Description
The application is further described below through embodiments in combination with the accompanying drawings.
The design idea of the embodiments of the application is: a MEMS (Micro-Electro-Mechanical System) sensor, or a gesture device equipped with a MEMS sensor, senses hand motion and outputs corresponding signals; the related text information is recognized by algorithms and stored on the corresponding device. In this way, screen-free text input can be realized in any setting, which not only can replace other input methods within a certain range, but also makes computers and handheld devices more human-friendly and enriches the user's writing experience.
As shown in Fig. 1, in one embodiment of the application the text information input system comprises a MEMS sensor and a recognition processor. The MEMS sensor senses the motion of the user writing text and outputs corresponding signals according to the motion information; the recognition processor receives and analyzes the signals of the MEMS sensor and maps them into input text. The recognition processor comprises: a receiving module for receiving the signals output by the MEMS sensor; a vector generation module for converting the received signals into recognizable vector character graphics; and a matching module for matching the vector character graphics and converting them into text information.
The MEMS sensor comprises a MEMS micro-accelerometer and a gyroscope. Any micro-accelerometer that can sense motion acceleration and direction, and any gyroscope that can sense spatial rotation angle and angular acceleration, may be used, including three-axis or two-axis micro-accelerometers and three-axis or two-axis gyroscopes. The electrical signal output according to the measurement results of the micro-accelerometer and gyroscope directly reflects the finger's direction of motion, spatial position, writing time and other information, and can be used to analyze which stroke each unit action represents and which Chinese character or other symbol each action group corresponds to. That is, the micro-accelerometer and gyroscope perceive the acceleration and angular velocity of the current writing action, and from the acceleration and angular velocity the position and motion-direction information of the input can be completed. This position and motion-direction information forms the stroke elements of the glyph: as long as the position of the first stroke is recorded, the other key positions and stroke elements of the writing process can be recorded as well, completing the input of a character.
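The double integration from measured acceleration to pen position described above can be sketched as follows — a minimal, idealized dead-reckoning loop in Python, assuming bias-free accelerometer samples already projected onto the writing plane (a real system would also fuse the gyroscope output to track orientation and correct drift):

```python
def dead_reckon(samples, dt):
    """Integrate accelerometer readings into a 2-D pen trajectory.

    samples: list of (ax, ay) accelerations in the writing plane, m/s^2
    dt: sampling interval in seconds
    Returns the list of (x, y) positions, starting at the origin.
    """
    x = y = vx = vy = 0.0
    path = [(x, y)]
    for ax, ay in samples:
        # First integration: acceleration -> velocity
        vx += ax * dt
        vy += ay * dt
        # Second integration: velocity -> position
        x += vx * dt
        y += vy * dt
        path.append((x, y))
    return path

# A constant rightward acceleration traces a horizontal stroke.
stroke = dead_reckon([(1.0, 0.0)] * 5, dt=0.1)
```

In practice the positions recovered this way would feed the vector generation module as the "key positions" of the stroke.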
To make input convenient for the user, one embodiment discloses a handwriting input device comprising a carrier and a MEMS sensor attached to the carrier. The MEMS sensor senses the motion of the user writing text and outputs corresponding signals according to the motion information; the signals include original space vector group data reflecting the motion, comprising the spatial position, direction and writing-time information of each action.
The MEMS sensor comprises a micro-accelerometer, a gyroscope and a signal generation unit. The micro-accelerometer and the gyroscope are each connected to the signal generation unit: the micro-accelerometer measures motion acceleration and direction and outputs them to the signal generation unit; the gyroscope measures spatial rotation angle and angular acceleration, initializes a coordinate system, and outputs to the signal generation unit. The signal generation unit receives the outputs of the micro-accelerometer and gyroscope and generates the coordinate-system initialization signal and the original space vector group data corresponding to each writing action.
In one embodiment, the MEMS sensor is installed in a device wearable on the user's finger. As shown in Fig. 2, a gesture device equipped with the MEMS sensor (for example a finger ring 100) is worn on the user's fingertip. The ring 100 may be annular, or a band that can be formed into a ring; the MEMS sensor is installed inside or on the surface of the ring, either directly or through a housing (for example, the MEMS sensor is placed in a housing that is fixed inside or on the surface of the ring). As the finger wearing the handwriting input device performs writing motions, the sensor senses the finger's movement and outputs corresponding electrical signals, which are transmitted by wire or wirelessly to a recognition processor located on a computer or handheld device (such as a mobile phone or PDA) for analysis; after signal processing and pattern recognition, the recognized text is stored on that device. When the electrical signals generated from the user's motion are to be transmitted wirelessly to the computer or handheld device, the handwriting input device further comprises a data transmission module, which is a wireless communication module that packages the electrical signals generated by the MEMS sensor into packets conforming to a wireless transmission protocol and sends them to the receiving end (for example a computer, mobile phone or PDA). The wireless transmission mode may be Wi-Fi, Bluetooth, infrared or 3G communication, among others. The specific sensor sensitivity can be set by those skilled in the art according to the sensor parameters; the embodiments of the application do not limit this. The advantage of this embodiment is that the sensing end containing the sensor is convenient to wear — it is simply worn on the fingertip and requires no extra electrodes. In this embodiment, the output electrical signal undergoes A/D conversion at the receiving end (the computer or handheld device) before signal processing and pattern recognition; in other embodiments, A/D conversion may be performed first on the MEMS sensor side, converting the electrical signal into a digital signal before sending it to the receiving end, which then performs signal processing and pattern recognition to identify the text.
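The patent does not specify a frame format for the data transmission module, so the following Python sketch of its packaging step uses a hypothetical layout — magic word, sequence number and sample count in a little-endian header, followed by raw accelerometer/gyroscope samples as six 32-bit floats each:

```python
import struct

def pack_sample_packet(seq, samples):
    """Frame digitized MEMS samples for wireless transmission.

    Layout (hypothetical): 2-byte magic, 2-byte sequence number,
    2-byte sample count, then 6 floats (ax, ay, az, gx, gy, gz)
    per sample, all little-endian.
    """
    header = struct.pack('<HHH', 0xA55A, seq & 0xFFFF, len(samples))
    body = b''.join(struct.pack('<6f', *s) for s in samples)
    return header + body

def unpack_sample_packet(packet):
    """Inverse of pack_sample_packet, as the receiving end would run it."""
    magic, seq, count = struct.unpack_from('<HHH', packet, 0)
    assert magic == 0xA55A, "bad frame"
    samples = [struct.unpack_from('<6f', packet, 6 + 24 * i)
               for i in range(count)]
    return seq, samples

pkt = pack_sample_packet(7, [(1.0, 0.0, 0.5, 0.25, -1.0, 2.0)])
seq, out = unpack_sample_packet(pkt)
```

A real module would wrap such frames in the chosen transport (Wi-Fi, Bluetooth, etc.) and add integrity checking.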
In another embodiment, the carrier is a handheld terminal 200, and the MEMS sensor is installed inside or on the surface of the handheld terminal. The handheld terminal can be made in various shapes; it may also be a mouse or a specially shaped stylus. As shown in Fig. 3, the user holds the handheld terminal, the MEMS sensor inside it senses the hand motion, and the signals are then transmitted wirelessly to a recognition processor on a computer or another handheld device for analysis. In a variant of this embodiment, the handwriting input device further comprises a processor; the MEMS sensor is connected to the processor and outputs the generated electrical signals to it. For example, the recognition processor of the system may be placed in the handheld terminal held by the user, which analyzes the signals and performs text recognition directly, and then sends the recognized text information wirelessly to a computer or handheld device (such as a mobile phone or PDA).
In one embodiment, the MEMS sensor determines the position of the written text and sends the position signals to the processor side, which generates recognizable vectors through an inertial navigation algorithm and then completes vector matching through a pattern recognition algorithm. It will be appreciated that those skilled in the art may adopt any existing inertial navigation algorithm and pattern recognition algorithm to recognize the text; the application does not limit this.
Fig. 4 is a schematic diagram of the basic gesture elements used when strokes are written to form characters with the text information input system of the application. In particular, for stroke-based scripts such as Chinese characters and Japanese, the basic gesture elements to be recognized correspond, from left to right and top to bottom, to the basic Chinese strokes: horizontal, vertical, left-falling, right-falling, left hook, right hook, turn and rise. From the recorded space vector sequences of these gesture motions, the receiving end can then recognize the corresponding unit Chinese character through pattern recognition and similar algorithms.
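As an illustration of how the receiving end might bucket a stroke's net displacement into these basic gesture elements, here is a Python sketch covering four of the basic strokes; the angle bands and the screen-coordinate convention (+x right, +y down) are illustrative assumptions, not values from the patent:

```python
import math

def classify_stroke(dx, dy):
    """Map a stroke's net displacement to a basic gesture element.

    Categories follow four of the basic Chinese strokes named in the
    text: heng (horizontal), shu (vertical), pie (left-falling),
    na (right-falling). The angle bands are illustrative choices.
    """
    angle = math.degrees(math.atan2(dy, dx)) % 360
    if angle < 20 or angle >= 340:
        return 'heng'   # left-to-right horizontal
    if 70 <= angle < 110:
        return 'shu'    # top-to-bottom vertical
    if 110 <= angle < 160:
        return 'pie'    # left-falling (down and to the left)
    if 20 <= angle < 70:
        return 'na'     # right-falling (down and to the right)
    return 'other'      # hooks, turns, rises need richer features
```

A full recognizer would look at the whole trajectory, not just the endpoints, to separate hooks and turns from straight strokes.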
To realize control commands, different action sequences within different time limits can be used to issue commands such as "input begins" or "input ends". As shown in Fig. 5, 501 activates or issues a control command through a longitudinal action (for example, an action perpendicular to the writing plane). Alternatively, a hardware switch or a software control can be used to start or stop the writing input function. Gesture 502 is a "reverse horizontal" stroke that implements deletion during input: if the previous characters were written incorrectly, a number of 502 strokes deletes the corresponding number of characters. Similarly, 503 is a "reverse vertical" stroke that switches to the next document; 504 returns to the first stored document, equivalent to returning to the initial document, so that under blind virtual-input conditions the position within the documents can be confirmed without looking at a screen. When blind writing is loaded at system startup, 505 activates virtual input (if the program does not start with the system, i.e. is not in the process list of the mobile device, virtual input can be activated by entering the system interface and opening the program normally); and 506 ends virtual input and files the document. The gesture commands corresponding to these orders can be user-defined.
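The mapping from command gestures to actions could be sketched as a simple lookup table; the token names and command identifiers below are hypothetical stand-ins for gestures 502–506 of Fig. 5, and only the delete command is wired up in this sketch:

```python
# Hypothetical command table mirroring Fig. 5: gesture tokens
# (as produced by a stroke classifier) mapped to editor actions.
GESTURE_COMMANDS = {
    'reverse_heng': 'delete_prev_char',   # 502: erase last character
    'reverse_shu':  'next_document',      # 503: switch to next document
    'home':         'first_document',     # 504: return to first document
    'wake':         'activate_input',     # 505: enter virtual input mode
    'close':        'end_input',          # 506: end input and file it
}

def dispatch(gesture, buffer):
    """Apply a command gesture to the text buffer; gestures with no
    command entry are treated as ordinary writing and left alone."""
    cmd = GESTURE_COMMANDS.get(gesture)
    if cmd == 'delete_prev_char' and buffer:
        return buffer[:-1]
    return buffer
```

Repeating the 502 gesture then deletes one character per stroke, matching the "determined-by-number deletion" described above.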
The essential difference between the text information input system provided by the embodiments of the application and previous input systems is this: traditional keyboard or screen input requires a physical screen or keyboard, with hand and eye coordinated on it, whereas the embodiments of the application do away with screen and keyboard and realize "screen-free input" through gesture motion alone, in any environment. While writing, the hand can move anywhere in space and need not be confined to a particular location, so the eyes need not follow the hand's actions on a screen or keyboard and can be freed from them.
Based on the above text information input system, the embodiments of the application also provide a text information input method. As shown in Fig. 6a, it comprises the following steps:
Use the MEMS sensor to sense the motion of the user writing text and output corresponding signals according to the motion information. After entering writing mode, the MEMS sensor follows the user's finger motion and, from the measured acceleration and angular velocity, computes the original space vector group data of each motion, comprising at least the spatial position, direction and writing-time information of the action.
Analyze the signals output by the MEMS sensor and map them into input text: analyze the original space vector group data, generate recognizable vector character graphics, match the vector character graphics, and convert them into text information.
In one embodiment, the flow of analyzing the MEMS sensor output and mapping it into input text is shown in Fig. 6b and comprises:
Step S601: analyze the original space vector group data and, from the position and direction information in them, generate the vector graphic of the stroke corresponding to the action.
Step S602: according to the writing-time information in the original space vector group data, arrange the generated strokes into a stroke sequence in writing order, and compute the interval time between each pair of adjacent strokes in the sequence.
Step S603: compare the interval time with a set threshold. When the interval time is less than the threshold, execute step S604: assign the stroke to the current stroke combination and continue judging the intervals of subsequent strokes; as long as the interval remains below the threshold, keep assigning strokes to this combination, until an interval greater than or equal to the threshold occurs. When the interval time is greater than or equal to the threshold, the two adjacent strokes are considered to belong to different characters and are assigned to different stroke combinations: execute step S605, closing the previous stroke combination, beginning a new one, and assigning the stroke to the new combination. In other words, if the writing interval between adjacent strokes is less than the threshold, those strokes all belong to the same combination, i.e. to one character; when the interval between some pair of adjacent strokes is greater than or equal to the threshold, it marks the beginning of another character. The threshold can use the program's default setting or be set according to the user's writing habits.
Step S606: match a character to the completed stroke combination in the character library. During matching, a matching algorithm matches a character according to the strokes contained in the combination — one stroke combination corresponds to one character. Existing image matching algorithms can be used for auxiliary recognition and verification to improve accuracy.
Finally, infer the intended character and establish a one-to-one or one-to-many mapping with the corresponding character.
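Steps S601–S606 reduce to splitting a time-ordered stroke sequence wherever the inter-stroke gap reaches the threshold; a minimal Python sketch, with the 0.8 s threshold chosen arbitrarily for illustration:

```python
def group_strokes(strokes, threshold=0.8):
    """Split a time-ordered stroke sequence into per-character
    stroke combinations (steps S601-S606): consecutive strokes
    whose gap is below `threshold` seconds share a character.

    strokes: list of (start_time, end_time, stroke_label) tuples
    Returns a list of lists of stroke labels, one per character.
    """
    groups = []
    current = []
    prev_end = None
    for start, end, label in strokes:
        if prev_end is not None and start - prev_end >= threshold:
            groups.append(current)   # gap long enough: new character
            current = []
        current.append(label)
        prev_end = end
    if current:
        groups.append(current)       # close the final combination
    return groups

# Two quick strokes, a pause, then one more: two characters.
chars = group_strokes([(0.0, 0.1, 'heng'),
                       (0.2, 0.3, 'shu'),
                       (1.5, 1.6, 'heng')])
```

As the text notes, the threshold could instead be adapted to the user's writing rhythm rather than fixed.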
For the method for this handwriting input further is described, Fig. 7 and Fig. 8 also show a more detailed embodiment, and wherein, Fig. 7 is the flow process of Word message input method, and Fig. 8 shows the flow process that receiving end receives signal and is mapped as literal.
As shown in Figure 7, be example to import a Chinese character, a Chinese character comprises some strokes (being called group of strokes), comprises the steps:
Step S701: use the MEMS sensor to perform initialization work such as establishing a coordinate system.
In principle, describing the motion of an object requires first establishing a coordinate system. In this embodiment this is done from the signal sent by the micro-accelerometer: a particular special action makes the accelerometer send a start signal, the point at which this signal is sent is taken as the coordinate origin, and velocity, acceleration and spatial position are then computed from the initial conditions — a task performed jointly by the micro-accelerometer and the gyroscope. Here, the special action is the control-command action described above.
Step S702: through the control-command method described above — for example the longitudinal action used to activate or issue commands — determine whether the current input is finished. If so, end the input and output the relevant information; if not, go to step S703.
Step S703: sense the motion information. The motion information of the user writing text is sensed by the MEMS sensor; the specific sensor sensitivity can be set by those skilled in the art according to the sensor parameters, and the embodiments of the application do not limit this.
Step S704: signal processing.
As described above, in one embodiment the electrical signal obtained by the MEMS sensor can be A/D converted on the sensor side, and the converted signal sent to the receiving end (the processor side), where it is processed according to an inertial navigation algorithm to generate recognizable vectors. In another embodiment, the electrical signal obtained by the MEMS sensor is sent directly to the processor side, which performs signal processing such as A/D conversion on the received signal, determines the key positions of the writing process, and generates recognizable vectors.
Steps S705 and S706: perform unit-gesture stroke matching and stroke-group matching on the recognizable vectors, converting the key-position parameters into text. Existing pattern recognition algorithms can be used to generate the text information (step S707); the application does not limit this.
Step S708: finish the input of the unit character.
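The stroke-group matching of step S706 could, in the simplest case, be an exact lookup against a character library keyed by stroke sequences; the toy dictionary below is illustrative only (a real matcher would hold templates for thousands of glyphs and match fuzzily rather than exactly):

```python
# Toy stroke-group dictionary: stroke sequences mapped to characters.
STROKE_DICT = {
    ('heng',): '一',                    # one horizontal stroke
    ('heng', 'heng'): '二',             # two horizontals
    ('heng', 'heng', 'heng'): '三',     # three horizontals
    ('heng', 'shu'): '十',              # horizontal then vertical
}

def match_character(stroke_group):
    """Step S706 sketch: map a completed stroke combination to a
    character, returning a placeholder when no template matches."""
    return STROKE_DICT.get(tuple(stroke_group), '?')
```

In the described system, such a matcher would consume the per-character groups produced by the time-gap segmentation and emit the recognized text (step S707).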
Fig. 8 shows the flow by which the receiving end (the processor side) receives the signals and maps them into text, comprising the following steps:
Step S801: receive the MEMS sensor input signal; a hardware or software trigger can confirm that a MEMS sensor input signal is present and that text currently needs to be input.
Step S802: the device end activates the text input state. That is, steps S801 and S802 amount to an initialization process that wakes the receiving end and prepares it to receive data.
Step S803: receive the MEMS sensor data.
Steps S804, S805 and S806: these three steps are performed by the processor, which analyzes the received space vector groups, generates recognizable vector character graphics and stores them on the device end, or converts the vector graphics into reproducible text information before storing it on the device end.
At this point, the text information input is complete.
Of course, on the basis of the steps of Figs. 7 and 8, those skilled in the art may simply change, add or delete steps without departing from the spirit and principle of the application. For example, the signal analysis and recognition steps may all be placed on the peripheral that senses the writing action, so that the text is recognized in the peripheral and then sent to the device end for storage and other operations.
Compared with traditional input methods, which depend on video information and rely heavily on a screen, the text information input system and method of the application use no video information but recognize text from MEMS-based gesture motion information. They can effectively free information input from the screen in various settings and realize screen-free text input; they not only can replace other input methods within a certain range and make computers and handheld devices more human-friendly, but also promote innovation in new modes of human-computer interaction. Moreover, in one embodiment, the sensing end containing the MEMS sensor can be worn directly on the fingertip, which is convenient and requires no extra electrodes.
The above is a further detailed description of the application in combination with specific embodiments, and the specific implementation of the application is not limited to these descriptions. For those of ordinary skill in the art, simple deductions or substitutions may be made without departing from the concept of the application.

Claims (10)

1. a Word message input system is characterized in that, comprising:
MEMS sensor; Be used to respond to the action message of user's writing words; Export corresponding signal according to action message; Said signal comprises the luv space set of vectors data that reflect action message, and said luv space set of vectors data comprise spatial positional information, directional information and the writing time information of action;
Recognition processor is used to receive and analyze the signal of said MEMS sensor, with the literal of said signal map for input.
2. the system of claim 1; It is characterized in that; Said MEMS sensor comprises micro-acceleration gauge, gyroscope and signal generation unit; Said micro-acceleration gauge and gyroscope are connected respectively to the signal generation unit, and micro-acceleration gauge is used for telekinetic acceleration and direction and outputs to the signal generation unit, and gyroscope is used for telekinetic space corner and corner acceleration, coordinate system of initialization and outputs to the signal generation unit; Said signal generation unit receives the output of micro-acceleration gauge and gyroscope, generates corresponding signal.
3. according to claim 1 or claim 2 system is characterized in that said recognition processor comprises:
Receiver module is used to receive the signal of said MEMS sensor;
Vector generation module is used for the signal that receives is generated as discernible polar plot character and graphic;
Matching module is used to mate said polar plot character and graphic, is converted into Word message.
4. handwriting input device; It is characterized in that comprise carrier and MEMS sensor, said MEMS sensor is attached on the carrier; Said MEMS sensor is used to respond to the action message of user's writing words; Export corresponding signal according to action message, said signal comprises the luv space set of vectors data that reflect action message, and said luv space set of vectors data comprise spatial positional information, directional information and the writing time information of action.
5. handwriting input device as claimed in claim 4 is characterized in that, said carrier is for can wear finger ring or the handheld terminal on user's finger, and said MEMS sensor is installed in the inside or the surface of finger ring or handheld terminal.
6. The handwriting input device of claim 4 or 5, wherein the MEMS sensor comprises a micro-accelerometer, a gyroscope and a signal generation unit, the micro-accelerometer and the gyroscope each being connected to the signal generation unit; the micro-accelerometer measures the acceleration and direction of the motion and outputs them to the signal generation unit; the gyroscope initializes a coordinate system, measures the spatial rotation angle and angular acceleration of the motion, and outputs them to the signal generation unit; and the signal generation unit receives the outputs of the micro-accelerometer and the gyroscope and generates the corresponding signal.
7. A text information input method, comprising:
sensing motion information of a user's handwriting with a MEMS sensor and outputting a corresponding signal according to the motion information, wherein the signal comprises original space vector group data reflecting the motion information, and the original space vector group data comprise spatial position information, direction information and writing-time information of the motion; and
analyzing the signal and mapping the signal into input text.
8. The method of claim 7, further comprising: implementing control commands related to the handwriting by setting different action sequences within different time limits.
9. The method of claim 7 or 8, wherein outputting a corresponding signal according to the motion information comprises: the MEMS sensor initializing a coordinate system, measuring the acceleration and angular velocity of the motion while the user writes, and calculating the original space vector group data of the user's handwriting; and analyzing the signal and mapping the signal into input text comprises: analyzing the original space vector group data, generating recognizable vector-graphic characters, matching the vector-graphic characters, and converting them into text information.
10. The method of claim 9, wherein analyzing the signal and mapping the signal into input text specifically comprises:
analyzing the original space vector group data and generating the stroke corresponding to the motion;
arranging the generated strokes into a stroke sequence in order of writing time according to the writing-time information in the original space vector group data, and calculating the interval time between every two adjacent strokes in the sequence;
comparing each interval time with a set threshold, and grouping strokes whose interval time is less than the threshold into one stroke combination; and
matching each stroke combination to a corresponding character in a character library according to the strokes contained in the combination.
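The time-threshold segmentation and library matching of claim 10 can be sketched as follows. The stroke representation (start time, end time, stroke label) and the dictionary-style character library are illustrative assumptions introduced here, not the patent's data structures.

```python
def group_strokes(strokes, threshold):
    """Group timed strokes into candidate characters.

    strokes: list of (start_time, end_time, stroke_label) tuples, already
    sorted by writing time. Adjacent strokes whose pen-up interval is
    below `threshold` seconds are grouped into one stroke combination.
    """
    groups = []
    current = [strokes[0]]
    for prev, cur in zip(strokes, strokes[1:]):
        gap = cur[0] - prev[1]          # pen-up interval between strokes
        if gap < threshold:
            current.append(cur)
        else:
            groups.append(current)
            current = [cur]
    groups.append(current)
    return groups

def match_character(group, library):
    """Look up a group's stroke-label sequence in a character library
    (a plain dict here); return '?' when no entry matches."""
    key = tuple(s[2] for s in group)
    return library.get(key, "?")
```

For example, two horizontal strokes written 0.3 s apart followed, after a long pause, by a vertical stroke would yield two groups, each matched to a separate character.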
CN2012101553224A 2011-05-18 2012-05-18 Text information input system, handwriting input device and text information input method Pending CN102722240A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012101553224A CN102722240A (en) 2011-05-18 2012-05-18 Text information input system, handwriting input device and text information input method

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201110128829.6 2011-05-18
CN201110128829 2011-05-18
CN2012101553224A CN102722240A (en) 2011-05-18 2012-05-18 Text information input system, handwriting input device and text information input method

Publications (1)

Publication Number Publication Date
CN102722240A true CN102722240A (en) 2012-10-10

Family

ID=46948035

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012101553224A Pending CN102722240A (en) 2011-05-18 2012-05-18 Text information input system, handwriting input device and text information input method

Country Status (1)

Country Link
CN (1) CN102722240A (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102982187A (en) * 2013-01-04 2013-03-20 深圳市中兴移动通信有限公司 Lookup method and lookup system based on somatosensory identification character index
CN103425262A (en) * 2013-08-01 2013-12-04 广东小天才科技有限公司 Chinese character handwriting input method and device
CN103870185A (en) * 2012-12-17 2014-06-18 鸿富锦精密工业(武汉)有限公司 Character input system and method
CN103995585A (en) * 2014-05-04 2014-08-20 广东威创视讯科技股份有限公司 Gesture control device and method for display big wall
CN104793747A (en) * 2015-04-24 2015-07-22 百度在线网络技术(北京)有限公司 Method, device and system for inputting through wearable device
WO2016184076A1 (en) * 2015-10-19 2016-11-24 中兴通讯股份有限公司 Character output method and apparatus
CN106648148A (en) * 2016-12-14 2017-05-10 天津映之科技有限公司 Fingertip wearing type operation terminal used for man-machine interaction in field of computers
CN106908642A (en) * 2015-12-23 2017-06-30 苏州普源精电科技有限公司 A kind of probe, oscillograph, movement recognition system and method
CN107292084A (en) * 2017-05-09 2017-10-24 浙江大学 A kind of automatic transfer method of medicine prescription
CN107643840A (en) * 2017-09-21 2018-01-30 上海电机学院 Writing pencil word automatic identifying method and system based on inertia device
CN108089727A (en) * 2016-06-12 2018-05-29 苹果公司 For the touch keypad of screen
CN112328156A (en) * 2020-11-12 2021-02-05 维沃移动通信有限公司 Input device control method and device and electronic device
US11016658B2 (en) 2013-06-09 2021-05-25 Apple Inc. Managing real-time handwriting recognition
US11112968B2 (en) 2007-01-05 2021-09-07 Apple Inc. Method, system, and graphical user interface for providing word recommendations
US11194467B2 (en) 2019-06-01 2021-12-07 Apple Inc. Keyboard management user interfaces
CN113900515A (en) * 2021-09-26 2022-01-07 华北科技学院(中国煤矿安全技术培训中心) Information input method and system based on action perception and intelligent terminal

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101229432A (en) * 2007-10-26 2008-07-30 北京大学 Method of controlling action emulation and system thereof
CN101276249A (en) * 2007-03-30 2008-10-01 北京三星通信技术研究有限公司 Method and device for forecasting and discriminating hand-written characters
CN101692197A (en) * 2009-09-04 2010-04-07 上海交通大学 MEMS accelerometer based intelligent handwriting system and identification method thereof
US20100090815A1 (en) * 2006-12-25 2010-04-15 Konica Minolta Holdings, Inc. Handwriting electronic input system
CN102004843A (en) * 2010-09-07 2011-04-06 哈尔滨工业大学 Handheld control roaming system with pattern recognition function
WO2011057287A1 (en) * 2009-11-09 2011-05-12 Invensense, Inc. Handheld computer systems and techniques for character and command recognition related to human movements


Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11112968B2 (en) 2007-01-05 2021-09-07 Apple Inc. Method, system, and graphical user interface for providing word recommendations
US11416141B2 (en) 2007-01-05 2022-08-16 Apple Inc. Method, system, and graphical user interface for providing word recommendations
CN103870185A (en) * 2012-12-17 2014-06-18 鸿富锦精密工业(武汉)有限公司 Character input system and method
CN102982187B (en) * 2013-01-04 2016-08-03 努比亚技术有限公司 A kind of lookup method based on somatosensory recognition character index and system
CN102982187A (en) * 2013-01-04 2013-03-20 深圳市中兴移动通信有限公司 Lookup method and lookup system based on somatosensory identification character index
US11182069B2 (en) 2013-06-09 2021-11-23 Apple Inc. Managing real-time handwriting recognition
US11016658B2 (en) 2013-06-09 2021-05-25 Apple Inc. Managing real-time handwriting recognition
US11816326B2 (en) 2013-06-09 2023-11-14 Apple Inc. Managing real-time handwriting recognition
CN103425262A (en) * 2013-08-01 2013-12-04 广东小天才科技有限公司 Chinese character handwriting input method and device
CN103995585A (en) * 2014-05-04 2014-08-20 广东威创视讯科技股份有限公司 Gesture control device and method for display big wall
CN104793747A (en) * 2015-04-24 2015-07-22 百度在线网络技术(北京)有限公司 Method, device and system for inputting through wearable device
WO2016184076A1 (en) * 2015-10-19 2016-11-24 中兴通讯股份有限公司 Character output method and apparatus
CN106598447A (en) * 2015-10-19 2017-04-26 中兴通讯股份有限公司 Character output method and device
CN106908642A (en) * 2015-12-23 2017-06-30 苏州普源精电科技有限公司 A kind of probe, oscillograph, movement recognition system and method
US10884617B2 (en) 2016-06-12 2021-01-05 Apple Inc. Handwriting keyboard for screens
CN108089727B (en) * 2016-06-12 2021-05-18 苹果公司 Handwriting keyboard for screen
CN108089727A (en) * 2016-06-12 2018-05-29 苹果公司 For the touch keypad of screen
US11941243B2 (en) 2016-06-12 2024-03-26 Apple Inc. Handwriting keyboard for screens
US11640237B2 (en) 2016-06-12 2023-05-02 Apple Inc. Handwriting keyboard for screens
CN106648148A (en) * 2016-12-14 2017-05-10 天津映之科技有限公司 Fingertip wearing type operation terminal used for man-machine interaction in field of computers
CN107292084A (en) * 2017-05-09 2017-10-24 浙江大学 A kind of automatic transfer method of medicine prescription
CN107292084B (en) * 2017-05-09 2020-12-08 浙江大学 Automatic medical prescription transferring method
CN107643840A (en) * 2017-09-21 2018-01-30 上海电机学院 Writing pencil word automatic identifying method and system based on inertia device
US11620046B2 (en) 2019-06-01 2023-04-04 Apple Inc. Keyboard management user interfaces
US11842044B2 (en) 2019-06-01 2023-12-12 Apple Inc. Keyboard management user interfaces
US11194467B2 (en) 2019-06-01 2021-12-07 Apple Inc. Keyboard management user interfaces
CN112328156A (en) * 2020-11-12 2021-02-05 维沃移动通信有限公司 Input device control method and device and electronic device
CN112328156B (en) * 2020-11-12 2022-05-17 维沃移动通信有限公司 Input device control method and device and electronic device
CN113900515A (en) * 2021-09-26 2022-01-07 华北科技学院(中国煤矿安全技术培训中心) Information input method and system based on action perception and intelligent terminal

Similar Documents

Publication Publication Date Title
CN102722240A (en) Text information input system, handwriting input device and text information input method
US11327577B2 (en) Multi-function stylus with sensor controller
US10042438B2 (en) Systems and methods for text entry
KR100630806B1 (en) Command input method using motion recognition device
KR102120930B1 (en) User input method of portable device and the portable device enabling the method
WO2017215375A1 (en) Information input device and method
EP4239456A1 (en) Method and glasses type wearable device for providing a virtual input interface
JP5930618B2 (en) Spatial handwriting system and electronic pen
US20140354553A1 (en) Automatically switching touch input modes
CN104049737A (en) Object control method and apparatus of user device
US20140365878A1 (en) Shape writing ink trace prediction
JP2013125487A (en) Space hand-writing system and electronic pen
US20150234524A1 (en) Method and apparatus for handwriting input
US20130321351A1 (en) Graphical display with optical pen input
CN104571521B (en) Hand-written recording equipment and hand-written recording method
CN107272892B (en) Virtual touch system, method and device
KR102051585B1 (en) An electronic device and method having a function of hand writing using multi-touch
KR20160143428A (en) Pen terminal and method for controlling the same
EP2772833B1 (en) System and method of determining stylus location on touch-sensitive display
KR20150063699A (en) Smart Pen
WO2014084634A1 (en) Mouse apparatus for eye-glass type display device, and method for driving same
CN108463788B (en) Active pen and gesture detection method based on same
US20130241844A1 (en) Method of Touch Command Integration and Touch System Using the Same
US10082885B2 (en) Information input and output apparatus and information input and output method
TW201342134A (en) Electronic device having handwriting input function

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: SHENZHEN GRADUATE SCHOOL OF PEKING UNIVERSITY

Effective date: 20150731

Owner name: SHENZHEN-HONG KONG INSTITUTION

Free format text: FORMER OWNER: SHENZHEN GRADUATE SCHOOL OF PEKING UNIVERSITY

Effective date: 20150731

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20150731

Address after: 518057 South Guangdong province Shenzhen city Nanshan District high tech Zone South Road seven East Building 502 production base in Shenzhen and Hong Kong

Applicant after: PKU-HKUST Shenzhen-Hongkong Institution

Applicant after: Shenzhen Graduate School of Peking University

Address before: 518055 Guangdong city in Shenzhen Province, Nanshan District City Xili Shenzhen University North Campus

Applicant before: Shenzhen Graduate School of Peking University

RJ01 Rejection of invention patent application after publication

Application publication date: 20121010