CN102253718B - Three-dimensional hand-written inputting method - Google Patents
- Publication number
- CN102253718B (application CN201110253906.0A)
- Authority
- CN
- China
- Prior art keywords
- user
- hand
- computing unit
- capture unit
- image capture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a three-dimensional handwriting input method. When the user raises a hand, the system enters text input mode. When the user's hand moves forward, an image capture unit begins capturing the motion trajectory of the hand; when the hand moves backward, capture of the trajectory stops and the captured trajectory is sent to a computing unit. The computing unit compares the trajectory against a character database and displays the character closest to the trajectory on a display unit. When the user lowers the hand, the system leaves text input mode.
Description
Technical field
The present invention relates to a three-dimensional handwriting input method, and more particularly to a three-dimensional handwriting input method that uses image capture and analysis.
Background technology
Existing handwriting input methods are mainly carried out on a trackpad: handwriting input is achieved by detecting the motion trajectory of a finger or stylus on the trackpad.
Three-dimensional motion-sensing games and devices are now very popular, but text is entered into these devices by means of a virtual keyboard, which is inconvenient for people who are not comfortable with keyboard input.
Summary of the invention
It is an object of the present invention to provide a three-dimensional handwriting input method in which the user can enter text simply by moving a finger in space.
One embodiment of the three-dimensional handwriting input method of the present invention comprises the following steps: providing an image capture unit and a computing unit; continuously capturing images of a user with the image capture unit and sending the images to the computing unit for analysis; when the hand of the user in the image is raised, the computing unit determining that the user intends to begin text input; when the hand starts to move, the computing unit determining the moving direction of the hand; when the moving direction of the hand is toward the image capture unit, the computing unit determining that the user starts writing; after writing is determined to have started, the image capture unit capturing the motion trajectory of the hand and sending it to the computing unit, where it is converted into text input; when the moving direction of the hand is away from the image capture unit, the computing unit determining that the user stops writing; and when the user in the image lowers the hand, the computing unit determining that the user ends text input.
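The sequence of judgments in this embodiment can be read as a small state machine. The sketch below, in Python, only illustrates the states and transitions implied by the steps above; the names (Event, InputState, next_state) are illustrative and do not come from the patent.

```python
from enum import Enum, auto

class Event(Enum):
    HAND_RAISED = auto()     # the user lifts the hand
    HAND_FORWARD = auto()    # the hand moves toward the image capture unit
    HAND_BACKWARD = auto()   # the hand moves away from the image capture unit
    HAND_LOWERED = auto()    # the user puts the hand down

class InputState(Enum):
    IDLE = auto()            # not in text input mode
    READY = auto()           # text input mode, not currently writing
    WRITING = auto()         # trajectory is being captured

def next_state(state: InputState, event: Event) -> InputState:
    """Transition table corresponding to the steps of the embodiment."""
    if event is Event.HAND_LOWERED:
        return InputState.IDLE                       # leave text input mode from any state
    if state is InputState.IDLE and event is Event.HAND_RAISED:
        return InputState.READY
    if state is InputState.READY and event is Event.HAND_FORWARD:
        return InputState.WRITING
    if state is InputState.WRITING and event is Event.HAND_BACKWARD:
        return InputState.READY                      # the trajectory is matched at this point
    return state                                     # all other events are ignored
```

For example, `next_state(InputState.READY, Event.HAND_FORWARD)` returns `InputState.WRITING`, i.e. trajectory capture begins.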
In the above embodiment, when the hand of the user in the image is raised, the computing unit determines that the user intends to begin text input; at this moment the computing unit records the coordinate of the hand and sets it as a reference coordinate, and the computing unit determines the moving direction of the hand according to this reference coordinate.
In the above embodiment, the image capture unit includes a sensor capable of three-dimensional photography. The computing unit determines whether the hand of the user is approaching or moving away from the image capture unit from the change in the three-dimensional coordinates of the hand.
In the above embodiment, the image capture unit includes a sensor capable of two-dimensional photography. The computing unit determines whether the hand of the user is approaching or moving away from the image capture unit from the change in the size of the hand's image.
The above embodiment further includes the following step: when the computing unit determines that the user has stopped writing, the motion trajectory of the hand is not converted into text input.
The above embodiment further includes the following steps: providing a character database and a display unit; the computing unit compares the motion trajectory against the characters in the character database, displays the several closest characters on the display unit, and the user selects among them.
In the above embodiment, the motion trajectory is shown on the display unit.
In the handwriting scheme of the present invention, moving the hand forward is interpreted as starting to write and moving the hand backward is interpreted as stopping writing. This matches the intuition and habits of human handwriting, making the method easy to promote and use, and because it follows natural habits it is readily applied to motion-sensing devices and games.
Accompanying drawing explanation
Fig. 1 is a block diagram of the input system used by the three-dimensional handwriting input method of the present invention.
Fig. 2 to Fig. 8 are schematic diagrams of the three-dimensional handwriting input method of the present invention.
Fig. 9A and 9B are flow charts of the three-dimensional handwriting input method of the present invention.
[Description of main component symbols]
10 image capture unit
15 image capture range
20 display unit
30 computing unit
40 storage unit
100 user
102 hand
Detailed description of the invention
Fig. 1 is a block diagram of the input system used by the three-dimensional handwriting input method of the present invention. Fig. 2 to Fig. 8 are schematic diagrams of the three-dimensional handwriting input method of the present invention. Fig. 9A and 9B are flow charts of the three-dimensional handwriting input method of the present invention.
As shown in Fig. 1, the three-dimensional handwriting input method of the present invention is realized with at least an image capture unit 10, a display unit 20, a computing unit 30, and a storage unit 40. The image capture unit 10 is connected to the computing unit 30, and the display unit 20 is also connected to the computing unit 30. Images captured by the image capture unit 10 are transmitted to the computing unit 30 for analysis, and the display unit 20 displays the content the computing unit 30 intends to show, for instance the input content. The storage unit 40 stores a character database. The present invention performs handwriting input with the user's gestures: the motion trajectory produced by a gesture is compared against the character database in the storage unit 40, and the intended input is shown on the display unit 20.
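As a way of making the wiring of Fig. 1 concrete, the following minimal sketch models the four components as plain Python objects. The class names and the dictionary representation of the character database are assumptions made for illustration; they only mirror the connections described above.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

Point = Tuple[float, float]

@dataclass
class StorageUnit:                       # element 40: holds the character database
    character_database: Dict[str, List[Point]] = field(default_factory=dict)

@dataclass
class DisplayUnit:                       # element 20: shows whatever the computing unit sends
    show: Callable[[object], None] = print

@dataclass
class ComputingUnit:                     # element 30: analyses images, compares trajectories
    storage: StorageUnit
    display: DisplayUnit

@dataclass
class ImageCaptureUnit:                  # element 10: streams captured images to the computing unit
    computing_unit: ComputingUnit
```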
As shown in Fig. 2 to Fig. 8, a user 100 stands within the image capture range 15 of the image capture unit 10. The image capture unit 10 continuously captures images of the user 100 and transmits them to the computing unit 30 for analysis, and the result of the analysis is shown on the display unit 20.
As shown in Fig. 2, when the user 100 stands in the image capture range 15, the image capture unit 10 captures an image of the user 100 and sends it to the computing unit 30. The computing unit 30 determines whether the hand 102 of the user 100 (the right hand in this embodiment, although it may equally be the left hand) is raised.
As shown in Fig. 3, when the hand 102 of the user 100 is raised, the computing unit 30 determines that the user 100 intends to begin text input. At this moment the computing unit 30 records the coordinate of the hand 102 and sets it as a reference coordinate, and the computing unit 30 determines the moving direction of the hand 102 according to this reference coordinate.
As shown in Fig. 4, when the hand 102 of the user 100 starts to move, the computing unit 30 determines the moving direction of the hand 102 from the images transmitted by the image capture unit 10. When the hand 102 of the user 100 moves forward (toward the image capture unit 10), the computing unit 30 determines that the user 100 intends to start writing.
As shown in Fig. 5, when the computing unit 30 determines that the user 100 intends to start writing, the image capture unit 10 captures the motion trajectory of the hand 102 of the user 100 and sends it to the computing unit 30, and the trajectory may be displayed synchronously on the display unit 20.
As shown in Fig. 6, when the hand 102 of the user 100 moves backward (away from the image capture unit 10), the computing unit determines that the user has stopped writing. The computing unit 30 then compares the trajectory against the character database stored in the storage unit 40 and displays the character closest to the trajectory on the display unit 20. In another variant, several characters close to the trajectory are displayed on the display unit 20 for the user 100 to select from. Once the computing unit 30 has determined that the user 100 has stopped writing, even if the user 100 moves the hand 102, the computing unit 30 treats the motion as simple movement and neither interprets it as writing nor captures its trajectory. Only when the hand 102 of the user 100 moves forward again does the computing unit 30 once again determine that the user 100 intends to start writing.
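The patent does not fix how the trajectory is compared against the character database. The sketch below is one minimal possibility under assumed conventions: each database entry is a list of 2D points forming a template stroke, the captured trajectory is resampled to a fixed number of points, and the few closest characters are returned for the user 100 to choose from on the display unit 20.

```python
import math

def resample(points, n=32):
    """Resample a trajectory (list of (x, y) points) to n evenly spaced points."""
    if len(points) < 2:
        return points * n
    dists = [0.0]                                 # cumulative arc length
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out, j = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while j < len(dists) - 2 and dists[j + 1] < target:
            j += 1
        span = (dists[j + 1] - dists[j]) or 1.0
        t = (target - dists[j]) / span            # interpolate within segment j
        x = points[j][0] + t * (points[j + 1][0] - points[j][0])
        y = points[j][1] + t * (points[j + 1][1] - points[j][1])
        out.append((x, y))
    return out

def closest_characters(trajectory, character_database, k=3):
    """Return the k characters whose template trajectories are closest
    to the captured trajectory (sum of point-to-point distances)."""
    sample = resample(trajectory)
    scored = []
    for char, template in character_database.items():
        ref = resample(template)
        cost = sum(math.hypot(px - qx, py - qy)
                   for (px, py), (qx, qy) in zip(sample, ref))
        scored.append((cost, char))
    return [char for _, char in sorted(scored)[:k]]
```

A practical system would also normalize the trajectories for position and scale before comparison; that step is omitted here for brevity.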
As shown in Fig. 7 and Fig. 8, when the user 100 lowers the hand 102, the computing unit 30 determines that the user 100 intends to end text input.
Fig. 9 A and 9B represents the three-dimensional hand-written inputting method of the present invention, after personnel enter image capture scope, the image of acquisition personnel, then judge whether the hands of the personnel in image lifts, if the hands of personnel lifts, then enter text input mode, in text input mode, if the hands of personnel moves forward, expression personnel start to write, now capture the motion track of the hands of personnel, when the hands of personnel is moved rearwards by, expression personnel stop writing, so, the motion track captured is compared with the word in literal pool, then by text importing in display unit.When handing down of personnel, terminate text input mode.When personnel leave image capture scope, stop image capture.
The image capture unit 10 may be a two-dimensional photographic sensor, such as an ordinary webcam or an industrial camera; in that case the computing unit 30 determines whether the hand 102 of the user 100 is approaching or moving away from the image capture unit 10 from the change in the size of the hand's image.
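For a two-dimensional sensor, one simple way to realize this size-based test is to compare the apparent area of the detected hand region between the reference frame and the current frame. The bounding-box representation and the ratio threshold below are assumptions made for illustration only.

```python
def hand_direction_2d(current_box, reference_box, ratio=1.15):
    """current_box / reference_box are (width, height) of the detected hand region in pixels.
    A clearly larger apparent size is read as moving toward the camera, a clearly
    smaller one as moving away (illustrative heuristic only)."""
    area_now = current_box[0] * current_box[1]
    area_ref = reference_box[0] * reference_box[1]
    if area_now > area_ref * ratio:
        return "toward"
    if area_now * ratio < area_ref:
        return "away"
    return "unchanged"
```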
The image capture unit 10 may also be a three-dimensional photographic sensor, such as a camera combined with an infrared sensor or a stereo camera, which can capture the three-dimensional coordinates of the hand 102; in that case the computing unit 30 determines whether the hand 102 of the user 100 is approaching or moving away from the image capture unit 10 from the change in the three-dimensional coordinates of the hand 102.
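With a three-dimensional sensor, the same test can use the hand's depth coordinate directly. The convention that z grows with distance from the image capture unit 10, the metric units, and the threshold value are assumptions for illustration.

```python
def hand_direction_3d(current_xyz, reference_xyz, threshold=0.05):
    """current_xyz / reference_xyz are (x, y, z) hand coordinates in metres,
    with z measured away from the image capture unit (assumed convention)."""
    dz = current_xyz[2] - reference_xyz[2]
    if dz < -threshold:
        return "toward"
    if dz > threshold:
        return "away"
    return "unchanged"
```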
Although in the above embodiment raising the hand is the feature that starts or ends text input mode, the present invention is not limited to this; other gestures may be used to start or end text input mode.
In the handwriting scheme of the present invention, moving the hand forward is interpreted as starting to write and moving the hand backward is interpreted as stopping writing. This matches the intuition and habits of human handwriting, making the method easy to promote and use, and because it follows natural habits it is readily applied to motion-sensing devices and games.
Claims (5)
1. A three-dimensional handwriting input method, characterized by comprising the following steps:
providing an image capture unit, a computing unit, a character database, and a display unit;
continuously capturing images of a user with the image capture unit, and sending the images to the computing unit for analysis;
when the hand of the user in the images is raised, the computing unit determining that the user intends to begin text input, the computing unit recording the coordinate of the hand at this moment and setting it as a reference coordinate;
when the hand of the user starts to move, the computing unit determining the moving direction of the hand according to the reference coordinate;
when the moving direction of the hand is toward the image capture unit, the computing unit determining that the user starts writing;
after writing is determined to have started, the image capture unit capturing the motion trajectory of the hand and sending it to the computing unit, the computing unit comparing the motion trajectory against the characters in the character database, displaying on the display unit the several characters closest to the motion trajectory, and the user selecting among them;
when the moving direction of the hand is away from the image capture unit, the computing unit determining that the user stops writing;
when the computing unit determines that the user has stopped writing, the motion trajectory of the hand not being converted into text input; when the hand of the user moves forward again, the computing unit once again determining that the user intends to start writing; and
when the user in the images lowers the hand, the computing unit determining that the user ends text input.
2. The three-dimensional handwriting input method of claim 1, wherein the image capture unit includes a sensor capable of three-dimensional photography.
3. The three-dimensional handwriting input method of claim 2, wherein the computing unit determines whether the hand of the user is approaching or moving away from the image capture unit from the change in the three-dimensional coordinates of the hand.
4. The three-dimensional handwriting input method of claim 1, wherein the image capture unit includes a sensor capable of two-dimensional photography.
5. The three-dimensional handwriting input method of claim 4, wherein the computing unit determines whether the hand of the user is approaching or moving away from the image capture unit from the change in the size of the hand's image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110253906.0A CN102253718B (en) | 2011-08-31 | 2011-08-31 | Three-dimensional hand-written inputting method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110253906.0A CN102253718B (en) | 2011-08-31 | 2011-08-31 | Three-dimensional hand-written inputting method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102253718A (en) | 2011-11-23 |
CN102253718B (en) | 2016-07-06 |
Family
ID=44981021
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201110253906.0A Active CN102253718B (en) | 2011-08-31 | 2011-08-31 | Three-dimensional hand-written inputting method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102253718B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102810008B (en) * | 2012-05-16 | 2016-01-13 | 北京捷通华声语音技术有限公司 | A kind of air input, method and input collecting device in the air |
CN103869954A (en) * | 2012-12-17 | 2014-06-18 | 联想(北京)有限公司 | Processing method as well as processing device and electronic device |
CN104978010A (en) * | 2014-04-03 | 2015-10-14 | 冠捷投资有限公司 | Three-dimensional space handwriting trajectory acquisition method |
CN104793747A (en) * | 2015-04-24 | 2015-07-22 | 百度在线网络技术(北京)有限公司 | Method, device and system for inputting through wearable device |
CN105278861B (en) * | 2015-11-02 | 2020-06-16 | Oppo广东移动通信有限公司 | Control method and device for preset input mode |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100377043C (en) * | 2002-09-28 | 2008-03-26 | 皇家飞利浦电子股份有限公司 | Three-dimensional hand-written identification process and system thereof |
CN1881994A (en) * | 2006-05-18 | 2006-12-20 | 北京中星微电子有限公司 | Method and apparatus for hand-written input and gesture recognition of mobile apparatus |
CN101625607A (en) * | 2009-08-17 | 2010-01-13 | 何进 | Finger mouse |
CN201859394U (en) * | 2010-08-05 | 2011-06-08 | 上海飞来飞去多媒体创意有限公司 | Gesture control system for dynamic video in projection space |
- 2011-08-31: application CN201110253906.0A filed in China; granted as CN102253718B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN102253718A (en) | 2011-11-23 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C53 | Correction of patent for invention or patent application | ||
CB02 | Change of applicant information | Address after: 21st floor, No. 200 Taicang Road, Huangpu District, Shanghai 200020; Applicant after: Utechzone Information Technology (Shanghai) Co., Ltd. Address before: Room 202, 1525 North Road, Minhang District, Shanghai 201109; Applicant before: Utechzone Information Technology (Shanghai) Co., Ltd. |
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |