WO2001042871A1 - A portable communication device and method - Google Patents
- Publication number
- WO2001042871A1 (PCT/SE2000/002403)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sub
- communication device
- sound
- processing device
- different
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/043—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
Definitions
- the present invention relates to a portable communication device with a microphone, the device further comprising a processing device, which works as a controller, for receiving a hand-written message from a user and converting the message into digital data. Furthermore, the present invention relates to a method of detecting sound by the microphone and converting the sound into digital data.
- a communication device such as a mobile telephone, or a personal data organizer
- the common way of entering information is by using a keyboard.
- the same key on the keyboard is often used to enter several letters, and by combining different keys and sequences different kinds of messages can be entered.
- Another solution is to use a touch screen as an input device. Text can be written by hand on the screen and the letters recognised using software, or the touch screen can be used for entering commands.
- Viewing of a large screen in small equipment can be done by using a micro-display.
- the main problem with the first mentioned solution is that the keyboard and the display are often very small. There are normally not enough keys for the whole alphabet, and several keys have to be pressed in order to input a single letter.
- the second solution with a touch screen on top of the display gives a lower contrast of the display and makes the display thicker and more expensive. Small displays also make it difficult to get a high resolution when a touch screen is being used with a portable device, and when using a micro-display a touch screen cannot be used as input.
- the third solution uses an extended display, which increases the size and the cost for the device.
- the fourth solution involves a large, impractical display, and only a specific pen or needle can be used for entering characters, whereby access to the device for another user without this specific pen is limited, and if the pen is forgotten the pad or data surface is useless.
- the object of the present invention is to provide a communication device that allows hand-written messages to be entered more easily and quickly.
- This object is achieved for a portable communication device comprising a microphone and a processing device for receiving a message hand-written by a user on a surface of the device and converting the message into digital data.
- the processing device is operatively connected to the microphone, so that different sounds caused by movement over different patterns and/or textures on a surface of the communication device when the user is hand-writing on said surface may be acoustically detected by the microphone and forwarded, in electric form, to the processing device.
- the movement of e.g. a pen or stylus over different combinations of said patterns and/or textures on the surface corresponds to different text inputs or commands.
- a surface on the communication device has a specific design, wherein the surface is divided into several areas, which have sub patterns for creating specific sounds when e.g. the pen or the stylus in contact with the surface moves against these sub patterns.
- By providing the communication device with a cover that has a specific surface, and with the processing device according to the invention, the following advantages are obtained. Input of information is simplified by not having to use the small and impractical keys, thereby avoiding the time-consuming pressing of keys. Notes and messages may thereby be entered into the device more easily and quickly. Commands for different functions can also be entered and executed with sound recognition. Furthermore, there will be no significant extra cost to include a processing device in a communication device that already comprises a microphone, since all hardware is already installed. Any fairly sharp object can also be used for entering messages.
- the object of the invention has also been achieved by a method of detecting and recognising sound by means of the microphone together with the processing device, and converting the sound into digital data.
- the digital data may be forwarded to the display of the communication device or used for executing commands that control the communication device.
- FIG 1 is a perspective view showing a communication device
- FIG 2 is a perspective view showing a communication device in a position, where information can be entered on a cover according to the invention
- FIG 3 is a front view showing a preferred embodiment of a surface on a cover according to the invention
- FIG 4 is a frontview showing a preferred embodiment of a surface on a cover according to the invention
- FIG 5 is a view in section showing a preferred embodiment of a pattern on a surface according to the invention
- FIG 6 is a view in section showing another preferred embodiment of a pattern on a cover according to the invention
- FIG 7 is a view in section showing yet another preferred embodiment of a pattern on a cover according to the invention
- FIG 8 is a view in section further showing another preferred embodiment of a pattern on a cover according to the invention.
- FIG 9 is a flow chart showing a method of detecting, recognising, converting and entering data according to the invention into a communication device.
- FIGS 1-2 illustrate a communication device 1 with a cover 3.
- FIGS 3-4 show two different embodiments of a surface 3' on the cover 3.
- the communication device is a handheld device for entering messages into it or commands for controlling it, such as a mobile or cellular telephone.
- the device comprises a casing 2, the cover 3, a display 4, at least one microphone 5, a keypad 6, an antenna 8, loudspeaker openings 9, and a processing device, which is operatively connected to the display 4, the microphone 5 and the keypad 6. More than one microphone 5 could be used for a more accurate detection of sound and position of a tip 7 of a pen, a stylus, or a finger nail during handwriting.
- the processing device comprises a controller (such as a central processing unit (CPU), a digital signal processor (DSP) or a programmable logic array (PLA)) together with a memory (such as an electrically erasable programmable read-only memory (EEPROM), a flash memory or similar means) and a set of program instructions stored in the memory and executable by the controller.
- the processing device could be realized solely in hardware, for instance as an application-specific integrated circuit (ASIC), discrete components or other components fulfilling the demands.
- the communication device 1 has a movable flip or cover 3 for protecting the keypad 6 against damage and unintentional input.
- the cover 3 can be opened, thereby supporting a pen 7 or another object with a fairly sharp tip, such as a finger nail, when it is being used for handwriting on the communication device 1.
- the microphone 5 is preferably placed in the cover 3 close to a position where a user's mouth would be placed during use of a telephone mode, but can also be placed on the cover 3 at point 5', alone or with any number of microphones, closer to the communication device 1, or even in the casing 2 of the device 1 adjacent point 5", or in any other location in the device 1. This gives a more accurate detection of the sound due to a triangulation effect between several cooperating microphones instead of just one microphone.
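The triangulation between cooperating microphones rests on arrival-time differences of the scratch sound. The Python sketch below (not part of the patent; the signals and the `best_lag` helper are invented for illustration) estimates that time difference, in samples, between two microphone channels by brute-force cross-correlation:

```python
# Hypothetical sketch: estimating the arrival-time difference of a scratch
# sound between two microphones. A positive lag means the sound reached
# microphone 1 first.

def best_lag(a, b, max_lag):
    """Return the shift of b (in samples) that best aligns it with a,
    using a brute-force cross-correlation."""
    def score(lag):
        s = 0.0
        for i in range(len(a)):
            j = i + lag
            if 0 <= j < len(b):
                s += a[i] * b[j]
        return s
    return max(range(-max_lag, max_lag + 1), key=score)

# Toy signals: microphone 2 hears the same burst 3 samples later.
burst = [0.0, 0.2, 1.0, -0.7, 0.1, 0.0]
mic1 = burst + [0.0] * 10
mic2 = [0.0] * 3 + burst + [0.0] * 7

lag = best_lag(mic1, mic2, max_lag=5)
print(lag)  # → 3
```

With more than two microphones, several such pairwise lags would constrain the position of the tip on the surface.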
- the user can either use a pen 7 or another sharp-tipped object when writing by hand.
- the sound caused by the user during movement of the pen 7 against the cover 3 is registered by the microphone 5 and forwarded, in electric form, to the processing device.
- the sound, in electric form, is recognised by means of the processing device and suitable software.
- the sound is converted into digital data, representing e.g. a cursor movement or an entered text, and is forwarded to the display 4 of the communication device 1 by means of the processing device.
- the sound, in the form of digital data, can also be used as commands for executing desired functions and controlling the communication device 1.
- the cover 3 can be made slightly rugged and/or embossed in specific areas so that different patterns create different and distinct sounds, whereby the sound may be more easily recognised.
- In FIG 3 a preferred embodiment of a surface 3' on the cover 3 is shown.
- the surface 3' is divided into several areas 3a, 3b, as shown in FIGS 3 and 4.
- the surface 3' consists of at least one area 3a in the centre and at least one outer sub area 3b surrounding the centre area 3a.
- the sub areas 3a and 3b are intended to define which characters are to be entered when writing a message by hand. This is done by arranging a specific sub pattern 3" for each sub area 3b between the centre area 3a and its adherent sub area 3b. This means that a respective character, corresponding to each sub area 3b and its sub pattern 3", is entered by moving a tip 7 of a fairly sharp object twice over the sub pattern 3" in two different directions.
- the surface 3' could also be arranged on any other separate, substantially planar surface that is operatively connected to the processing device of the communication device 1, such as e.g. a separate pad, a table or any other handheld device for entering text or commands.
- the most important point when entering a hand-written message into the device 1 is that the movement of the tip 7 has to start and end in the centre area 3a.
- the passing of the different sub patterns 3" defines the desired character to be entered into the communication device 1 due to the specific sound caused when the tip 7, in contact with the surface 3', passes over a respective sub pattern 3".
- the start movement out of the centre area 3a past the specific sub pattern 3", and into the first outer sub area 3b, defines from which group of characters a desired character to be entered is chosen.
- the second movement from the first sub area 3b to another sub area, and past its sub pattern 3" back into the centre area 3a, defines the position of the desired character to be entered from the chosen group of characters. This means that a character can be entered in one continuous movement without lifting the tip 7.
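The two-movement selection described above amounts to a lookup keyed by the outbound and inbound sub patterns. The table below is invented for illustration; the patent does not specify a concrete assignment of character groups or positions.

```python
# A minimal sketch of the two-movement character selection: the sub pattern
# crossed when leaving the centre selects a character group, and the sub
# pattern crossed on the way back selects a position in that group.
# The GROUPS and POSITION mappings are hypothetical.

GROUPS = {
    "A": "abcdefgh",   # group reached past sub pattern A
    "B": "ijklmnop",   # group reached past sub pattern B
}
POSITION = {"G": 0, "H": 1, "C": 2}   # inbound pattern -> index in the group

def decode_stroke(outbound, inbound):
    """One continuous movement: centre -> outbound pattern -> inbound pattern -> centre."""
    group = GROUPS[outbound]
    return group[POSITION[inbound]]

print(decode_stroke("A", "H"))  # prints "b"
```

In this invented table, the outbound/inbound pair ("A", "H") selects the second character of the first group, mirroring the patent's example of picking a letter by its position within a chosen group.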
- the tip 7 can be moved with different velocities and pressed against the surface 3' with varying force by different users.
- the tip 7 can also be made of different materials, i.e. harder or softer.
- the processing device has an optional function that can recognise the sound and forward the right information independently of what kind of tip 7 is being used or with what force the user presses the tip 7 against the surface 3'.
- This function is a teaching mode, which is entered by pressing a key 6.
- the teaching of the processing device is done, after initiating this mode, by a movement of the tip 7 against a specific sub pattern 3" on the surface 3' .
- the sound created by this movement over the specific sub pattern 3" is detected, registered, and stored for later use as a reference sound.
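This teaching mode can be sketched as storing one reference recording per sub pattern and later classifying a sound by its nearest reference. The amplitude-envelope feature and the `Recogniser` class below are illustrative assumptions; the patent leaves the recognition method to suitable software.

```python
# Sketch of the teaching mode: one reference recording is stored per sub
# pattern, and later sounds are classified by nearest reference.

def features(signal, frame=4):
    """Mean absolute amplitude per frame: a stand-in acoustic feature."""
    return [sum(abs(x) for x in signal[i:i + frame]) / frame
            for i in range(0, len(signal), frame)]

class Recogniser:
    def __init__(self):
        self.references = {}          # sub-pattern label -> feature vector

    def teach(self, label, signal):   # entered via a key press in the patent
        self.references[label] = features(signal)

    def classify(self, signal):
        f = features(signal)
        def dist(ref):
            return sum((a - b) ** 2 for a, b in zip(f, ref))
        return min(self.references, key=lambda k: dist(self.references[k]))

r = Recogniser()
r.teach("A", [0.9, -0.8, 0.0, 0.1, 0.0, 0.0, 0.0, 0.0])
r.teach("B", [0.1, 0.0, 0.9, -0.7, 0.2, 0.0, 0.8, -0.9])
print(r.classify([0.8, -0.9, 0.1, 0.0, 0.0, 0.1, 0.0, 0.0]))  # → A
```

Because the references are re-recorded per user, the classifier adapts to the user's own tip material and writing pressure, which is exactly what the teaching mode is for.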
- if a character to be entered cannot be found in any of the sub areas 3b, other groups of characters can be chosen by moving the tip 7 in and out of the centre area 3a, i.e. back and forth over one or two specific sub patterns 3", or by tapping one or more times on the cover 3, or alternatively by pressing a key on the keypad 6.
- This change of character groups can be done by using the processing device for recognition of the sound caused by movement or tapping on the surface, or for recognition of the pressing of keys.
- the change can be indicated by a symbol or symbols on the display 4 of the device 1 and may show which group of characters is going to be used for entering the desired information.
- Any sub pattern can be defined to control the change between different character groups by means of symbols on the surface of the cover 3, and these symbols can also be presented simultaneously on the display 4 of the communication device 1.
- These symbols can be designed in many ways, e.g. as circles, squares, triangles, stars, punctuation marks, or have any other form and can be defined to have different functions, e.g. changing the selectable character groups from capital letters to small letters, or to figures, or alternatively to parentheses, or for executing different commands.
- a preferred group of symbols is a square initiating a stop, a circle, a triangle pointing upwards initiating activation of a mode for entering upper-case or lower-case letters, a triangle pointing to the right for entering a space, and a third triangle pointing to the left for deleting the last written character.
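The preferred symbol commands can be sketched as a tiny editor state machine. The symbol functions follow the description; the dispatch itself, and the symbol names, are illustrative assumptions.

```python
# Sketch of the symbol commands: triangle up toggles the upper-/lower-case
# mode, triangle right enters a space, triangle left deletes the last
# character, and the square (stop) leaves the text unchanged.

def apply_symbol(text, upper, symbol):
    """Return (new_text, new_upper_mode) after one recognised symbol."""
    if symbol == "triangle_up":
        return text, not upper
    if symbol == "triangle_right":
        return text + " ", upper
    if symbol == "triangle_left":
        return text[:-1], upper
    return text, upper                 # square/stop and others: no text change

def type_char(text, upper, ch):
    return text + (ch.upper() if upper else ch), upper

text, upper = "", False
text, upper = type_char(text, upper, "f")
text, upper = apply_symbol(text, upper, "triangle_up")
text, upper = type_char(text, upper, "x")          # entered as a capital X
text, upper = apply_symbol(text, upper, "triangle_left")
print(text)  # → f
```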
- the surface 3' has eight outer sub areas 3b.
- Each sub area 3b has a sub pattern 3", which consists of at least one groove or rib for creating a specific sound when the tip 7 passes over it.
- This sound has to be unique for each sub area 3b, whereby each of the sub patterns 3" has a different pattern consisting of a different number of grooves or ribs arranged in different combinations with different mutual distances, preferably eight different patterns of ribs A, B, C, D, E, F, G and H, shown in more detail in FIGS 5, 6, 7 and 8.
- Each of the sub areas 3b may also have a surface that is rugged, i.e. a surface with a lot of very small grooves or ribs, similar to the small ribs shown in FIG 7 as patterns C' and E' of the patterns C and E, which are smaller than said sub patterns 3", for creating a basic sound when the tip 7, in contact with the sub areas 3b, moves over them.
- This basic sound facilitates the sound recognition by means of a more exact detecting of the position of the tip 7.
- the centre area 3a should be as smooth as possible in relation to the sub areas 3b for the same reason. A sequence of movements is shown, for clarity reasons, in the form of bold lines on the surface 3' for entering a character.
- the sub pattern A is then passed by the tip 7 and defines from which group the desired character, e.g. the small letter f, is going to be chosen. The tip 7 is then moved from the first sub area, after passing the sub pattern A, into the second sub area, and past the second sub pattern H back into the centre area 3a at position e.
- the second sub pattern H defines the position of the small letter f in the chosen group, i.e. in the second position in the upper row in the first sub area 3b.
- Another character is entered by moving the tip 7 in a different sequence, e.g. the lower-case character x is entered by moving the tip 7 from the centre area 3a at point s', past sub pattern A into the sub area in the upper right hand corner of surface 3', and into a second sub area followed by a third sub area 3b located in the lower right hand corner of surface 3', and back into the centre area 3a at point e' past sub pattern C.
- FIG 4 Another sequence of movements is shown in FIG 4 in the form of bold lines on the surface 3 ' for entering another character.
- the first movement from position s" in the centre area 3a past sub pattern H into the first sub area defines a change of character group.
- the movement from this first sub area, past the sub pattern H once more, back into the centre area 3a at position e" defines a change from lower-case letters to capital letters, whereby the symbol in the middle of the three symbols above sub pattern H is activated.
- the subsequent "second start" movement from the centre area 3a at position s‴, past sub pattern G and into sub area 3b, defines from which character group the character is going to be chosen.
- FIG 5 shows a preferred embodiment of a pattern 3" with eight sub patterns A, B, C, D, E, F, G, H with different combinations of ribs, wherein each sub pattern can be combined with any sub area 3b.
- the illustration in section is done with reference to only one sub pattern F along line x-x in FIGS 3 and 4.
- Other designs of the pattern instead of ribs can be used, such as grooves or any other form, as shown in FIG 6.
- the ribs can also be round, flat or have any other design, as shown in an enlarged scale in FIG 8.
- the ribs can also have different sizes, creating sounds with higher frequencies, as shown by sub patterns C' and E' in FIG 7.
- the different sub patterns in FIG 5 will be explained in the form of combinations of straight arrows (↑) pointing upwards.
- Sub pattern A consists of one rib (↑).
- Sub pattern B consists of two ribs (↑↑) with a mutual distance d.
- Sub pattern C has three ribs, wherein a pair of ribs have a mutual distance d and the third rib is arranged a distance of more than 2d from the first pair (↑↑ ↑).
- Sub pattern D also includes three ribs, where one rib is placed adjacent the centre area 3a and the other two ribs are placed a distance of more than 2d from the first rib (↑ ↑↑).
- Sub pattern E comprises four ribs, with two pairs of ribs with a mutual distance d placed a distance of more than 2d from each other (↑↑ ↑↑).
- Sub pattern F consists of three ribs with a mutual distance d (↑↑↑).
- Sub pattern G has four ribs with a mutual distance d (↑↑↑↑).
- Sub pattern H consists of two ribs with a mutual distance of more than 2d from each other (↑ ↑).
- FIG 6 shows a preferred embodiment of a pattern 3" with eight sub patterns A, B, C, D, E, F, G, H with different combinations of grooves instead of ribs as in FIG 5.
- the illustration in section is done with reference to only one sub pattern F along line x-x in FIGS 3 and 4.
- the mutual distances between the grooves are the same as in FIG 5.
- the grooves can have any other design and size, such as a more shallow and/or wider form, or have a more rounded bottom.
- FIG 7 shows another preferred embodiment of a pattern 3" with eight sub patterns A, B, C, D, E, F, G, H with different combinations of ribs as in FIG 5.
- the illustration in section is done with reference to only one sub pattern F along line x-x in FIGS 3 and 4.
- Distances and sizes of the ribs are the same as for all the patterns 3" in FIG 5, except for sub patterns E and C.
- Sub pattern C has three ribs, wherein a pair of ribs have a mutual distance d and a third rib is arranged a distance more than 2d from the first pair.
- the area between the single rib and the pair of ribs is equipped with several small ribs, similar to the basic pattern in the sub areas 3b, for creating a kind of basic sound (↑↑ ··· ↑).
- Sub pattern E comprises two pairs of bigger ribs with a mutual distance d, placed a distance of more than 2d from each other.
- the area between the pairs of ribs has several small ribs creating a pattern similar to the basic pattern in the sub areas 3b (↑↑ ··· ↑↑).
- FIG 8 shows yet another preferred embodiment of a pattern 3" with eight sub patterns A, B, C, D, E, F, G, H, as in FIGS 5, 6 and 7.
- the sub patterns will be explained with the help of the same arrows (↑) as in the above description of FIG 5.
- the sub patterns are illustrated as arrows (↑) in FIG 8 for simplicity reasons.
- the sub patterns are shown with reference to only one sub pattern F along line x-x in FIGS 3 and 4.
- Sub pattern A has three ribs with a mutual distance of more than 2d from each other (↑ ↑ ↑).
- Sub pattern B has four ribs, consisting of a pair of ribs with a mutual distance d and two other ribs with a distance of more than 2d from each other and from the first pair (↑↑ ↑ ↑).
- Sub pattern C includes three pairs of ribs, i.e. a total of six ribs, with a mutual distance of more than 2d between each pair, wherein the ribs in each pair have a mutual distance d (↑↑ ↑↑ ↑↑).
- Sub pattern D has five ribs (↑↑ ↑↑ ↑).
- Sub pattern E includes four ribs with a mutual distance of more than 2d (↑ ↑ ↑ ↑).
- Sub pattern F has five ribs, but in another combination, with one rib placed a distance of more than 2d from the other four ribs, which have a mutual distance d (↑ ↑↑↑↑).
- Sub pattern G consists of five ribs in a combination similar to F (↑↑↑↑ ↑).
- Sub pattern H has six ribs, with a first pair of ribs with a mutual distance d placed a distance of more than 2d from the other four ribs, which have a mutual distance d (↑↑ ↑↑↑↑).
- FIG 8 also shows six enlargements disclosing different designs and combinations of ribs or grooves that may be used.
- the cover 3 is used for entering a handwritten message, moving a cursor on the display 4 or executing commands for controlling the device 1.
- This can be accomplished by the following steps shown in FIG 9.
- the first step 910 involves opening of the cover 3, followed by a registration in step 920 of whether the telephone mode is to be used alone, step 925, or in combination with sound recognition in step 930. If sound recognition of handwriting on the cover 3 is going to be used, one tap with the pen 7 in the middle of the cover 3, or the pressing of a key on the keypad 6, initiates the sound recognition function.
- the tip 7 is moved on the surface 3' of the cover 3, thereby creating a sound that is detected in step 930 by means of a microphone 5, converted into electric form in a step 940, forwarded to the processing device in a step 950, and recognised with suitable software in a step 960.
- the recognised information is then converted into digital data by means of the processing device, shown in a step 970, and recognised as text or commands in a step 980. If the digital data is intended for moving a cursor or entering text, it is forwarded to the display 4 as in step 985, or, alternatively, if the digital data is a command, it is executed for controlling a specific function in the communication device 1 in the last step 990.
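The steps of FIG 9 can be sketched as a short pipeline. The step numbers follow the figure; the helper callables stand in for the microphone, the recognition software, and the display, and are purely illustrative placeholders.

```python
# Sketch of the FIG 9 flow: detect (930), convert to electric form (940),
# forward to the processing device (950), recognise (960), convert to
# digital data (970/980), then either display text (985) or execute a
# command (990).

def handle_input(sound_wave, recognise, is_command, execute, display):
    electric = list(sound_wave)   # 930/940: detected sound in electric form
    token = recognise(electric)   # 950/960: recognised by suitable software
    if is_command(token):         # 970/980: classified as text or command
        execute(token)            # 990: control a function of the device
    else:
        display(token)            # 985: forward to the display / cursor

shown = []
handle_input(
    sound_wave=[0.1, 0.9],
    recognise=lambda e: "f",      # pretend the scratch decoded to the letter f
    is_command=lambda t: t == "STOP",
    execute=lambda t: None,
    display=shown.append,
)
print(shown)  # → ['f']
```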
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Acoustics & Sound (AREA)
- Telephone Set Structure (AREA)
- User Interface Of Digital Computer (AREA)
- Radar Systems Or Details Thereof (AREA)
- Mobile Radio Communication Systems (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE60015383T DE60015383D1 (en) | 1999-12-08 | 2000-12-01 | Portable communication device and method |
JP2001544097A JP2003516576A (en) | 1999-12-08 | 2000-12-01 | Portable communication device and communication method thereof |
AU19120/01A AU1912001A (en) | 1999-12-08 | 2000-12-01 | A portable communication device and method |
AT00982044T ATE280971T1 (en) | 1999-12-08 | 2000-12-01 | PORTABLE COMMUNICATION DEVICE AND METHOD |
EP00982044A EP1236076B1 (en) | 1999-12-08 | 2000-12-01 | A portable communication device and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE9904469A SE515238C2 (en) | 1999-07-05 | 1999-12-08 | Mobile phone, includes microphone for converting noise from hand movement over phone surface into e.g. text, cursor movement or command key activation on display screen |
SE9904469-5 | 1999-12-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2001042871A1 true WO2001042871A1 (en) | 2001-06-14 |
Family
ID=20418028
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/SE2000/002403 WO2001042871A1 (en) | 1999-12-08 | 2000-12-01 | A portable communication device and method |
Country Status (8)
Country | Link |
---|---|
US (1) | US20010003452A1 (en) |
EP (1) | EP1236076B1 (en) |
JP (1) | JP2003516576A (en) |
CN (1) | CN1188775C (en) |
AT (1) | ATE280971T1 (en) |
AU (1) | AU1912001A (en) |
DE (1) | DE60015383D1 (en) |
WO (1) | WO2001042871A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002103620A2 (en) * | 2001-06-14 | 2002-12-27 | Koninklijke Philips Electronics N.V. | Data input tablet |
EP1513049A2 (en) * | 2003-07-08 | 2005-03-09 | NTT DoCoMo, Inc. | Input key and input apparatus |
WO2008047294A2 (en) * | 2006-10-18 | 2008-04-24 | Koninklijke Philips Electronics N.V. | Electronic system control using surface interaction |
US7966084B2 (en) | 2005-03-07 | 2011-06-21 | Sony Ericsson Mobile Communications Ab | Communication terminals with a tap determination circuit |
US10254953B2 (en) | 2013-01-21 | 2019-04-09 | Keypoint Technologies India Pvt. Ltd. | Text input method using continuous trace across two or more clusters of candidate words to select two or more words to form a sequence, wherein the candidate words are arranged based on selection probabilities |
US10474355B2 (en) | 2013-01-21 | 2019-11-12 | Keypoint Technologies India Pvt. Ltd. | Input pattern detection over virtual keyboard for candidate word identification |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001306254A (en) * | 2000-02-17 | 2001-11-02 | Seiko Epson Corp | Inputting function by slapping sound detection |
US6916321B2 (en) * | 2001-09-28 | 2005-07-12 | Ethicon, Inc. | Self-tapping resorbable two-piece bone screw |
US20030076408A1 (en) * | 2001-10-18 | 2003-04-24 | Nokia Corporation | Method and handheld device for obtaining an image of an object by combining a plurality of images |
KR100857254B1 (en) * | 2007-08-02 | 2008-09-05 | 주식회사 로직플랜트 | Display control method, mobile terminal of using the same and recording medium thereof |
GB0801396D0 (en) * | 2008-01-25 | 2008-03-05 | Bisutti Giovanni | Electronic apparatus |
US20110096036A1 (en) * | 2009-10-23 | 2011-04-28 | Mcintosh Jason | Method and device for an acoustic sensor switch |
KR101678549B1 (en) * | 2010-02-02 | 2016-11-23 | 삼성전자주식회사 | Method and apparatus for providing user interface using surface acoustic signal, and device with the user interface |
JP5593851B2 (en) * | 2010-06-01 | 2014-09-24 | ソニー株式会社 | Audio signal processing apparatus, audio signal processing method, and program |
US9148501B2 (en) * | 2012-07-02 | 2015-09-29 | Talkler Labs, LLC | Systems and methods for hands-off control of a mobile communication device |
GB2523137A (en) * | 2014-02-13 | 2015-08-19 | Charles Edmund King | Acoustic tracking means |
CN104898962B (en) * | 2014-03-03 | 2020-04-24 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN109657739B (en) * | 2019-01-09 | 2023-03-24 | 西北大学 | Handwritten letter identification method based on high-frequency sound wave short-time Fourier transform |
US10969873B2 (en) * | 2019-04-12 | 2021-04-06 | Dell Products L P | Detecting vibrations generated by a swipe gesture |
CN114371796B (en) * | 2022-01-10 | 2024-06-04 | 深聪半导体(江苏)有限公司 | Method, device and storage medium for identifying touch position |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4853494A (en) * | 1987-03-20 | 1989-08-01 | Canon Kabushiki Kaisha | Information processing apparatus for inputting coordinate data selectively from either the obverse or the reverse surface of an input tablet |
US5451723A (en) * | 1993-10-18 | 1995-09-19 | Carroll Touch, Inc. | Acoustic wave touch panel for use with a non-active stylus |
DE19508320A1 (en) * | 1995-03-09 | 1996-09-12 | Jan Nieberle | Pen-input interface for personal computer |
US5625354A (en) * | 1996-01-26 | 1997-04-29 | Lerman; Samuel I. | Compact stylus keyboard |
EP0924915A1 (en) * | 1997-11-27 | 1999-06-23 | Nokia Mobile Phones Ltd. | Wireless communication device and a method in manufacturing a wireless communication device |
WO1999050818A1 (en) * | 1998-04-01 | 1999-10-07 | New York University | A method and apparatus for writing |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3909785A (en) * | 1973-11-12 | 1975-09-30 | Amperex Electronic Corp | Apparatus for recognizing hand printed characters |
GB9928682D0 (en) * | 1999-12-06 | 2000-02-02 | Electrotextiles Comp Ltd | Input apparatus and a method of generating control signals |
2000
- 2000-12-01 DE DE60015383T patent/DE60015383D1/en not_active Expired - Lifetime
- 2000-12-01 CN CNB00816911XA patent/CN1188775C/en not_active Expired - Fee Related
- 2000-12-01 EP EP00982044A patent/EP1236076B1/en not_active Expired - Lifetime
- 2000-12-01 JP JP2001544097A patent/JP2003516576A/en not_active Withdrawn
- 2000-12-01 WO PCT/SE2000/002403 patent/WO2001042871A1/en active IP Right Grant
- 2000-12-01 AU AU19120/01A patent/AU1912001A/en not_active Abandoned
- 2000-12-01 AT AT00982044T patent/ATE280971T1/en not_active IP Right Cessation
- 2000-12-07 US US09/732,206 patent/US20010003452A1/en not_active Abandoned
Non-Patent Citations (2)
Title |
---|
MAYOL-CUEVAS W.W.: "A first approach to tactile texture recognition", SYSTEMS, MAN AND CYBERNETICS, 1998 IEEE INTERNATIONAL CONFERENCE ON, vol. 5, 11 October 1998 (1998-10-11) - 14 October 1998 (1998-10-14), pages 4246 - 4250, XP000888071 * |
PERLIN K.: "Quikwriting: Continuous stylus-based text entry", UIST '98 CONFERENCE, November 1998 (1998-11-01), XP002951420, Retrieved from the Internet <URL:http://www.mrl.nyu.edu/perlin/demos/quikwriting.html> [retrieved on 20000713] * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002103620A2 (en) * | 2001-06-14 | 2002-12-27 | Koninklijke Philips Electronics N.V. | Data input tablet |
WO2002103620A3 (en) * | 2001-06-14 | 2003-05-15 | Koninkl Philips Electronics Nv | Data input tablet |
US7119796B2 (en) | 2001-06-14 | 2006-10-10 | Koninklijke Philips Electronics N. V. | Data input system |
EP1513049A2 (en) * | 2003-07-08 | 2005-03-09 | NTT DoCoMo, Inc. | Input key and input apparatus |
EP1513049A3 (en) * | 2003-07-08 | 2009-04-22 | NTT DoCoMo, Inc. | Input key and input apparatus |
US7966084B2 (en) | 2005-03-07 | 2011-06-21 | Sony Ericsson Mobile Communications Ab | Communication terminals with a tap determination circuit |
WO2008047294A2 (en) * | 2006-10-18 | 2008-04-24 | Koninklijke Philips Electronics N.V. | Electronic system control using surface interaction |
WO2008047294A3 (en) * | 2006-10-18 | 2008-06-26 | Koninkl Philips Electronics Nv | Electronic system control using surface interaction |
US10254953B2 (en) | 2013-01-21 | 2019-04-09 | Keypoint Technologies India Pvt. Ltd. | Text input method using continuous trace across two or more clusters of candidate words to select two or more words to form a sequence, wherein the candidate words are arranged based on selection probabilities |
US10474355B2 (en) | 2013-01-21 | 2019-11-12 | Keypoint Technologies India Pvt. Ltd. | Input pattern detection over virtual keyboard for candidate word identification |
Also Published As
Publication number | Publication date |
---|---|
EP1236076A1 (en) | 2002-09-04 |
JP2003516576A (en) | 2003-05-13 |
DE60015383D1 (en) | 2004-12-02 |
CN1188775C (en) | 2005-02-09 |
CN1409839A (en) | 2003-04-09 |
ATE280971T1 (en) | 2004-11-15 |
EP1236076B1 (en) | 2004-10-27 |
US20010003452A1 (en) | 2001-06-14 |
AU1912001A (en) | 2001-06-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1236076B1 (en) | A portable communication device and method | |
US8514186B2 (en) | Handheld electronic device and operation method thereof | |
KR100856203B1 (en) | User inputting apparatus and method using finger mark recognition sensor | |
US7444163B2 (en) | Mobile digital devices | |
KR100949581B1 (en) | Apparatus and method for inputting character and numeral on communication device | |
US7023428B2 (en) | Using touchscreen by pointing means | |
US7505798B2 (en) | Angular keyboard for a handheld mobile communication device | |
EP1271295A2 (en) | Method and device for implementing a function | |
US20060033723A1 (en) | Virtual keypad input device | |
US7646378B2 (en) | System and method for user interface | |
US9104247B2 (en) | Virtual keypad input device | |
JP2004213269A (en) | Character input device | |
EP2613234A1 (en) | User interface, device and method for a physically flexible device | |
US20080166049A1 (en) | Apparatus and Method for Handwriting Recognition | |
EP2073508A1 (en) | A portable electronic apparatus, and a method of controlling a user interface thereof | |
US20040239624A1 (en) | Freehand symbolic input apparatus and method | |
EP1599787A1 (en) | Unambiguous text input method for touch screens and reduced keyboard systems | |
WO2000072300A1 (en) | Data entry device recording input in two dimensions | |
US20060227100A1 (en) | Mobile communication terminal and method | |
US7023426B1 (en) | User input device | |
KR20090049153A (en) | Terminal with touchscreen and method for inputting letter | |
JP2001333166A (en) | Character entry device and character entry method | |
US20100026625A1 (en) | Character input device | |
JP2014167712A (en) | Information processing device, information processing method, and program | |
KR20030067729A (en) | Stylus computer |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WR | Later publication of a revised version of an international search report | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2000982044 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 00816911X Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref country code: JP Ref document number: 2001 544097 Kind code of ref document: A Format of ref document f/p: F |
|
WWP | Wipo information: published in national office |
Ref document number: 2000982044 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
WWG | Wipo information: grant in national office |
Ref document number: 2000982044 Country of ref document: EP |