US20120218194A1 - Virtual keyboard feedback - Google Patents

Virtual keyboard feedback

Info

Publication number
US20120218194A1
Authority
US
United States
Prior art keywords
tone
virtual key
user
peripheral
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/037,124
Inventor
Richard Ian Silverman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US13/037,124
Publication of US20120218194A1
Legal status: Abandoned (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Abstract

Feedback is provided for a virtual keyboard. A virtual key is displayed by a touch-sensitive display. The virtual key has a center zone and a plurality of peripheral zones surrounding the center zone. A first tone is sounded when the virtual key is touched within the center zone. A second tone is sounded when the virtual key is touched in a first peripheral zone. A third tone is sounded when the virtual key is touched in a second peripheral zone. The first tone is different than the second tone. The first tone is different than the third tone.

Description

    BACKGROUND
  • Virtual keyboards can be created on touch sensitive displays by displaying images of physical keys on a flat-screen surface. The virtual keyboards can be configured, for example, in a variety of standardized keyboards (QWERTY, Numeric Keypad) or custom keyboards for various custom applications. The touch-sensitive display detects when an image of a physical key is touched, allowing the input of data based on touch location.
  • SUMMARY
  • In accordance with embodiments of the present invention, feedback is provided for a virtual keyboard. A virtual key is displayed by a touch-sensitive display. The virtual key has a center zone and a plurality of peripheral zones surrounding the center zone. A first tone is sounded when the virtual key is touched within the center zone. A second tone is sounded when the virtual key is touched in a first peripheral zone. A third tone is sounded when the virtual key is touched in a second peripheral zone. The first tone is different than the second tone. The first tone is different than the third tone.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a simplified block diagram of a computing system with a touch sensitive display in accordance with an embodiment of the invention.
  • FIG. 2 shows a simplified virtual keyboard displayed on the touch sensitive display shown in FIG. 1 in accordance with an embodiment of the invention.
  • FIG. 3 is a simplified illustration used to describe how touch of a virtual key on the touch sensitive display shown in FIG. 1 is processed in accordance with an embodiment of the invention.
  • FIG. 4 is a flow diagram illustrating operation of the virtual keyboard shown in FIG. 2 in accordance with an embodiment of the invention.
  • FIG. 5 is an example of a simplified user interface that can be used to vary parameters of virtual keys within a virtual keyboard in accordance with an embodiment of the invention.
  • FIG. 6 is a flow diagram illustrating the process by which the parameters of a virtual key are varied in accordance with an embodiment of the invention.
  • DESCRIPTION OF THE EMBODIMENT
  • FIG. 1 is a simplified block diagram of a computing system 10 with a touch sensitive display 11. A touch controller 12 and a display controller 13 provide an interface between touch sensitive display 11 and a processor 16.
  • An audio device 14 provides sound feedback to a user using a virtual keyboard displayed on touch sensitive display 11. An audio controller 15 provides an interface between audio device 14 and processor 16. The virtual keyboard is implemented using application module 18 running within an operating system 17. A virtual key control module 19 provides control for the virtual keyboard. A parameter adjusting module 20 allows a user to adjust parameters of the virtual keyboard.
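  • Read as software, the preceding paragraphs describe an application module that owns a key-control module and a parameter-adjusting module and reaches the audio device through a controller. The following minimal sketch is only an assumed decomposition to make those relationships concrete; the class names are illustrative and do not come from the patent.

```python
class AudioController:
    """Stand-in for audio controller 15: forwards tones to audio device 14."""
    def play(self, tone) -> None:
        print(f"audio device 14 sounds: {tone}")   # placeholder for real audio output


class VirtualKeyControlModule:
    """Stand-in for virtual key control module 19: routes touches to key values and tones."""
    def __init__(self, audio: AudioController) -> None:
        self.audio = audio


class ParameterAdjustingModule:
    """Stand-in for parameter adjusting module 20: holds the user-tunable settings."""
    def __init__(self) -> None:
        self.settings: dict = {}


class VirtualKeyboardApplication:
    """Stand-in for application module 18 running within operating system 17."""
    def __init__(self) -> None:
        self.audio = AudioController()
        self.key_control = VirtualKeyControlModule(self.audio)
        self.parameters = ParameterAdjustingModule()
```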
  • FIG. 2 shows a simplified virtual keyboard 22 displayed on touch sensitive display 11. Virtual keyboard 22 is composed of virtual keys that can be arranged, for example, in a variety of keyboard patterns such as for a QWERTY keyboard, a numeric keypad, or a custom key arrangement. When touch sensitive display 11 senses that a user has touched a virtual key displayed on touch sensitive display 11, the value for the key is input into computing system 10.
  • One disadvantage that virtual keyboards can have compared with physical keyboards is the lack of physical feedback. Physical keyboards give tactile feedback that indicates to a user the location of the user's fingers on the physical keyboard. Because virtual keyboards are typically implemented on flat screen surfaces, it is impractical to provide tactile feedback indicating the location of keys on the flat screen surface.
  • In various embodiments of the present invention, audio feedback is provided to a user to aid the user in correctly positioning fingers on virtual keyboard 22. The audio feedback can be configured so that a user will receive sufficient feedback to allow correct positioning of fingers and so that the user will not receive audio feedback that is too complex or otherwise unhelpful when the user, for example, is typing many words per minute or otherwise pressing multiple keys in a single second.
  • FIG. 3 is used to illustrate how touch location within a virtual key 46 from virtual keyboard 22 is processed to provide audio feedback to a user to aid the user in correctly positioning fingers on virtual keyboard 22.
  • In accordance with embodiments of the present invention, when a user touches a location within virtual key 46, the user will receive feedback based on the location of the touch within virtual key 46. For example, a touch detected within a center region 40 of virtual key 46 will result in a first assigned audio tone being sounded by audio device 14. A touch detected within one of peripheral regions 41, 42, 43, 44, and 45 will result in an assigned audio tone other than the first audio tone being sounded by audio device 14. For example, the audio tone sounded by audio device 14 differs depending upon which region is the center of a detected touch. The tone can vary from region to region based, for example, on pitch, volume, brightness or duration. The tone can also comprise a harmony of two or more notes, and different harmonies or dissonance between the multiple notes can be sounded by audio device 14 to provide feedback to a user as to which region is the center of a detected touch.
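  • The region-dependent feedback described above can be pictured as a lookup from region identifier to tone attributes. The following sketch is a hypothetical illustration in Python, not the patent's implementation: the `Tone` fields and the specific pitch, volume, and duration values are assumptions chosen to mirror the variations (pitch, volume, duration, multi-note harmony or dissonance) named in the text.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Tone:
    """Audio feedback attributes for one region of a virtual key."""
    pitches_hz: List[float]          # one pitch, or several for a harmony/dissonance
    volume: float = 1.0              # relative loudness, 0.0 to 1.0
    duration_ms: int = 40            # how long the tone sounds

# Hypothetical assignment: center region 40 gets a single plain tone, while
# peripheral regions 41-45 get tones that differ in pitch, volume, duration,
# or number of notes so the user can tell the regions apart by ear.
REGION_TONES = {
    40: Tone(pitches_hz=[440.0]),                      # center: plain A4
    41: Tone(pitches_hz=[523.3], duration_ms=60),      # e.g. one edge: higher, longer
    42: Tone(pitches_hz=[392.0], volume=0.7),          # e.g. another edge: lower, quieter
    43: Tone(pitches_hz=[440.0, 466.2]),               # e.g. dissonant two-note pair
    44: Tone(pitches_hz=[440.0, 554.4]),               # e.g. consonant two-note pair
    45: Tone(pitches_hz=[330.0], duration_ms=20),      # e.g. short low blip
}

def tone_for_region(region_id: int) -> Tone:
    """Return the tone assigned to the region at the center of the touch."""
    return REGION_TONES[region_id]
```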
  • Typically, center region 40 will cover more than half the area of virtual key 46. As long as a user touches virtual keys within their center regions, the first assigned audio tone will continue to give feedback indicating that the user's fingers are properly located over the virtual keyboard. When the user touches a virtual key outside its center region, the alternate audio tone(s) will indicate that the user's finger has touched the virtual key near its outside edge, signaling that the user needs to correct the positioning of the finger with respect to the virtual key.
  • In the example shown in FIG. 3, there are five regions surrounding center region 40. Each of peripheral regions 41 through 45 can be configured to have a distinct tone that differs from those of the other regions. For example, the tone varies based on one or more of pitch, volume, duration, harmony or dissonance. The differing tones warn the user that the user has touched a virtual key with a touch centering outside center region 40. The particular variation in tone can indicate to the user whether the touch is in a peripheral region that is left, right, above or below center region 40. This feedback will encourage the user to reposition the hands so that in the future virtual keys will be pressed with touches inside center region 40. While FIG. 3 shows five peripheral regions 41 through 45 surrounding center region 40, various embodiments of the invention allow a user to select the number of peripheral regions. For example, a user may find it most helpful to select just two peripheral regions or four peripheral regions to surround center region 40.
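  • The parameters later described for FIG. 5 (a starting radius, a radial width, a starting location around the circumference, and a number of zones) suggest circular geometry: a central disc surrounded by a ring divided into angular sectors. Under that assumption, classifying a touch into the center region or a numbered peripheral sector could look like the sketch below; the function and parameter names are illustrative only and do not come from the patent.

```python
import math

def classify_touch(dx: float, dy: float,
                   inner_radius: float,
                   outer_radius: float,
                   num_peripheral: int,
                   start_angle_deg: float = 0.0) -> int:
    """
    Classify a touch at offset (dx, dy) from the key's center.

    Returns 0 for the center region, 1..num_peripheral for a peripheral
    sector, or -1 if the touch falls outside the key entirely.
    """
    r = math.hypot(dx, dy)
    if r <= inner_radius:
        return 0                       # inside the center region
    if r > outer_radius:
        return -1                      # outside the virtual key
    # Angle measured from the configurable starting location on the circumference.
    angle = (math.degrees(math.atan2(dy, dx)) - start_angle_deg) % 360.0
    sector = int(angle // (360.0 / num_peripheral))
    return sector + 1                  # peripheral sectors numbered 1..num_peripheral

# Example: four peripheral zones, center disc of radius 20 px, key radius 30 px.
zone = classify_touch(dx=24.0, dy=3.0, inner_radius=20.0,
                      outer_radius=30.0, num_peripheral=4)   # -> 1
```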
  • In one embodiment, when a user's finger touches both center region 40 and a peripheral region, touch sensitive display 11 can determine in which region the touch is detected, and audio device 14 can emit the tone only for the region in which the touch is detected. Alternatively, when a user's finger touches both center region 40 and a peripheral region, audio device 14 can emit the tones for all the regions that are touched.
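  • The two alternatives in the preceding paragraph, sounding only the tone for the region containing the touch versus sounding the tones of every touched region, amount to a small policy choice. The minimal sketch below illustrates both; the `play_all` flag is an assumed configuration option, not something named in the patent.

```python
from typing import Dict, Iterable, List

def tones_to_play(region_tones: Dict[int, object],
                  center_region: int,
                  touched_regions: Iterable[int],
                  play_all: bool = False) -> List[object]:
    """
    Select which tones to sound when a touch overlaps several regions.

    With play_all=False only the tone of the region containing the touch is
    returned; with play_all=True the tones of every touched region are returned.
    """
    if play_all:
        return [region_tones[r] for r in touched_regions]
    return [region_tones[center_region]]
```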
  • FIG. 4 is a flow diagram illustrating virtual key control module 19 controlling operation of virtual keyboard 22, shown in FIG. 2. In a block 31, touch sensitive display 11 detects a touch. A determination is made as to whether the touch was made within a region of a virtual key within virtual keyboard 22. If not, nothing is done. Depending upon how the keyboard is implemented, the determination of the location of the touch can be made upon the initial touch on the display or upon the release of the touch.
  • If the touch was made on a virtual key within virtual keyboard 22, in a block 32, the touch location inside the virtual key is identified. In a block 33, the particular virtual key touched is identified and the identification of the touched virtual key is passed to application module 18 for application processing.
  • In a block 34, a determination is made as to whether the touch was detected in one of the peripheral regions of the virtual key (e.g., one of peripheral regions 41 through 45 of virtual key 46). If not, in a block 35, the tone for the center region of the virtual key (e.g., center region 40 of virtual key 46 shown in FIG. 3) is accessed and stored for play by audio device 14. In a block 36, the stored tone is played by audio device 14, shown in FIG. 1.
  • If in block 34 the touch was detected in one of the peripheral regions of the virtual key (e.g., one of peripheral regions 41 through 45 of virtual key 46), in a block 37, the particular peripheral region receiving the touch is identified. In a block 38, the tone for the particular peripheral region receiving the touch is accessed and stored for play by audio device 14. In a block 36, the stored tone is played.
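  • One way to read the FIG. 4 flow is as a single touch handler: block 31 checks whether the touch landed on a virtual key, block 33 reports the key to application module 18, and blocks 34 through 38 select and play the tone for whichever region received the touch. The sketch below is an illustrative rendering of that flow under stated assumptions; the callback names and the tone-lookup structure are not the patent's API.

```python
from typing import Callable, Dict, Optional, Tuple

def handle_touch(touch_xy: Tuple[float, float],
                 find_key: Callable[[Tuple[float, float]], Optional[str]],
                 find_region: Callable[[str, Tuple[float, float]], int],
                 region_tones: Dict[Tuple[str, int], object],
                 notify_application: Callable[[str], None],
                 play: Callable[[object], None]) -> None:
    """Mirror of the FIG. 4 flow: blocks 31-33 identify the key, 34-38 pick the tone."""
    key = find_key(touch_xy)              # block 31: is the touch on a virtual key?
    if key is None:
        return                            # not on the keyboard: do nothing
    notify_application(key)               # block 33: pass the key to application module 18
    region = find_region(key, touch_xy)   # blocks 32/34/37: locate the touch within the key
    tone = region_tones[(key, region)]    # blocks 35/38: access the tone for that region
    play(tone)                            # block 36: sound it through the audio device
```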
  • FIG. 5 provides an example of a user interface 60 that can be used to vary parameters for virtual keys within virtual keyboard 22. In an area 62, a user can select a parameter to vary by selecting a corresponding radio button. For example, the following parameters are listed in area 62: set number of zones, set tone pitch in any zone, set zone starting location around circumference, set outer zone starting radius, set outer zone width, and set outer zone shape. In other embodiments, fewer or a greater number of parameters can be listed. For example, a parameter could be included indicating whether a tone with multiple pitches would be sounded when more than one region is touched in a virtual key. Tone duration, harmony and dissonance can also be included as variable parameters in area 62.
  • A visual display 61 of a virtual key is provided to give a user feedback on the results of parameter changes. For example, when a user varies the number of zones, the selected number of zones is shown in visual display 61. In the figure, three peripheral zones 65, 66 and 67 are currently shown. When the user sets a different number of zones, the number selected by the user is reflected by changes to visual display 61.
  • Likewise, the zone starting location 64 shown in visual display 61 is varied based on a parameter value selected by the user. The outer zone starting radius 69 shown in visual display 61 is varied based on a parameter value selected by the user. The outer zone width, determined by the location of an outer zone outer radius 68 shown in visual display 61, is varied based on a parameter value selected by the user. The shape of the peripheral zones can be varied by the user by selecting different parameter values.
  • A selector 63 is used to vary a particular parameter selected in area 62. For example, in area 62, the user has selected to vary the tone in a zone as indicated by the corresponding darkened radio button. To select a particular zone, the user can touch the particular zone on visual display 61. Then the user can use selector 63 to vary the parameter. In the example shown in FIG. 5, selector 63 is a slider. The location of the slider varies the pitch of the tone for the selected zone. Selector 63 can be implemented using something other than a slider. For example, instead of a slider, a user can select a discrete value from a list, and so on.
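  • The parameters listed for area 62 can be modeled as a small settings record that selector 63 writes into. The sketch below assumes pitch is stored per zone and mirrors the geometric parameters named in FIG. 5; the field names and value ranges are illustrative, not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class KeyFeedbackSettings:
    """User-adjustable parameters for one virtual key (cf. area 62 of FIG. 5)."""
    num_peripheral_zones: int = 3
    zone_pitch_hz: Dict[int, float] = field(default_factory=lambda: {0: 440.0})
    zone_start_angle_deg: float = 0.0      # starting location around the circumference
    outer_zone_start_radius: float = 20.0  # where the peripheral ring begins
    outer_zone_width: float = 10.0         # radial width of the peripheral ring
    outer_zone_shape: str = "arc"          # e.g. "arc" or "rectangular"

def set_pitch_from_slider(settings: KeyFeedbackSettings,
                          zone: int, slider_pos: float,
                          lo_hz: float = 220.0, hi_hz: float = 880.0) -> None:
    """Map a 0.0-1.0 slider position (selector 63) to a pitch for the selected zone."""
    settings.zone_pitch_hz[zone] = lo_hz + slider_pos * (hi_hz - lo_hz)
```

  • Under these assumed bounds, a slider position of 0.5 would store a pitch of 550 Hz for the selected zone.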
  • FIG. 6 is a flowchart illustrating how parameter adjusting module 20 adjusts parameters in response to user input from user interface 60 shown in FIG. 5.
  • In a block 71, touch sensitive display 11 detects a touch. A determination is made as to whether the touch was made on selector 63. If not, nothing is done.
  • If the touch was made on selector 63, in a block 72, if the touch is removed with no new value being selected, control returns back to block 71. Block 73 is entered once a new selection is made, for example by moving touch location on a slider. In a block 74, a determination is made as to which parameter has been selected in area 62 of user interface 60, shown in FIG. 5.
  • If in area 62 the radio button for set number of peripheral zones is selected, in a block 81, a new value for the number of peripheral zones, selected in block 72, is stored.
  • If in area 62 the radio button for setting the pitch of zones is selected, in a block 82, a new value for the pitch of zones, selected in block 72, is stored.
  • If in area 62 the radio button for setting zone starting location around circumference is selected, in a block 83, a new value for the starting location around the circumference, selected in block 72, is stored.
  • If in area 62 the radio button for setting the outer zone starting location is selected, in a block 84, a new value for the outer zone starting location, selected in block 72, is stored.
  • If in area 62 the radio button for setting the outer zone width is selected, in a block 85, a new value for the outer zone width, selected in block 72, is stored.
  • If in area 62 the radio button for setting the outer zone shape is selected, in a block 86, a new value for the outer zone shape, selected in block 72, is stored.
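  • Blocks 81 through 86 amount to a dispatch on which radio button is selected in area 62: the new value chosen with selector 63 is stored under the corresponding parameter. The self-contained sketch below shows one way that dispatch could look; the parameter keys are assumed names mirroring the options listed for area 62, not identifiers from the patent.

```python
def store_parameter(settings: dict, selected_parameter: str, new_value) -> None:
    """
    FIG. 6, blocks 81-86: store the value chosen with selector 63 under the
    parameter whose radio button is selected in area 62.
    """
    allowed = {
        "num_peripheral_zones",     # block 81: number of peripheral zones
        "zone_pitch_hz",            # block 82: pitch of a zone
        "zone_start_angle_deg",     # block 83: starting location around circumference
        "outer_zone_start_radius",  # block 84: outer zone starting location
        "outer_zone_width",         # block 85: outer zone width
        "outer_zone_shape",         # block 86: outer zone shape
    }
    if selected_parameter not in allowed:
        raise ValueError(f"unknown parameter: {selected_parameter}")
    settings[selected_parameter] = new_value

# Example: the user moved the slider after selecting "set outer zone width".
key_settings = {"num_peripheral_zones": 3, "outer_zone_width": 10.0}
store_parameter(key_settings, "outer_zone_width", 14.0)
```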
  • The foregoing discussion discloses and describes merely exemplary methods and embodiments. As will be understood by those familiar with the art, the disclosed subject matter may be embodied in other specific forms without departing from the spirit or characteristics thereof. Accordingly, the present disclosure is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.

Claims (20)

1. A method for providing feedback for a virtual keyboard, the method comprising:
displaying a virtual key by a touch-sensitive display, the virtual key having a center zone and a plurality of peripheral zones surrounding the center zone;
sounding a first tone when the virtual key is touched within the center zone;
sounding a second tone when the virtual key is touched in a first peripheral zone; and,
sounding a third tone when the virtual key is touched in a second peripheral zone;
wherein the first tone is different than the second tone; and,
wherein the first tone is different than the third tone.
2. A method as in claim 1 wherein the first tone has a different pitch than the second tone.
3. A method as in claim 1 wherein the first tone has a different duration than the second tone and the third tone.
4. A method as in claim 1 wherein the first tone has a different volume than the second tone and the third tone.
5. A method as in claim 1, additionally wherein when the virtual key is touched within both the center zone and the first peripheral zone, both the first tone and the second tone are sounded.
6. A method as in claim 1, additionally comprising:
changing the number of peripheral zones per virtual key in response to user selections from a user interface.
7. A method as in claim 1, additionally comprising:
changing individual characteristics of each of the first tone, the second tone and the third tone in response to user selections from a user interface.
8. A method as in claim 7, wherein characteristics of the first tone include at least one of a pitch of the first tone, a duration of the first tone, a volume of the first tone, and a brightness of the first tone.
9. A method as in claim 1, additionally comprising:
changing characteristics of the plurality of peripheral zones in response to user selections from a user interface.
10. A method as in claim 9, wherein characteristics of the plurality of peripheral zones include at least one of: a number of peripheral zones; a circumferential starting location of peripheral zones; a starting radius for peripheral zones; a radial width of peripheral zones; and a shape of peripheral zones.
11. A computing device comprising:
a touch-sensitive display;
a virtual keyboard displayed on the touch-sensitive display, the virtual keyboard including:
a plurality of virtual keys, each virtual key having a center zone and a plurality of peripheral zones surrounding the center zone; and,
an audio device;
wherein the audio device sounds a first tone when a virtual key in the plurality of virtual keys is touched within the center zone;
wherein the audio device sounds a second tone when a virtual key in the plurality of virtual keys is touched within a first peripheral zone;
wherein the audio device sounds a third tone when a virtual key in the plurality of virtual keys is touched within a second peripheral zone;
wherein the first tone is different than the second tone; and,
wherein the first tone is different than the third tone.
12. A computing device as in claim 11 wherein the first tone has a different pitch than the second tone and the third tone.
13. A computing device as in claim 11, additionally wherein when the virtual key is touched within both the center zone and the first peripheral zone, the audio device sounds both the first tone and the second tone.
14. A computing device as in claim 11, additionally comprising:
a user interface that allows a user to change the number of peripheral zones per virtual key.
15. A computing device as in claim 11, additionally comprising:
a user interface that allows a user to change individual characteristics of each of the first tone, the second tone and the third tone.
16. A computing device as in claim 11, additionally comprising:
a user interface that allows a user to change individual characteristics of each of the plurality of peripheral zones in response to user selections from a user interface.
17. A computing device as in claim 16, wherein characteristics of the plurality of zones include at least one of: a number of peripheral zones; a circumferential starting location of peripheral zones; a starting radius for peripheral zones; a radial width of peripheral zones; and a shape of peripheral zones.
18. A device for receiving user input, the device comprising:
a touch-sensitive display;
a virtual key displayed on the touch-sensitive display, the virtual key having a center zone and at least one peripheral zone surrounding the center zone;
a user interface that allows a user to change size of the at least one peripheral zone in response to user selections from a user interface; and,
an audio device;
wherein the audio device sounds a first tone when the virtual key is touched within the center zone;
wherein the audio device sounds a second tone when the virtual key is touched within a first peripheral zone; and
wherein the first tone is different than the second tone.
19. A device as in claim 18, wherein the user interface additionally allows a user to change individual characteristics of each of the first tone and the second tone.
20. A device as in claim 18, wherein the user interface additionally allows a user to change other characteristics of the at least one peripheral zone in addition to changing size.
US13/037,124 2011-02-28 2011-02-28 Virtual keyboard feedback Abandoned US20120218194A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/037,124 US20120218194A1 (en) 2011-02-28 2011-02-28 Virtual keyboard feedback

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/037,124 US20120218194A1 (en) 2011-02-28 2011-02-28 Virtual keyboard feedback

Publications (1)

Publication Number Publication Date
US20120218194A1 (en) 2012-08-30

Family

ID=46718645

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/037,124 Abandoned US20120218194A1 (en) 2011-02-28 2011-02-28 Virtual keyboard feedback

Country Status (1)

Country Link
US (1) US20120218194A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6169538B1 (en) * 1998-08-13 2001-01-02 Motorola, Inc. Method and apparatus for implementing a graphical user interface keyboard and a text buffer on electronic devices
US6404442B1 (en) * 1999-03-25 2002-06-11 International Business Machines Corporation Image finding enablement with projected audio
US20080273018A1 (en) * 2004-04-23 2008-11-06 Richard Woolley Method for scrolling and edge motion on a touchpad
US20080150905A1 (en) * 2006-12-21 2008-06-26 Grivna Edward L Feedback mechanism for user detection of reference location on a sensing device
US20110316793A1 (en) * 2010-06-28 2011-12-29 Digitar World Inc. System and computer program for virtual musical instruments

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130104039A1 (en) * 2011-10-21 2013-04-25 Sony Ericsson Mobile Communications Ab System and Method for Operating a User Interface on an Electronic Device
US9852855B2 (en) 2012-03-02 2017-12-26 Microsoft Technology Licensing, Llc Pressure sensitive key normalization
US10013030B2 (en) 2012-03-02 2018-07-03 Microsoft Technology Licensing, Llc Multiple position input device cover
USRE48963E1 (en) 2012-03-02 2022-03-08 Microsoft Technology Licensing, Llc Connection device for computing devices
US9304949B2 (en) 2012-03-02 2016-04-05 Microsoft Technology Licensing, Llc Sensing user input at display area edge
US9047207B2 (en) 2012-03-02 2015-06-02 Microsoft Technology Licensing, Llc Mobile device power state
US9064654B2 (en) 2012-03-02 2015-06-23 Microsoft Technology Licensing, Llc Method of manufacturing an input device
US9075566B2 (en) 2012-03-02 2015-07-07 Microsoft Technoogy Licensing, LLC Flexible hinge spine
US9098117B2 (en) 2012-03-02 2015-08-04 Microsoft Technology Licensing, Llc Classifying the intent of user input
US9116550B2 (en) 2012-03-02 2015-08-25 Microsoft Technology Licensing, Llc Device kickstand
US9134808B2 (en) 2012-03-02 2015-09-15 Microsoft Technology Licensing, Llc Device kickstand
US9134807B2 (en) 2012-03-02 2015-09-15 Microsoft Technology Licensing, Llc Pressure sensitive key normalization
US9146620B2 (en) 2012-03-02 2015-09-29 Microsoft Technology Licensing, Llc Input device assembly
US9158384B2 (en) 2012-03-02 2015-10-13 Microsoft Technology Licensing, Llc Flexible hinge protrusion attachment
US9158383B2 (en) 2012-03-02 2015-10-13 Microsoft Technology Licensing, Llc Force concentrator
US9176901B2 (en) 2012-03-02 2015-11-03 Microsoft Technology Licensing, Llc Flux fountain
US9176900B2 (en) 2012-03-02 2015-11-03 Microsoft Technology Licensing, Llc Flexible hinge and removable attachment
US9946307B2 (en) 2012-03-02 2018-04-17 Microsoft Technology Licensing, Llc Classifying the intent of user input
US9268373B2 (en) 2012-03-02 2016-02-23 Microsoft Technology Licensing, Llc Flexible hinge spine
US9275809B2 (en) 2012-03-02 2016-03-01 Microsoft Technology Licensing, Llc Device camera angle
US9298236B2 (en) 2012-03-02 2016-03-29 Microsoft Technology Licensing, Llc Multi-stage power adapter configured to provide a first power level upon initial connection of the power adapter to the host device and a second power level thereafter upon notification from the host device to the power adapter
US9304948B2 (en) 2012-03-02 2016-04-05 Microsoft Technology Licensing, Llc Sensing user input at display area edge
US9904327B2 (en) 2012-03-02 2018-02-27 Microsoft Technology Licensing, Llc Flexible hinge and removable attachment
US10963087B2 (en) 2012-03-02 2021-03-30 Microsoft Technology Licensing, Llc Pressure sensitive keys
US8947864B2 (en) 2012-03-02 2015-02-03 Microsoft Corporation Flexible hinge and removable attachment
US9426905B2 (en) 2012-03-02 2016-08-23 Microsoft Technology Licensing, Llc Connection device for computing devices
US9360893B2 (en) 2012-03-02 2016-06-07 Microsoft Technology Licensing, Llc Input device writing surface
US9411751B2 (en) 2012-03-02 2016-08-09 Microsoft Technology Licensing, Llc Key formation
US9870066B2 (en) 2012-03-02 2018-01-16 Microsoft Technology Licensing, Llc Method of manufacturing an input device
US9460029B2 (en) 2012-03-02 2016-10-04 Microsoft Technology Licensing, Llc Pressure sensitive keys
US9465412B2 (en) 2012-03-02 2016-10-11 Microsoft Technology Licensing, Llc Input device layers and nesting
US9619071B2 (en) 2012-03-02 2017-04-11 Microsoft Technology Licensing, Llc Computing device and an apparatus having sensors configured for measuring spatial information indicative of a position of the computing devices
US9618977B2 (en) 2012-03-02 2017-04-11 Microsoft Technology Licensing, Llc Input device securing techniques
US9678542B2 (en) 2012-03-02 2017-06-13 Microsoft Technology Licensing, Llc Multiple position input device cover
US9710093B2 (en) 2012-03-02 2017-07-18 Microsoft Technology Licensing, Llc Pressure sensitive key normalization
US9766663B2 (en) 2012-03-02 2017-09-19 Microsoft Technology Licensing, Llc Hinge for component attachment
US20130300590A1 (en) * 2012-05-14 2013-11-14 Paul Henry Dietz Audio Feedback
US9348605B2 (en) 2012-05-14 2016-05-24 Microsoft Technology Licensing, Llc System and method for accessory device architecture that passes human interface device (HID) data via intermediate processor
US9959241B2 (en) 2012-05-14 2018-05-01 Microsoft Technology Licensing, Llc System and method for accessory device architecture that passes via intermediate processor a descriptor when processing in a low power state
US8949477B2 (en) 2012-05-14 2015-02-03 Microsoft Technology Licensing, Llc Accessory device architecture
US20150317075A1 (en) * 2012-05-31 2015-11-05 Peiluo Sun Method and device for providing virtual input keyboard
US10031556B2 (en) 2012-06-08 2018-07-24 Microsoft Technology Licensing, Llc User experience adaptation
US10107994B2 (en) 2012-06-12 2018-10-23 Microsoft Technology Licensing, Llc Wide field-of-view virtual image projector
US10452832B2 (en) * 2012-07-12 2019-10-22 International Business Machines Corporation Aural cuing pattern based mobile device security
US8952892B2 (en) 2012-11-01 2015-02-10 Microsoft Corporation Input location correction tables for input panels
US9304549B2 (en) 2013-03-28 2016-04-05 Microsoft Technology Licensing, Llc Hinge mechanism for rotatable component attachment
WO2016076759A1 (en) * 2014-11-10 2016-05-19 Арташес Валерьевич ИКОНОМОВ Mobile telephone with remote control function
CN107885393A (en) * 2016-09-30 2018-04-06 陈国仁 The two-dimensional location method of virtual input device

Similar Documents

Publication Publication Date Title
US20120218194A1 (en) Virtual keyboard feedback
JP5782133B2 (en) Dynamic placement on-screen keyboard
US9122318B2 (en) Methods of and systems for reducing keyboard data entry errors
US8799803B2 (en) Configurable input device
US9965179B2 (en) Adaptive virtual keyboard
US20090153495A1 (en) Input method for use in an electronic device having a touch-sensitive screen
JP2006127488A (en) Input device, computer device, information processing method, and information processing program
US20120204258A1 (en) Password input method based on touch screen
US20110102335A1 (en) Input device, input method, program, and storage medium
JP2004355606A (en) Information processor, information processing method, and program
JP5556398B2 (en) Information processing apparatus, information processing method, and program
US20090167715A1 (en) User interface of portable device and operating method thereof
EP2926220A1 (en) Adaptive virtual keyboard
US9158457B2 (en) Adjustment of multiple user input parameters
US8363007B2 (en) Method and touchpad interface device using light for displaying level
JP2008065504A (en) Touch panel control device and touch panel control method
US9405439B2 (en) Audio signal controller
JP2009271771A (en) Driving operation device
CN101470575B (en) Electronic device and its input method
JP2008305339A (en) Operation time measuring instrument and method, skill level determining device and method, and program
JP5414134B1 (en) Touch-type input system and input control method
TWI511021B (en) Operation method for virtual keyboard
JP2023535212A (en) Adaptable touch screen keypad with dead zone
TWI410860B (en) Touch device with virtual keyboard and method of forming virtual keyboard thereof
CN104111797B (en) A kind of information processing method and electronic equipment

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION