WO2007128035A1 - Systems and methods for interfacing a user with a touch screen - Google Patents

Systems and methods for interfacing a user with a touch screen

Info

Publication number
WO2007128035A1
WO2007128035A1 (PCT/AU2007/000564)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
menu
primary
tertiary
input regions
Prior art date
Application number
PCT/AU2007/000564
Other languages
English (en)
Inventor
Ian Andrew Maxwell
Original Assignee
Rpo Pty Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2006902241A0
Application filed by Rpo Pty Limited filed Critical Rpo Pty Limited
Publication of WO2007128035A1

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present invention relates to interfacing a user with an electronic device, and more particularly to systems and methods for interfacing a user with a touch-screen.
  • Embodiments of the invention have been particularly developed for providing a touch-actuated interface for entering alphanumeric information on a portable electronic device, and the present disclosure is primarily focused accordingly.
  • While the invention is described hereinafter with particular reference to such applications, it will be appreciated that the invention is applicable in broader contexts.
  • One aspect of the present invention provides a method for interfacing a user with a touch-screen, the method including the steps of:
  • the representation is indicative of a plurality of distinct alphanumeric characters.
  • the associated primary command is related to a plurality of secondary commands respectively corresponding to at least one of the distinct alphanumeric characters.
  • the relationship between primary and secondary commands is affected by the operation of a predictive text protocol, such that for the at least one primary input region, the associated primary command is relatable to a plurality of secondary commands respectively corresponding to predicted words.
  • the secondary menu shares a common origin with the primary menu.
  • the secondary menu has an angular divergence of between 50% and 200% of an angular divergence of the touch-selected primary input region.
  • the secondary menu has an angular divergence of between 100% and 150% of an angular divergence of the touch-selected primary input region.
  • the secondary menu has an angular divergence approximately equal to an angular divergence of the touch-selected primary input region.
  • the on-screen positioning of the primary and secondary menus varies.
  • the variation includes movement substantially along a vector defined by a central radius of the touch-selected primary input region having a direction towards an origin of the primary menu.
  • the on-screen scaling of the primary and secondary menus varies.
  • the primary input regions correspond to keys on a twelve-key telephone keypad.
  • One embodiment provides a method including the further steps of:
  • step (e) following step (d), closing the secondary menu.
  • One embodiment provides a method including the further steps of:
  • step (g) displaying on the screen a representation of a tertiary menu, the tertiary menu radially extending substantially as an annular sector from the periphery of the secondary menu substantially adjacent the touch-selected secondary input region, the tertiary menu including one or more tertiary input regions, the one or more tertiary input regions being respectively associated with the one or more tertiary commands, each tertiary input region displaying a representation indicative of its respective tertiary command.
  • One embodiment provides a method including the steps of:
  • One embodiment provides a method including the steps of:
  • One embodiment provides a method including the steps of:
  • step (l) closing the secondary menu.
  • step (l) closing the secondary menu.
  • step (o) displaying on the screen a representation of a tertiary menu, the tertiary menu radially extending substantially as an annular sector from the periphery of the secondary menu displayed at step (c), the tertiary menu including one or more tertiary input regions, the one or more tertiary input regions being respectively associated with one or more tertiary commands each respectively indicative of a predicted word, each tertiary input region displaying a representation indicative of its respective predicted word.
  • One embodiment provides a method including the steps of: (p) being responsive to a touch-selection of one of the tertiary input regions having an associated tertiary command indicative of a predicted word for inputting that predicted word;
  • step (q) following step (p), closing the tertiary and secondary menus.
  • a second aspect of the invention provides a method for interfacing a user with a touch-screen, the method including the steps of:
  • One embodiment provides a method including the steps of:
  • step (e) following step (d), closing the secondary menu.
  • One embodiment provides a method wherein the primary input regions are defined by the set of primary input regions that corresponds to the keys on a 12-key telephone keypad.
  • a third aspect of the invention provides a computer-readable carrier medium carrying a set of instructions that when executed by one or more processors cause the one or more processors to carry out a method according to the first or second aspect.
  • a fourth aspect of the invention provides a device including: a touch-screen; and a processor coupled to the touch-screen for carrying out a method according to the first or second aspect.
  • a fifth aspect of the invention provides a method for interfacing a user with a touchscreen, the method including the steps of:
  • FIG. 1 schematically illustrates a portable electronic device according to one embodiment.
  • FIG. 2 schematically illustrates an exemplary touch-screen display according to one embodiment.
  • FIG. 2A schematically illustrates an exemplary touch-screen display according to one embodiment.
  • FIG. 2B schematically illustrates an exemplary touch-screen display according to one embodiment.
  • FIG. 2C schematically illustrates an exemplary touch-screen display according to one embodiment.
  • FIG. 2D schematically illustrates an exemplary touch-screen display according to one embodiment.
  • FIG. 2E schematically illustrates an exemplary touch-screen display according to one embodiment.
  • FIG. 2F schematically illustrates an exemplary touch-screen display according to one embodiment.
  • FIG. 2G schematically illustrates an exemplary touch-screen display according to one embodiment.
  • FIG. 2H schematically illustrates an exemplary touch-screen display according to one embodiment.
  • FIG. 2I schematically illustrates an exemplary touch-screen display according to one embodiment.
  • FIG. 2J schematically illustrates an exemplary touch-screen display according to one embodiment.
  • FIG. 2K schematically illustrates an exemplary touch-screen display according to one embodiment.
  • FIG. 3 schematically illustrates a method according to one embodiment.
  • FIG. 3A schematically illustrates a method according to one embodiment.
  • FIG. 3B schematically illustrates a method according to one embodiment.
  • FIG. 3C schematically illustrates a method according to one embodiment.
  • FIG. 3D schematically illustrates a method according to one embodiment.
  • some embodiments provide for an array of conventional numerical keys to be graphically represented as a primary menu on a touch-screen of a cellular phone or PDA.
  • the graphically represented keys are arranged as sectors or annular sectors in a contiguous array around a central origin or region.
  • a user touch-selects one of the keys and is provided with a secondary menu for allowing selection of a particular alphanumeric character associated with the selected numerical key.
  • This association is optionally based on a protocol such as ETSI ETS 300 640 or ITU-T Recommendation E.161.
  • the secondary menu or a similar tertiary menu, is used to provide additional predictive text functionality.
  • FIG. 1 schematically illustrates an exemplary portable electronic device 101 according to one embodiment.
  • Device 101 includes a processor 102 coupled to a memory module 103 and a touch-screen 104.
  • Processor 102 is also coupled to other manual inputs 105, such as physical buttons, and other not-shown components, which in some cases define or contribute to the purpose of device 101.
  • device 101 is an imaging phone, and the processor is additionally coupled to a GSM communications module and an imaging CCD.
  • Memory module 103 maintains software instructions 106 which, when executed on processor 102, allow device 101 to perform various methods and functionalities described herein. For example, on the basis of software instructions 106, device 101 performs methods for interfacing a user with a touch-screen or for displaying representations on a touch-screen. For example, on the basis of the software instructions, processor 102 causes graphical representations to be displayed on touch-screen 104, and is responsive to coordinate information indicative of touching of touch-screen 104.
  • Portable electronic device as used herein should be read broadly. In the context of device 101, it refers to a generic device having components and functionalities described herein, without limitation to additional functionalities.
  • Portable electronic devices present in various embodiments of the present invention include, but are not limited to:
  • Portable communications devices: that is, substantially any portable electronic device including a communications module, such as a GSM or CDMA module. Common examples include cellular phones, "smartphones" and so on.
  • Portable computing devices such as PDAs, Ultra Mobile Personal Computers (UMPCs), laptop computers, tablet computers, and thin-client remote controllers.
  • Personal entertainment devices such as gaming devices, media players (including audio and/or video players), imaging devices (such as digital still and/or video cameras) and the like.
  • portable should be read broadly to imply a degree of portability. In this way, "handheld" devices are considered to be a subset of "portable" devices. Furthermore, some embodiments are implemented in relation to non-portable devices, such as touch-screen information kiosks.
  • touch-screen should be read broadly to encompass any components or group of interrelated components that provide a display for displaying graphical representations and one or more sensors for identifying a location at which the display is touched.
  • the sensors are responsive to pressure being exerted on a substrate (or pressure being exerted on a substrate and released), whereas in other cases the sensors are responsive to movement across a barrier overlying the screen, for example a barrier defined by one or more light paths.
  • There is no strict requirement for the touch-screen to be responsive to direct touching of the display, and in some situations it may be responsive to touching or movement at a location functionally associated with the display, such as a proximal window or a separate touch pad.
  • the touch-screen includes additional components, such as software and hardware.
  • touching should be read broadly to include substantially any manner for interacting with a "touch-screen". This includes both physical contact with a substrate, and movement through a defined barrier (although this movement does not in all cases necessarily result in any physical touching of a substrate). That is, the system may be responsive to a "near touch". In some embodiments the touching is effected by direct human touching (such as the use of a finger or thumb) or indirect human touching (for example by use of a stylus). Touching includes, in various embodiments, tapping and lifting on a region of the touch-screen, double tapping on a region of the touch-screen, or sliding and stopping on a region of the touch-screen.
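Tapping, double-tapping, and slide-and-stop interactions can be distinguished from a stream of timestamped touch events. The Python sketch below is purely illustrative: the event format, the timing and distance thresholds, and all names are assumptions rather than details from the patent.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: float
    y: float
    t: float     # timestamp in seconds
    kind: str    # "down", "move" or "up"

# Assumed tuning constants; the patent does not specify values.
DOUBLE_TAP_WINDOW = 0.3   # max seconds between the taps of a double tap
SLIDE_THRESHOLD = 10.0    # min travel (pixels) to count as a slide

def classify(events: list[TouchEvent]) -> str:
    """Classify one touch sequence as 'tap', 'double_tap' or 'slide'."""
    start, end = events[0], events[-1]
    travel = ((end.x - start.x) ** 2 + (end.y - start.y) ** 2) ** 0.5
    if travel >= SLIDE_THRESHOLD:
        return "slide"        # sliding and stopping on a region
    downs = [e for e in events if e.kind == "down"]
    if len(downs) >= 2 and downs[1].t - downs[0].t <= DOUBLE_TAP_WINDOW:
        return "double_tap"   # double tapping on a region
    return "tap"              # tapping and lifting on a region
```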
  • touch-screen 104 is schematically illustrated as a display screen for displaying graphical representations.
  • Processor 102 on the basis of software instructions 106, instructs touch-screen 104 to display such representations.
  • the display screen includes an LCD, plasma, CRT or other display.
  • the display is pixel based. That is, the display includes an array of pixels that are actuated and/or colored under instruction of processor 102, thereby to provide the representations.
  • Some representations displayed on the touch-screen define input regions associated with respective commands.
  • the processor is responsive to touching of the screen at a location overlying a given one of these input regions for performing a functionality corresponding to the relevant command.
  • touching results in coordinate information being provided to processor 102, and processor 102 looks to match this coordinate information with information indicative of the representations on-screen at the time the coordinate information was generated, or the time at which the touching occurred, as well as with any associated commands.
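Matching a touch coordinate against annular-sector input regions reduces to a polar-coordinate containment test about the menu origin. The following is a minimal sketch under assumed data structures; the patent does not prescribe an implementation.

```python
import math

def hit_test(regions, origin, point):
    """Return the input region under the touch point, or None.

    `regions` is a list of dicts with keys "start_angle", "end_angle",
    "inner_radius" and "outer_radius" (angles in radians, measured about
    the menu origin); `origin` and `point` are (x, y) tuples.
    """
    dx, dy = point[0] - origin[0], point[1] - origin[1]
    r = math.hypot(dx, dy)
    theta = math.atan2(dy, dx) % (2 * math.pi)
    for region in regions:
        if not (region["inner_radius"] <= r <= region["outer_radius"]):
            continue
        lo = region["start_angle"] % (2 * math.pi)
        hi = region["end_angle"] % (2 * math.pi)
        if lo <= hi:
            inside = lo <= theta <= hi
        else:                      # sector spans the 0-radian seam
            inside = theta >= lo or theta <= hi
        if inside:
            return region
    return None
```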
  • touch-screen 104 provides an input area 110, text editor area 111, and other input area 112. These areas are considered for the sake of explanation only, and should not be regarded as limiting in any way, particularly in relation to the relative sizes and positioning of these areas.
  • the input area defines substantially the whole screen.
  • the input area is an overlay on the text editor area.
  • the general intention of the present illustration is to show device 101 in an exemplary operational state where it is configured for authoring of a text-based message.
  • a user interacts by way of touch with graphical representations shown in the input area to enter alphanumeric information that subsequently appears in the text editor area.
  • the other input area provides associated commands, such as commands relating to the formatting and/or the delivery of text entered into the text editor area as an email or other text-based message.
  • FIG. 2 through FIG. 2I show various exemplary representations displayable in input area 110.
  • the general notion is that a user interacts with touch-screen 104 at input area 110 for inputting text-based data into text editor area 111. These representations are discussed in detail below.
  • FIG. 2 shows a representation including a circular primary menu 200.
  • Menu 200 includes a plurality of primary input regions 201 to 212, corresponding to the twelve keys of a conventional telephone numerical keypad (numerals “0" to “9", plus “*" and "#").
  • Input regions 201 to 212 are arranged as annular sectors in a contiguous array. Each input region is associated with a respective primary command, and displays a representation indicative of its respective primary command, for example a numeral and a selection of letters.
  • the primary input regions corresponding to the numerals "0" to “9" are arranged other than in a sequential clockwise manner.
  • contiguous should be read broadly to cover situations where the input regions are spaced apart and therefore not directly adjacent one another.
  • radial neutral zones separate the input regions, these neutral zones having no associated command.
  • the general intention is to create a barrier between input regions, and thereby reduce the risk of inadvertent selection of an unwanted input region. An example is provided in FIG. 2K.
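One way to produce such a layout, including optional radial neutral zones between neighbouring regions, is to divide the circle evenly among the keys and trim a small angular gap from each side of every sector. The sketch below is illustrative only; the gap width and region representation are assumptions.

```python
import math

def layout_primary_menu(labels, inner_radius, outer_radius, gap=math.radians(2)):
    """Divide a circle into annular-sector input regions, one per label.

    `gap` is the angular width of the neutral zone between neighbours;
    gap=0 yields a strictly contiguous array.
    """
    n = len(labels)
    span = 2 * math.pi / n          # total angle allotted to each key
    regions = []
    for i, label in enumerate(labels):
        regions.append({
            "label": label,
            "start_angle": i * span + gap / 2,   # neutral zone on each side
            "end_angle": (i + 1) * span - gap / 2,
            "inner_radius": inner_radius,
            "outer_radius": outer_radius,
        })
    return regions

# Twelve keys of a conventional telephone keypad:
keys = ["1", "2", "3", "4", "5", "6", "7", "8", "9", "*", "0", "#"]
menu = layout_primary_menu(keys, inner_radius=30, outer_radius=100)
```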
  • each primary input region is intrinsically related to a numeral (or "*" or "#"), with the twenty-six letters of the Roman alphabet distributed amongst the input regions. That is, for a selection of the primary input regions, the representations shown are indicative of a plurality of distinct alphanumeric characters.
  • the "1", "0", "*" and "#” inputs are associated with special functions rather than letters, these specific functions optionally including symbols such as punctuation, currency or "smilies", or character input modifiers such as "upper case”. In some embodiments these special functions are programmable to perform various other purposes.
  • programmable it is meant that the associated command is not fixed, and is variable at the discretion of a user. For example, a user is permitted to select the functionality of a given input region from a list of possible functionalities.
  • input regions are programmable not only in terms of functionality, but also in terms of size, shape, location, and circumstances under which they are displayed on the screen.
  • additional input regions are provided in area 110, and in some cases these are user-programmable to perform various functionalities not specifically considered herein.
  • Although menu 200 is depicted as circular, in other embodiments alternate shapes may be used, such as shapes that are able to be defined by a contiguous array of sub-regions. Such shapes are considered to be "substantially circular", and include polygons. In some embodiments a polygon is used having a number of sides equal to an integral fraction of the number of primary input regions. For example, a hexagon is conveniently used as an alternative in the example of FIG. 2. In some embodiments triangles or squares are used, or irregular shapes such as brand logos.
  • annular sector is used to describe a shape that has a first edge conformable to a substantially circular object (such as a circle or hexagon, in the case of the latter optionally spanning multiple sides of the hexagon such that the first edge includes a plurality of sides), a pair of sides extending from this first edge substantially along radial paths of the substantially circular object, and a second edge connecting the pair of sides at their respective ends distal from the first edge, this second edge being either straight, curved, or defined by a plurality of sides.
  • the second edge is a larger version of the first edge.
  • An annular sector has an "angular divergence", defined as the angle at which the pair of sides diverge from one another. In the event that the sides are parallel, this angle is zero. Otherwise, the angular divergence is conveniently measurable by following the two sides towards a common converging origin, and measuring the angle at this origin.
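With the region representation used in the sketches above, the angular divergence of an annular sector is simply the angle between its two radial sides:

```python
def angular_divergence(region) -> float:
    """Angle at which the two radial sides of an annular sector diverge,
    measured at their common origin (zero for parallel sides)."""
    return region["end_angle"] - region["start_angle"]
```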
  • primary input regions 201 to 212 are arranged as annular sectors around a central region 215.
  • Central region 215 optionally defines an additional input region, such as a "shift” input, "space” input, “delete” input, or the like.
  • it defines a plurality of input regions, for example half for "space” and half for “delete”.
  • it defines a "neutral zone” where a user can rest their finger without affecting any input.
  • it performs a user-programmable functionality.
  • In some embodiments there is no central region, and as such primary input regions 201 to 212 are arranged as sectors rather than annular sectors.
  • a central region provides distinct advantages, such as reducing the likelihood of a user inadvertently selecting an undesired input by touching close to the centre.
  • FIG. 2A shows a representation including a secondary menu 220.
  • the secondary menu radially extends substantially as an annular sector from primary menu 200.
  • the secondary menu includes secondary input regions 221 to 223, respectively corresponding to the letters of which the adjacent primary input region is indicative.
  • FIG. 3 and FIG. 3A illustrate exemplary methods for progressing between the representations of FIG. 2 and FIG. 2A. These are discussed below.
  • FIG. 3 shows a general method 300.
  • Step 301 includes displaying a primary menu comprising one or more primary input regions;
  • step 302 includes receiving data indicative of touch-selection of a primary input region;
  • step 303 includes identifying one or more secondary commands related to the primary command associated with the selected primary input region; and
  • step 304 includes displaying a secondary menu having input regions associated with the identified secondary commands.
  • the primary command associated with the selected primary input region is indicative of one or more secondary commands or, in other cases, of an instruction to display a secondary menu representative of those one or more secondary commands.
  • the secondary menu is associable at a viewable level with the selected primary input region: for example, the primary input region includes a group of representations, and the secondary menu includes secondary input regions each including a respective one of those representations.
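Schematically, method 300 is a lookup from the touched primary command to its related secondary commands, followed by display of the secondary menu. A sketch with hypothetical names; `show_secondary_menu` is an assumed display API, not something the patent defines:

```python
def on_primary_touch(primary_region, secondary_map, display):
    """Steps 302-304: react to touch-selection of a primary input region."""
    primary_cmd = primary_region["command"]              # step 302
    secondary_cmds = secondary_map.get(primary_cmd, [])  # step 303
    if secondary_cmds:
        # step 304: secondary menu radially adjacent the selected region
        display.show_secondary_menu(anchor=primary_region,
                                    commands=secondary_cmds)

# For example, the "2" key relates to character inputs for A, B and C:
secondary_map = {"2": ["A", "B", "C"]}
```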
  • FIG. 3A provides a more specific method 310, which relates to the example of FIG. 2.
  • Step 311 includes displaying primary menu 200 comprising one or more primary input regions;
  • step 312 includes receiving data indicative of a touch-selection of a primary input region, essentially being a user-selection of one of the primary input regions; and
  • step 313 again includes identifying one or more secondary commands related to the primary command associated with the selected primary input region.
  • the associated input command is related to a plurality of secondary commands respectively corresponding to the distinct alphanumeric characters represented by the relevant primary input region, or alternate functions represented by the relevant primary input region.
  • Secondary input regions displaying distinct characters are associated with a command to allow input of character commands for those characters.
  • In the example illustrated in FIG. 2A, when primary input region 202, representing "2", "A", "B" and "C", is touch-selected, secondary menu 220 is displayed, including secondary input regions 221, 222 and 223 associated with input commands for the letters "A", "B" and "C".
  • Upon touch-selection of one of these secondary input regions, the character associated with that region is "inputted" - for instance it appears in an editor field (such as text editor area 111).
  • text editor area 111 allows a previously inputted word or character to be selected by touching that word or character.
  • touch-interaction allows a user to manipulate a cursor in the text editor area. For example, the user taps at a location within text editor area 111 to place the cursor at that location, or double-taps on an existing word to select that word.
  • Input area 110 is then used to input text and/or make modifications to existing text.
  • the secondary menu is closed responsive to either or both of the inputting of a character or the touch-selection of a different primary input region.
  • secondary menu 220 includes an additional secondary input region for a numeral associated with the relevant primary input region ("2" in the case of primary input region 202).
  • the causal primary input region becomes associated with a command to input that numeral.
  • a user touches primary input region 202 twice to enter the numeral "2".
  • the secondary menu 220 shares a common origin with the primary menu 200. That is, the sides of the secondary menu effectively diverge from an origin at the centre of the primary menu.
  • the secondary menu radially extends from a location adjacent and centered on the primary input region which, when selected, results in the display of that secondary menu.
  • the secondary input regions are located proximal the location of the most recent touch-selection.
  • the secondary menu preferably has an angular divergence of between 50% and 200% of the angular divergence of the touch-selected primary input region, or more preferably between 100% and 150% of the angular divergence of the touch-selected primary input region.
  • the secondary menu has an angular divergence approximately equal to the angular divergence of the touch-selected primary input region.
  • the approach is to consider a variation between the angular divergence of the primary input region and angular divergence of a hypothetical annular sector sharing a common origin with the primary menu and meeting the periphery of the primary menu at the same locations as the secondary menu. It will be appreciated that this is an equivalent approach.
  • the on-screen positioning of the primary and secondary menus varies in relation to a predefined origin. For example, where these menus share a common origin, that origin is shifted along a vector defined by a central radius of the touch-selected primary input region, in a direction towards the origin of the primary menu so as to present the secondary menu at a more central region of area 110.
  • this shifting essentially moves a portion of the primary menu to an off-screen location. In other words, a portion of the primary menu is not rendered on-screen for a period of time while the secondary menu is displayed.
  • the shifting is displayed by way of an animation at a rate of between two and thirty frames per second.
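Such a shift can be rendered by interpolating the shared origin along the central radius of the touch-selected region over a handful of frames. An illustrative sketch; the frame rate, duration, and distance are assumed parameters within the stated two-to-thirty frames-per-second range:

```python
import math

def origin_shift_frames(origin, mid_angle, distance, fps=15, duration=0.2):
    """Yield intermediate menu origins, moving along the central radius of
    the touch-selected region in the direction of the primary menu's origin."""
    ox, oy = origin
    dx, dy = -math.cos(mid_angle), -math.sin(mid_angle)  # towards the origin
    steps = max(1, int(fps * duration))  # e.g. 15 fps over 0.2 s -> 3 frames
    for i in range(1, steps + 1):
        f = i / steps
        yield (ox + dx * distance * f, oy + dy * distance * f)
```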
  • the on-screen positioning of the primary and secondary menus varies in terms of both location and scale.
  • the scale is increased so as to provide a user with a larger (and therefore easier to see) secondary menu.
  • the secondary menu closes and the input area returns to the configuration shown in FIG. 2 so that a further primary input can be selected.
  • predictive text functionalities are provided.
  • An example is shown in FIG. 2C, where a tertiary menu 230 radially extends from secondary menu 220, this tertiary menu including tertiary input regions 231 to 236, each being associated with an input command for a word identified by a predictive text protocol.
  • Representations of the words themselves are graphically displayed in the tertiary input regions, and a user either touch-selects one of the tertiary input regions to input a word, or a secondary input region to input a single character.
  • a scale/location variation is applied to make the tertiary menu 230 easier to view.
  • FIG. 3B illustrates a method 320 for displaying a tertiary menu according to one embodiment.
  • the method commences with steps 311 to 314 described above.
  • Step 315 includes performing a predictive text analysis for identifying one or more predicted words, for example using the "T9" protocol. For example, previous inputs defining a partial word are analyzed to determine one or more complete words formable on the basis of the existing partial word and characters corresponding to the most recently selected primary input region.
  • Step 316 includes identifying the highest probability predicted words.
  • words are hierarchically identified in accordance with the perceived likelihood of desirability, for example using a look-up table that is either permanent or updatable based on historical usage.
  • there will be a limit to the number of tertiary input regions containable in a tertiary menu, for example based on text size and angular divergence constraints. In such a case, only a selection of the total list of possible words is identified, the selection including the highest-probability predicted words (those predicted words having the highest perceived likelihood of desirability).
  • Step 317 includes displaying a tertiary menu, such as tertiary menu 230, having tertiary input regions for these identified highest probability predicted words.
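A T9-style analysis of this kind can be sketched as a dictionary indexed by digit strings, with candidates ranked by an assumed frequency score and truncated to the menu's capacity. The word list, frequencies, and capacity below are placeholders, not values from the patent:

```python
from collections import defaultdict

# Letter-to-digit mapping of a conventional telephone keypad.
KEY_OF = {c: d for d, cs in {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}.items() for c in cs}

def build_index(words_with_freq):
    """Index words by their digit sequence, e.g. 'cat' -> '228'."""
    index = defaultdict(list)
    for word, freq in words_with_freq:
        digits = "".join(KEY_OF[c] for c in word)
        index[digits].append((freq, word))
    return index

def predict(index, digits, capacity=6):
    """Steps 315-316: highest-probability words formable from the digits
    entered so far, capped at the tertiary menu's capacity."""
    candidates = [
        (freq, word)
        for key, entries in index.items() if key.startswith(digits)
        for freq, word in entries
    ]
    return [w for _, w in sorted(candidates, reverse=True)[:capacity]]

index = build_index([("cat", 80), ("bat", 60), ("cap", 40), ("act", 30)])
print(predict(index, "22"))   # ['cat', 'bat', 'cap', 'act'], ranked by frequency
```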
  • FIG. 2E shows another example of predictive text input.
  • predicted words are provided alongside individual characters in a secondary menu 240.
  • This secondary menu includes character inputs 221 to 223, plus predicted word inputs 241 to 243.
  • predicted words are identified by way of a predictive text protocol. A user is permitted to touch-select one of the predicted word input regions to input the relevant word, or one of the character input regions to input a single character.
  • a scale/positioning variation is applied to make the secondary menu 240 easier to view.
  • the number of predicted word inputs for an embodiment such as FIG. 2E varies between instances. For example, in the interests of menu clarity, the number of predicted word inputs is limited to between zero and five, with only the highest probability predicted words being assigned predicted word input regions.
  • FIG. 3C shows an exemplary method 330 for administering predictive text in a secondary menu, the method commencing with steps 311 to 315 described above.
  • Step 321 then includes determining whether the number of high-probability predicted words is less than (or equal to) a predetermined threshold.
  • each identified predicted word is provided a probability rating that identifies the perceived likelihood of that word being desired by the user. Only identified words having a probability rating greater than a certain threshold are considered for secondary menu inclusion.
  • predicted word inputs are only displayed in a secondary menu in the event that a relatively small threshold number of high-probability predicted words are identified, this threshold number being, in various embodiments, between one and five.
  • the threshold number is three. In the event that the number of high-probability predicted words is greater than the threshold, the method progresses to step 322, where a secondary menu is displayed having input regions for identified letters/symbols only. Otherwise, the method progresses to step 323, where a secondary menu is displayed having input regions for identified letters/symbols as well as the high-probability predicted words.
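The branch at step 321 is a count-against-threshold test. A compact sketch, assuming the predicted words arrive already filtered by probability rating:

```python
def build_secondary_menu(letters, predicted_words, threshold=3):
    """Steps 321-323: include predicted words in the secondary menu only
    when there are few enough high-probability candidates."""
    if len(predicted_words) <= threshold:
        return letters + predicted_words   # step 323: letters plus words
    return letters                         # step 322: letters/symbols only

print(build_secondary_menu(["a", "b", "c"], ["bat", "cat"]))                 # words shown
print(build_secondary_menu(["a", "b", "c"], ["bat", "cat", "car", "act"]))   # letters only
```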
  • predicted words are displayable in both secondary and tertiary menus, for example by combining methods 320 and 330.
  • In the example of FIG. 2J, only predicted words are provided in a secondary menu 280.
  • the input regions of a tertiary menu are selected in response to the touch-selection of a secondary input region. For example, following touch-selection of one of the secondary input regions, one or more tertiary commands related to the secondary command associated with the touch-selected secondary input region are, in some embodiments, identified and subsequently displayed on the screen in a tertiary menu including one or more tertiary input regions respectively associated with the one or more tertiary commands.
  • An example of this approach is provided by method 340 of FIG. 3D, which is described below by reference to the screen display of FIG. 2G.
  • FIG. 2G provides an example of where a tertiary menu is provided to allow convenient selection of alternate letters/symbols, such as language-specific letters (for example, accented variants of "e" and "a").
  • a user touch-selects a character in a secondary menu and, in the event that there are alternate letters/symbols related to that letter (in a database or other information repository), a tertiary menu 250 is provided for the alternate letters/symbols.
  • these alternate letters/symbols are not graphically represented in the secondary menu.
  • Method 340 includes steps 311 to 314 described above.
  • Step 341 then includes receiving data indicative of touch-selection of a letter/symbol in a secondary menu, step 342 includes identifying alternate letters/symbols related to the touch-selected letter/symbol, and step 343 includes displaying a tertiary menu having input regions for the identified alternate letters/symbols.
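Method 340 amounts to a lookup from a base letter to its language-specific alternates, with the tertiary menu shown only when the lookup succeeds. The table below is illustrative rather than exhaustive, and `show_tertiary_menu` / `input_character` are assumed display APIs:

```python
# Illustrative mapping; a real implementation would draw on a
# language-specific database or other information repository.
ALTERNATES = {
    "e": ["é", "è", "ê", "ë"],
    "a": ["à", "â", "ä", "å"],
    "o": ["ô", "ö", "ø"],
}

def on_secondary_touch(letter, display):
    """Steps 341-343: show a tertiary menu of alternates, if any exist."""
    alternates = ALTERNATES.get(letter)          # step 342
    if alternates:
        display.show_tertiary_menu(alternates)   # step 343
    else:
        display.input_character(letter)          # plain character input
```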
  • the primary menu is used without secondary or tertiary menus.
  • the touch-screen displays a primary menu as discussed previously, this menu including a set of primary input regions that correspond to keys on a 12-key telephone keypad.
  • This primary menu is used to allow convenient user input of text- based data in accordance with a predictive text protocol, such as T9.
  • the processor subsequently provides a data packet indicative of the one or more characters to a predictive text module.
  • the predictive text module looks for predicted words formable by one or more of these data packets as sequentially arranged. In the case that a given data packet defines the commencement of a word, the predictive text module identifies zero or more predicted words formable from the one or more characters of that data packet. Otherwise, if a word has already been commenced by previous inputs (that is, the data packet in question defines a portion of a previously commenced word defined by one or more preceding data packets, these preceding data packets also each being indicative of a respective one or more characters), the predictive text module identifies zero or more predicted words formable from the one or more characters of the present data packet in combination with the respective one or more characters of the one or more preceding data packets.
  • the user is allowed, for example by options presented via the touch screen, to select between identified predicted words (assuming one or more were identified). In some embodiments the selection of a word is achieved via the primary menu, whilst in other embodiments it is achieved by other means, such as options provided within the text editor region or elsewhere. If the user selects one of these predicted words, that word is inputted in the text editor region. Alternately, the user is permitted to touch select another primary input region (which may be the same as the one previously selected) to continue authoring the present word.
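This packet-based protocol can be modeled as an accumulating sequence of character sets, with the predictive text module filtering a dictionary against the whole sequence. The sketch below simplifies by matching complete words of exactly the accumulated length (no prefix completion), which is one assumption among several possible designs:

```python
def formable(word: str, packets: list[str]) -> bool:
    """True if `word` is formable from the sequence of data packets,
    each packet holding the characters of one primary input region."""
    return len(word) == len(packets) and all(
        ch in pkt for ch, pkt in zip(word, packets)
    )

def predicted_words(dictionary, packets):
    """Zero or more predicted words formable from the packets so far."""
    return [w for w in dictionary if formable(w, packets)]

packets = ["abc", "abc", "tuv"]          # keys "2", "2", "8"
print(predicted_words(["cat", "bat", "dog"], packets))  # ['cat', 'bat']
```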
  • the embodiments considered herein have been predominantly described by reference to the Roman alphabet, it will be appreciated that other embodiments are able to be implemented for handling Asian language characters (be they alphabetic or pictographic), or other non-Roman characters. Those with an understanding of such languages will readily adapt the general structural framework described herein to those languages. For example, in some embodiments the primary input regions provide building blocks for the creation of more complex characters or symbols.
  • processor may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory.
  • a "computer” or a “computing machine” or a “computing platform” may include one or more processors.
  • the methodologies described herein are, in one embodiment, performable by one or more processors that accept computer-readable (also called machine-readable) code containing a set of instructions that when executed by one or more of the processors carry out at least one of the methods described herein.
  • Any processor capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken is included.
  • One example is a typical processing system that includes one or more processors.
  • Each processor may include one or more of a central processing unit (CPU), a graphics processing unit, and a programmable DSP unit.
  • the processing system further may include a memory subsystem including main RAM and/or a static RAM, and/or a dynamic RAM, and/or ROM.
  • a bus subsystem may be included for communicating between the components.
  • the processing system further may be a distributed processing system with processors coupled by a network. If the processing system requires a display, such a display may be included, e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT) display. If manual data entry is required, the processing system also includes an input device such as one or more of an alphanumeric input unit such as a keyboard, a pointing control device such as a mouse, and so forth.
  • the term memory unit as used herein, if clear from the context and unless explicitly stated otherwise, also encompasses a storage system such as a disk drive unit.
  • the processing system in some configurations may include a sound output device, and a network interface device.
  • the memory subsystem thus includes a computer-readable carrier medium that carries computer-readable code (e.g., software) including a set of instructions that, when executed by one or more processors, cause performance of one or more of the methods described herein.
  • the software may reside in the hard disk, or may also reside, completely or at least partially, within the RAM and/or within the processor during execution thereof by the computer system.
  • the memory and the processor also constitute computer-readable carrier media carrying computer-readable code.
  • a computer-readable carrier medium may form, or be included, in a computer program product.
  • the one or more processors operate as a standalone device or may be connected, e.g., networked to other processor(s), in a networked deployment.
  • the one or more processors may operate in the capacity of a server or a user machine in server-user network environment, or as a peer machine in a peer-to-peer or distributed network environment.
  • the one or more processors may form a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • "machine" or "device" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • each of the methods described herein is in the form of a computer-readable carrier medium carrying a set of instructions, e.g., a computer program, that are for execution on one or more processors, e.g., one or more processors that are part of a building management system.
  • a computer-readable carrier medium carrying computer readable code including a set of instructions that when executed on one or more processors cause the processor or processors to implement a method.
  • aspects of the present invention may take the form of a method, an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects.
  • the present invention may take the form of a carrier medium (e.g., a computer program product on a computer-readable storage medium) carrying computer-readable program code embodied in the medium.
  • the software may further be transmitted or received over a network via a network interface device.
  • the carrier medium is shown in an exemplary embodiment to be a single medium, the term “carrier medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • the term “carrier medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by one or more of the processors and that cause the one or more processors to perform any one or more of the methodologies of the present invention.
  • a carrier medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media.
  • Non-volatile media include, for example, optical, magnetic, and magneto-optical disks.
  • Volatile media include dynamic memory, such as main memory.
  • Transmission media include coaxial cables, copper wire and fiber optics, including the wires that comprise a bus subsystem. Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
  • carrier medium shall accordingly be taken to include, but not be limited to, solid-state memories, a computer product embodied in optical or magnetic media, a medium bearing a propagated signal detectable by at least one processor of one or more processors and representing a set of instructions that when executed implement a method, a carrier wave bearing a propagated signal detectable by at least one processor of the one or more processors and representing a set of instructions that when executed implement a method, and a transmission medium in a network bearing a propagated signal detectable by at least one processor of the one or more processors and representing a set of instructions.
  • some of the embodiments are described herein as a method or combination of elements of a method that can be implemented by a processor of a computer system or by other means of carrying out the function.
  • a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method.
  • an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.
  • any one of the terms "comprising", "comprised of" or "which comprises" is an open term that means including at least the elements/features that follow, but not excluding others.
  • the term “comprising”, when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter.
  • the scope of the expression “a device comprising A and B” should not be limited to devices consisting only of elements A and B.
  • Any one of the terms “including” or “which includes” or “that includes” as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others.
  • “including” is synonymous with and means “comprising”.
  • "Coupled", when used in the claims, should not be interpreted as being limitative to direct connections only. Where the terms "coupled" or "connected", along with their derivatives, are used, it should be understood that these terms are not intended as synonyms for each other. Thus, the scope of the expression "a device A coupled to a device B" should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. Rather, it means that there exists a path between an output of A and an input of B which may be a path including other devices or means. "Coupled" may mean that two or more elements are either in direct physical or remote (e.g.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Input From Keyboards Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Systems and methods for interfacing a user with a touch screen are disclosed. In general terms, some embodiments provide for an array of conventional numerical keys to be graphically represented as a primary menu on a touch-screen of a cellular phone or PDA. The graphically represented keys are arranged as sectors or annular sectors in a contiguous array around a central origin. To provide text-based input (for example in the course of composing a text message or an email), a user touch-selects one of the keys and is presented with a secondary menu allowing selection of an alphanumeric character associated with the selected numerical key. This association is optionally based on a protocol such as ETSI ETS 300 640 or ITU-T Recommendation E.161. In some embodiments, the secondary menu or a similar tertiary menu is used to provide additional predictive text functionality.
PCT/AU2007/000564 2006-05-01 2007-04-30 Systems and methods for interfacing a user with a touch screen WO2007128035A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2006902241A AU2006902241A0 (en) 2006-05-01 Touch input method and apparatus
AU2006902241 2006-05-01

Publications (1)

Publication Number Publication Date
WO2007128035A1 true WO2007128035A1 (fr) 2007-11-15

Family

ID=38649738

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2007/000564 WO2007128035A1 (fr) Systems and methods for interfacing a user with a touch screen

Country Status (3)

Country Link
US (1) US20070256029A1 (fr)
TW (1) TW200821904A (fr)
WO (1) WO2007128035A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD721084S1 (en) 2012-10-15 2015-01-13 Square, Inc. Display with graphic user interface
WO2015188011A1 (fr) * 2014-06-04 2015-12-10 Quantum Interface, Llc. Environnement dynamique pour l'affichage et l'interaction entre objets et attributs
US10503359B2 (en) 2012-11-15 2019-12-10 Quantum Interface, Llc Selection attractive interfaces, systems and apparatuses including such interfaces, methods for making and using same
US11205075B2 (en) 2018-01-10 2021-12-21 Quantum Interface, Llc Interfaces, systems and apparatuses for constructing 3D AR environment overlays, and methods for making and using same
US11226714B2 (en) 2018-03-07 2022-01-18 Quantum Interface, Llc Systems, apparatuses, interfaces and implementing methods for displaying and manipulating temporal or sequential objects
US11775074B2 (en) 2014-10-01 2023-10-03 Quantum Interface, Llc Apparatuses, systems, and/or interfaces for embedding selfies into or onto images captured by mobile or wearable devices and method for implementing same

Families Citing this family (166)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8225231B2 (en) 2005-08-30 2012-07-17 Microsoft Corporation Aggregation of PC settings
JP4899991B2 (ja) * 2007-03-30 2012-03-21 富士ゼロックス株式会社 表示装置及びプログラム
US8839123B2 (en) * 2007-11-19 2014-09-16 Red Hat, Inc. Generating a visual user interface
KR101382433B1 (ko) * 2007-12-03 2014-04-08 삼성전자주식회사 휴대기기를 위한 모듈 기반 구동 장치 및 방법
DE102007058085A1 (de) * 2007-12-03 2009-06-04 Robert Bosch Gmbh Verfahren zur Anordnung von drucksensitiven Bereichen auf einer drucksensitiven Anzeigevorrichtung
JP2009169456A (ja) 2008-01-10 2009-07-30 Nec Corp 電子機器、該電子機器に用いられる情報入力方法及び情報入力制御プログラム、並びに携帯端末装置
WO2009086836A1 (fr) * 2008-01-11 2009-07-16 Danmarks Tekniske Universitet Dispositif a effleurement
US7966564B2 (en) * 2008-05-08 2011-06-21 Adchemy, Inc. Web page server process using visitor context and page features to select optimized web pages for display
CN100576161C (zh) * 2008-06-06 2009-12-30 中国科学院软件研究所 一种基于笔倾角信息的饼形菜单选择方法
US8754855B2 (en) * 2008-06-27 2014-06-17 Microsoft Corporation Virtual touchpad
US8769427B2 (en) 2008-09-19 2014-07-01 Google Inc. Quick gesture input
US8385952B2 (en) 2008-10-23 2013-02-26 Microsoft Corporation Mobile communications device user interface
US20100107100A1 (en) 2008-10-23 2010-04-29 Schneekloth Jason S Mobile Device Style Abstraction
US8411046B2 (en) 2008-10-23 2013-04-02 Microsoft Corporation Column organization of content
US8326358B2 (en) 2009-01-30 2012-12-04 Research In Motion Limited System and method for access control in a portable electronic device
US8355698B2 (en) 2009-03-30 2013-01-15 Microsoft Corporation Unlock screen
US8175653B2 (en) 2009-03-30 2012-05-08 Microsoft Corporation Chromeless user interface
US8238876B2 (en) 2009-03-30 2012-08-07 Microsoft Corporation Notifications
US20100287468A1 (en) * 2009-05-05 2010-11-11 Emblaze Mobile Ltd Apparatus and method for displaying menu items
US20100293457A1 (en) * 2009-05-15 2010-11-18 Gemstar Development Corporation Systems and methods for alphanumeric navigation and input
US9436380B2 (en) 2009-05-19 2016-09-06 International Business Machines Corporation Radial menus with variable selectable item areas
US8269736B2 (en) 2009-05-22 2012-09-18 Microsoft Corporation Drop target gestures
US8836648B2 (en) 2009-05-27 2014-09-16 Microsoft Corporation Touch pull-in gesture
US20100313168A1 (en) * 2009-06-05 2010-12-09 Microsoft Corporation Performing character selection and entry
US9043718B2 (en) * 2009-06-05 2015-05-26 Blackberry Limited System and method for applying a text prediction algorithm to a virtual keyboard
US8219930B2 (en) * 2009-06-26 2012-07-10 Verizon Patent And Licensing Inc. Radial menu display systems and methods
KR20110018075A (ko) * 2009-08-17 2011-02-23 삼성전자주식회사 휴대용 단말기에서 터치스크린을 이용한 문자 입력 방법 및 장치
US8375329B2 (en) * 2009-09-01 2013-02-12 Maxon Computer Gmbh Method of providing a graphical user interface using a concentric menu
KR101114691B1 (ko) * 2009-10-13 2012-02-29 경북대학교 산학협력단 터치스크린 휴대단말기용 사용자 인터페이스 및 그에 의한 메뉴표시 방법
US9354726B2 (en) * 2009-11-06 2016-05-31 Bose Corporation Audio/visual device graphical user interface submenu
US20110113368A1 (en) 2009-11-06 2011-05-12 Santiago Carvajal Audio/Visual Device Graphical User Interface
US8686957B2 (en) * 2009-11-06 2014-04-01 Bose Corporation Touch-based user interface conductive rings
US20110113371A1 (en) * 2009-11-06 2011-05-12 Robert Preston Parker Touch-Based User Interface User Error Handling
US8692815B2 (en) * 2009-11-06 2014-04-08 Bose Corporation Touch-based user interface user selection accuracy enhancement
US8669949B2 (en) * 2009-11-06 2014-03-11 Bose Corporation Touch-based user interface touch sensor power
US20110109560A1 (en) * 2009-11-06 2011-05-12 Santiago Carvajal Audio/Visual Device Touch-Based User Interface
US8350820B2 (en) * 2009-11-06 2013-01-08 Bose Corporation Touch-based user interface user operation accuracy enhancement
US9201584B2 (en) 2009-11-06 2015-12-01 Bose Corporation Audio/visual device user interface with tactile feedback
US8638306B2 (en) * 2009-11-06 2014-01-28 Bose Corporation Touch-based user interface corner conductive pad
US8601394B2 (en) * 2009-11-06 2013-12-03 Bose Corporation Graphical user interface user customization
KR101717493B1 (ko) * 2010-02-12 2017-03-20 삼성전자주식회사 사용자 인터페이스 제공 방법 및 장치
WO2011099808A2 (fr) 2010-02-12 2011-08-18 Samsung Electronics Co., Ltd. Procédé et appareil permettant de fournir une interface utilisateur
US9367205B2 (en) 2010-02-19 2016-06-14 Microsoft Technolgoy Licensing, Llc Radial menus with bezel gestures
US9081499B2 (en) * 2010-03-02 2015-07-14 Sony Corporation Mobile terminal device and input device
WO2011149515A1 (fr) * 2010-05-24 2011-12-01 Will John Temple Bouton multidirectionnel, touche et clavier
US20110314421A1 (en) * 2010-06-18 2011-12-22 International Business Machines Corporation Access to Touch Screens
US8756529B2 (en) 2010-09-13 2014-06-17 Kay Dirk Ullmann Method and program for menu tree visualization and navigation
KR20120033918A (ko) * 2010-09-30 2012-04-09 삼성전자주식회사 터치스크린을 구비한 휴대용 단말기의 입력 방법 및 장치
CN103168302B (zh) * 2010-10-20 2018-08-17 日本电气株式会社 数据处理终端、数据搜索方法以及存储控制程序的非瞬时计算机可读介质
US20120159383A1 (en) 2010-12-20 2012-06-21 Microsoft Corporation Customization of an immersive environment
US20120159395A1 (en) 2010-12-20 2012-06-21 Microsoft Corporation Application-launching interface for multiple modes
US8689123B2 (en) 2010-12-23 2014-04-01 Microsoft Corporation Application reporting in an application-selectable user interface
US8612874B2 (en) 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US9423951B2 (en) 2010-12-31 2016-08-23 Microsoft Technology Licensing, Llc Content-based snap point
CN102609098A (zh) * 2011-01-19 2012-07-25 北京三星通信技术研究有限公司 一种移动终端、移动终端的键盘及其使用方法
US20120182220A1 (en) * 2011-01-19 2012-07-19 Samsung Electronics Co., Ltd. Mobile terminal including an improved keypad for character entry and a usage method thereof
US9021397B2 (en) * 2011-03-15 2015-04-28 Oracle International Corporation Visualization and interaction with financial data using sunburst visualization
US9383917B2 (en) 2011-03-28 2016-07-05 Microsoft Technology Licensing, Llc Predictive tiling
US10275153B2 (en) * 2011-05-19 2019-04-30 Will John Temple Multidirectional button, key, and keyboard
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US20120304132A1 (en) 2011-05-27 2012-11-29 Chaitanya Dev Sareen Switching back to a previously-interacted-with application
US8893033B2 (en) 2011-05-27 2014-11-18 Microsoft Corporation Application notifications
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9104440B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
KR101861318B1 (ko) * 2011-06-09 2018-05-28 삼성전자주식회사 터치 스크린을 구비한 기기의 인터페이스 제공 장치 및 방법
US9202297B1 (en) * 2011-07-12 2015-12-01 Domo, Inc. Dynamic expansion of data visualizations
US9792017B1 (en) 2011-07-12 2017-10-17 Domo, Inc. Automatic creation of drill paths
US10001898B1 (en) 2011-07-12 2018-06-19 Domo, Inc. Automated provisioning of relational information for a summary data visualization
US9026944B2 (en) 2011-07-14 2015-05-05 Microsoft Technology Licensing, Llc Managing content through actions on context based menus
US9582187B2 (en) 2011-07-14 2017-02-28 Microsoft Technology Licensing, Llc Dynamic context based menus
US9086794B2 (en) 2011-07-14 2015-07-21 Microsoft Technology Licensing, Llc Determining gestures on context based menus
US9746995B2 (en) 2011-07-14 2017-08-29 Microsoft Technology Licensing, Llc Launcher for context based menus
US8687023B2 (en) 2011-08-02 2014-04-01 Microsoft Corporation Cross-slide gesture to select and rearrange
US20130057587A1 (en) 2011-09-01 2013-03-07 Microsoft Corporation Arranging tiles
US10353566B2 (en) 2011-09-09 2019-07-16 Microsoft Technology Licensing, Llc Semantic zoom animations
US9557909B2 (en) 2011-09-09 2017-01-31 Microsoft Technology Licensing, Llc Semantic zoom linguistic helpers
US8922575B2 (en) 2011-09-09 2014-12-30 Microsoft Corporation Tile cache
US9146670B2 (en) 2011-09-10 2015-09-29 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
US9244802B2 (en) 2011-09-10 2016-01-26 Microsoft Technology Licensing, Llc Resource user interface
US8933952B2 (en) 2011-09-10 2015-01-13 Microsoft Corporation Pre-rendering new content for an application-selectable user interface
US8490008B2 (en) 2011-11-10 2013-07-16 Research In Motion Limited Touchscreen keyboard predictive display and generation of a set of characters
US9715489B2 (en) 2011-11-10 2017-07-25 Blackberry Limited Displaying a prediction candidate after a typing mistake
US9310889B2 (en) 2011-11-10 2016-04-12 Blackberry Limited Touchscreen keyboard predictive display and generation of a set of characters
US9652448B2 (en) 2011-11-10 2017-05-16 Blackberry Limited Methods and systems for removing or replacing on-keyboard prediction candidates
US9122672B2 (en) 2011-11-10 2015-09-01 Blackberry Limited In-letter word prediction for virtual keyboard
US8869068B2 (en) * 2011-11-22 2014-10-21 Backplane, Inc. Content sharing application utilizing radially-distributed menus
US9223472B2 (en) 2011-12-22 2015-12-29 Microsoft Technology Licensing, Llc Closing applications
US9557913B2 (en) 2012-01-19 2017-01-31 Blackberry Limited Virtual keyboard display having a ticker proximate to the virtual keyboard
EP2618248B1 (fr) 2012-01-19 2017-08-16 BlackBerry Limited Virtual keyboard providing an indication of received input
US9128605B2 (en) 2012-02-16 2015-09-08 Microsoft Technology Licensing, Llc Thumbnail-image selection of applications
EP2631768B1 (fr) 2012-02-24 2018-07-11 BlackBerry Limited Portable electronic device including a touch-sensitive display and method of controlling same
DE112012000189B4 (de) 2012-02-24 2023-06-15 Blackberry Limited Touchscreen keyboard for providing word predictions in partitions of the touchscreen keyboard in close association with candidate letters
US9223497B2 (en) * 2012-03-16 2015-12-29 Blackberry Limited In-context word prediction and word correction
KR101323281B1 (ko) * 2012-04-06 2013-10-29 Korea University Research and Business Foundation Input device and character input method
US9201510B2 (en) 2012-04-16 2015-12-01 Blackberry Limited Method and device having touchscreen keyboard with visual cues
US10025487B2 (en) 2012-04-30 2018-07-17 Blackberry Limited Method and apparatus for text selection
US9354805B2 (en) 2012-04-30 2016-05-31 Blackberry Limited Method and apparatus for text selection
US9292192B2 (en) 2012-04-30 2016-03-22 Blackberry Limited Method and apparatus for text selection
US9207860B2 (en) 2012-05-25 2015-12-08 Blackberry Limited Method and apparatus for detecting a gesture
US9116552B2 (en) 2012-06-27 2015-08-25 Blackberry Limited Touchscreen keyboard providing selection of word predictions in partitions of the touchscreen keyboard
US9524290B2 (en) 2012-08-31 2016-12-20 Blackberry Limited Scoring predictions based on prediction length and typing speed
US9063653B2 (en) 2012-08-31 2015-06-23 Blackberry Limited Ranking predictions based on typing speed and typing confidence
US9195368B2 (en) 2012-09-13 2015-11-24 Google Inc. Providing radial menus with touchscreens
US9261989B2 (en) * 2012-09-13 2016-02-16 Google Inc. Interacting with radial menus for touchscreens
CN103713809B (zh) * 2012-09-29 2017-02-01 China Mobile Communications Corporation Method and device for dynamically generating a ring menu on a touch screen
US20140092100A1 (en) * 2012-10-02 2014-04-03 Afolio Inc. Dial Menu
USD744506S1 (en) * 2012-10-29 2015-12-01 Robert E Downing Display screen with icon for predictor computer program
USD726741S1 (en) * 2012-12-05 2015-04-14 Lg Electronics Inc. Television screen with graphical user interface
US10192238B2 (en) 2012-12-21 2019-01-29 Walmart Apollo, Llc Real-time bidding and advertising content generation
USD749606S1 (en) * 2012-12-27 2016-02-16 Lenovo (Beijing) Co., Ltd. Display screen with graphical user interface
USD716819S1 (en) * 2013-02-27 2014-11-04 Microsoft Corporation Display screen with graphical user interface
US20140281991A1 (en) * 2013-03-18 2014-09-18 Avermedia Technologies, Inc. User interface, control system, and operation method of control system
US9450952B2 (en) 2013-05-29 2016-09-20 Microsoft Technology Licensing, Llc Live tiles without application-code execution
TWI488104B (zh) * 2013-05-16 2015-06-11 Acer Inc Electronic device and method for controlling the same
US9201589B2 (en) * 2013-05-21 2015-12-01 Georges Antoine NASRAOUI Selection and display of map data and location attribute data by touch input
USD755240S1 (en) 2013-06-09 2016-05-03 Apple Inc. Display screen or portion thereof with graphical user interface
USD819649S1 (en) 2013-06-09 2018-06-05 Apple Inc. Display screen or portion thereof with graphical user interface
USD744529S1 (en) * 2013-06-09 2015-12-01 Apple Inc. Display screen or portion thereof with icon
US20140380223A1 (en) * 2013-06-20 2014-12-25 Lsi Corporation User interface comprising radial layout soft keypad
USD746831S1 (en) 2013-09-10 2016-01-05 Apple Inc. Display screen or portion thereof with graphical user interface
USD793438S1 (en) * 2013-09-13 2017-08-01 Nikon Corporation Display screen with transitional graphical user interface
USD826271S1 (en) 2013-09-13 2018-08-21 Nikon Corporation Display screen with transitional graphical user interface
KR102206053B1 (ko) * 2013-11-18 2021-01-21 Samsung Electronics Co., Ltd. Electronic device and method for changing the input mode according to the input tool
GB2520700B (en) * 2013-11-27 2016-08-31 Texthelp Ltd Method and system for text input on a computing device
US10180768B1 (en) * 2014-03-19 2019-01-15 Symantec Corporation Techniques for presenting information on a graphical user interface
WO2015153890A1 (fr) 2014-04-02 2015-10-08 Hillcrest Laboratories, Inc. Systems and methods for touchscreens associated with a display
KR102298602B1 (ko) 2014-04-04 2021-09-03 Microsoft Technology Licensing, LLC Expandable application representation
KR102107275B1 (ko) 2014-04-10 2020-05-06 Microsoft Technology Licensing, LLC Collapsible shell cover for a computing device
WO2015154276A1 (fr) 2014-04-10 2015-10-15 Microsoft Technology Licensing, Llc Slider cover for a computing device
TWI603255B (zh) * 2014-05-05 2017-10-21 志勇無限創意有限公司 Handheld device and input method thereof
JP1535035S (fr) * 2014-05-25 2015-10-13
US10678412B2 (en) 2014-07-31 2020-06-09 Microsoft Technology Licensing, Llc Dynamic joint dividers for application windows
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
US10592080B2 (en) 2014-07-31 2020-03-17 Microsoft Technology Licensing, Llc Assisted presentation of application windows
USD753696S1 (en) 2014-09-01 2016-04-12 Apple Inc. Display screen or portion thereof with graphical user interface
USD765114S1 (en) 2014-09-02 2016-08-30 Apple Inc. Display screen or portion thereof with graphical user interface
USD753697S1 (en) 2014-09-02 2016-04-12 Apple Inc. Display screen or portion thereof with graphical user interface
US10642365B2 (en) 2014-09-09 2020-05-05 Microsoft Technology Licensing, Llc Parametric inertia and APIs
CN106662891B (zh) 2014-10-30 2019-10-11 Microsoft Technology Licensing, LLC Multi-configuration input device
USD907657S1 (en) 2015-03-30 2021-01-12 Domino's Ip Holder, Llc Pizza order display panel with a transitional graphical user interface
US9980304B2 (en) 2015-04-03 2018-05-22 Google Llc Adaptive on-demand tethering
EP3286915B1 (fr) 2015-04-23 2021-12-08 Apple Inc. Digital viewfinder user interface for multiple cameras
KR101728045B1 (ko) * 2015-05-26 2017-04-18 Samsung Electronics Co., Ltd. Medical image display apparatus and method for providing a user interface on the apparatus
USD806739S1 (en) * 2015-06-10 2018-01-02 Citibank, N.A. Display screen portion with a transitional user interface of a financial data viewer and launcher application
US10831337B2 (en) * 2016-01-05 2020-11-10 Apple Inc. Device, method, and graphical user interface for a radial menu system
USD811420S1 (en) * 2016-04-01 2018-02-27 Google Llc Display screen portion with a transitional graphical user interface component
USD804502S1 (en) 2016-06-11 2017-12-05 Apple Inc. Display screen or portion thereof with graphical user interface
US9912860B2 (en) 2016-06-12 2018-03-06 Apple Inc. User interface for camera effects
US10061435B2 (en) * 2016-12-16 2018-08-28 Nanning Fugui Precision Industrial Co., Ltd. Handheld device with one-handed input and input method
JP6311807B2 (ja) * 2017-02-03 2018-04-18 NEC Corporation Electronic device, information input method and information input control program for the electronic device, and portable terminal apparatus
US10671279B2 (en) * 2017-07-11 2020-06-02 Thumba Inc. Interactive virtual keyboard configured to use gestures and having condensed characters on a plurality of keys arranged approximately radially about at least one center point
US11455094B2 (en) * 2017-07-11 2022-09-27 Thumba Inc. Interactive virtual keyboard configured for gesture based word selection and having a plurality of keys arranged approximately radially about at least one center point
CN109213403A (zh) * 2018-08-02 2019-01-15 ZhongAn Information Technology Service Co., Ltd. Function menu control apparatus and method
USD916099S1 (en) * 2019-04-04 2021-04-13 Ansys, Inc. Electronic visual display with structure modeling tool graphical user interface
US11706521B2 (en) 2019-05-06 2023-07-18 Apple Inc. User interfaces for capturing and managing visual media
US10674072B1 (en) 2019-05-06 2020-06-02 Apple Inc. User interfaces for capturing and managing visual media
US11770601B2 (en) 2019-05-06 2023-09-26 Apple Inc. User interfaces for capturing and managing visual media
US11321904B2 (en) 2019-08-30 2022-05-03 Maxon Computer Gmbh Methods and systems for context passing between nodes in three-dimensional modeling
USD924912S1 (en) 2019-09-09 2021-07-13 Apple Inc. Display screen or portion thereof with graphical user interface
USD923021S1 (en) * 2019-09-13 2021-06-22 The Marsden Group Display screen or a portion thereof with an animated graphical user interface
USD914710S1 (en) * 2019-10-31 2021-03-30 Eli Lilly And Company Display screen with a graphical user interface
US11714928B2 (en) 2020-02-27 2023-08-01 Maxon Computer Gmbh Systems and methods for a self-adjusting node workspace
JP2023535212A (ja) * 2020-07-24 2023-08-16 Agilis Eyesfree Touchscreen Keyboards Ltd. Adaptable touch-screen keypad with dead zone
US11373369B2 (en) 2020-09-02 2022-06-28 Maxon Computer Gmbh Systems and methods for extraction of mesh geometry from straight skeleton for beveled shapes
USD1026014S1 (en) * 2021-09-14 2024-05-07 Bigo Technology Pte. Ltd. Display screen or portion thereof with animated graphical user interface
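
Many of the citing publications listed above (for example US9195368B2 "Providing radial menus with touchscreens", US9261989B2 "Interacting with radial menus for touchscreens" and US10831337B2 "Device, method, and graphical user interface for a radial menu system") share one core step: resolving a touch point into one of several angular input regions arranged about a centre. The sketch below is purely illustrative and is not taken from this application or from any cited patent; the function and parameter names are our own assumptions.

import math

# Illustrative only: map a touch point to a sector of a radial (pie) menu.
# All names here are hypothetical, not drawn from any cited patent.
def hit_test(touch_x, touch_y, center_x, center_y,
             inner_radius, outer_radius, num_sectors):
    """Return the index of the touched sector (sector 0 starts at the
    positive x-axis), or None if the touch misses the annular band."""
    dx = touch_x - center_x
    dy = touch_y - center_y
    distance = math.hypot(dx, dy)
    if not (inner_radius <= distance <= outer_radius):
        return None  # inside the central dead zone or outside the menu
    angle = math.atan2(dy, dx) % (2 * math.pi)  # normalise to [0, 2*pi)
    sector_span = 2 * math.pi / num_sectors
    return int(angle // sector_span)

For instance, with an eight-sector menu centred at (100, 100) and radii 20-90, hit_test(120, 80, 100, 100, 20, 90, 8) returns sector 7, while a touch at the exact centre returns None because it falls in the dead zone.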

Family Cites Families (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3967273A (en) * 1974-03-29 1976-06-29 Bell Telephone Laboratories, Incorporated Method and apparatus for using pushbutton telephone keys for generation of alpha-numeric information
EP0498082B1 (fr) * 1991-02-01 1998-05-06 Koninklijke Philips Electronics N.V. Apparatus for the interactive processing of objects
US5701424A (en) * 1992-07-06 1997-12-23 Microsoft Corporation Palladian menus and methods relating thereto
US5524196A (en) * 1992-12-18 1996-06-04 International Business Machines Corporation Method and system for manipulating data through a graphic user interface within a data processing system
US5596699A (en) * 1994-02-02 1997-01-21 Driskell; Stanley W. Linear-viewing/radial-selection graphic for menu display
US5543818A (en) * 1994-05-13 1996-08-06 Sony Corporation Method and apparatus for entering text using an input device having a small number of keys
US5574482A (en) * 1994-05-17 1996-11-12 Niemeier; Charles J. Method for data input on a touch-sensitive screen
US6008799A (en) * 1994-05-24 1999-12-28 Microsoft Corporation Method and system for entering data using an improved on-screen keyboard
WO1996009579A1 (fr) * 1994-09-22 1996-03-28 Izak Van Cruyningen Pop-up menus with directional gestures
US6295372B1 (en) * 1995-03-03 2001-09-25 Palm, Inc. Method and apparatus for handwriting input on a pen based palmtop computing device
US5721853A (en) * 1995-04-28 1998-02-24 Ast Research, Inc. Spot graphic display element with open locking and periodic animation
US5790820A (en) * 1995-06-07 1998-08-04 Vayda; Mark Radial graphical menuing system
US5745717A (en) * 1995-06-07 1998-04-28 Vayda; Mark Graphical menu providing simultaneous multiple command selection
JPH11508385A (ja) * 1996-04-19 1999-07-21 Philips Electronics N.V. Data processing apparatus
US6144378A (en) * 1997-02-11 2000-11-07 Microsoft Corporation Symbol entry system and methods
US5956035A (en) * 1997-05-15 1999-09-21 Sony Corporation Menu selection with menu stem and submenu size enlargement
US20060033724A1 (en) * 2004-07-30 2006-02-16 Apple Computer, Inc. Virtual input device placement on a touch screen user interface
US8479122B2 (en) * 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US7760187B2 (en) * 2004-07-30 2010-07-20 Apple Inc. Visual expander
US7614008B2 (en) * 2004-07-30 2009-11-03 Apple Inc. Operation of a computer with touch screen interface
US6448987B1 (en) * 1998-04-03 2002-09-10 Intertainer, Inc. Graphic user interface for a digital content delivery system using circular menus
US6271835B1 (en) * 1998-09-03 2001-08-07 Nortel Networks Limited Touch-screen input device
US6633746B1 (en) * 1998-11-16 2003-10-14 Sbc Properties, L.P. Pager with a touch-sensitive display screen and method for transmitting a message therefrom
US6507336B1 (en) * 1999-02-04 2003-01-14 Palm, Inc. Keyboard for a handheld computer
US6549219B2 (en) * 1999-04-09 2003-04-15 International Business Machines Corporation Pie menu graphical user interface
FI19992822A (fi) * 1999-12-30 2001-07-01 Nokia Mobile Phones Ltd Keyboard arrangement
US6707942B1 (en) * 2000-03-01 2004-03-16 Palm Source, Inc. Method and apparatus for using pressure information for improved computer controlled handwriting recognition, data entry and user authentication
US6646633B1 (en) * 2001-01-24 2003-11-11 Palm Source, Inc. Method and system for a full screen user interface and data entry using sensors to implement handwritten glyphs
US6671170B2 (en) * 2001-02-07 2003-12-30 Palm, Inc. Miniature keyboard for a hand held computer
US6724370B2 (en) * 2001-04-12 2004-04-20 International Business Machines Corporation Touchscreen user interface
US7246329B1 (en) * 2001-05-18 2007-07-17 Autodesk, Inc. Multiple menus for use with a graphical user interface
US6683599B2 (en) * 2001-06-29 2004-01-27 Nokia Mobile Phones Ltd. Keypads style input device for electrical device
GB0116083D0 (en) * 2001-06-30 2001-08-22 Koninkl Philips Electronics Nv Text entry method and device therefor
US7036090B1 (en) * 2001-09-24 2006-04-25 Digeo, Inc. Concentric polygonal menus for a graphical user interface
US6950795B1 (en) * 2001-10-11 2005-09-27 Palm, Inc. Method and system for a recognition system having a verification recognition system
US6765556B2 (en) * 2001-11-16 2004-07-20 International Business Machines Corporation Two-key input per character text entry apparatus and method
US20050162395A1 (en) * 2002-03-22 2005-07-28 Erland Unruh Entering text into an electronic communications device
JP4160365B2 (ja) * 2002-11-07 2008-10-01 Renesas Technology Corp. Electronic component for high-frequency power amplification and wireless communication system
WO2005008899A1 (fr) * 2003-07-17 2005-01-27 Xrgomics Pte Ltd Letter and word choice text input method for keyboards and reduced keyboard systems
US9024884B2 (en) * 2003-09-02 2015-05-05 Apple Inc. Touch-sensitive electronic apparatus for media applications, and methods therefor
US7404146B2 (en) * 2004-05-27 2008-07-22 Agere Systems Inc. Input device for portable handset
US7487147B2 (en) * 2005-07-13 2009-02-03 Sony Computer Entertainment Inc. Predictive user interface
US7574672B2 (en) * 2006-01-05 2009-08-11 Apple Inc. Text entry interface for a portable communication device

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5474294A (en) * 1990-11-08 1995-12-12 Sandeen; Lowell Electronic apparatus and method for playing a game
US5926178A (en) * 1995-06-06 1999-07-20 Silicon Graphics, Inc. Display and control of menus with radial and linear portions
US5664896A (en) * 1996-08-29 1997-09-09 Blumberg; Marvin R. Speed typing apparatus and method
EP0996883B1 (fr) * 1997-02-07 2002-11-13 Najib Chelly Method and device for guided manual input of symbols
EP0860765A1 (fr) * 1997-02-19 1998-08-26 Stephan Dipl.-Ing. Helmreich Input device and method for data processing devices
WO2002088919A2 (fr) * 2001-04-27 2002-11-07 Ulrich Frey Input device for a computer system
US20040104896A1 (en) * 2002-11-29 2004-06-03 Daniel Suraqui Reduced keyboards system using unistroke input and having automatic disambiguating and a recognition method using said system
DE102004031659A1 (de) * 2004-06-17 2006-06-08 Volkswagen Ag Operating element for a motor vehicle
US20070079258A1 (en) * 2005-09-30 2007-04-05 Hon Hai Precision Industry Co., Ltd. Apparatus and methods of displaying a roundish-shaped menu

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD721084S1 (en) 2012-10-15 2015-01-13 Square, Inc. Display with graphic user interface
US10503359B2 (en) 2012-11-15 2019-12-10 Quantum Interface, Llc Selection attractive interfaces, systems and apparatuses including such interfaces, methods for making and using same
US10901578B2 (en) 2013-10-01 2021-01-26 Quantum Interface Llc Selection attractive interfaces, systems and apparatuses including such interfaces, methods for making and using same
US11221748B2 (en) 2014-06-04 2022-01-11 Quantum Interface, Llc Apparatuses for selection objects in Virtual or Augmented Reality environments
US9971492B2 (en) 2014-06-04 2018-05-15 Quantum Interface, Llc Dynamic environment for object and attribute display and interaction
WO2015188011A1 (fr) * 2014-06-04 2015-12-10 Quantum Interface, Llc. Dynamic environment for object and attribute display and interaction
US11599260B2 (en) 2014-06-04 2023-03-07 Quantum Interface, Llc Apparatuses for attractive selection of objects in real, virtual, or augmented reality environments and methods implementing the apparatuses
US11886694B2 (en) 2014-06-04 2024-01-30 Quantum Interface Llc Apparatuses for controlling unmanned aerial vehicles and methods for making and using same
US11775074B2 (en) 2014-10-01 2023-10-03 Quantum Interface, Llc Apparatuses, systems, and/or interfaces for embedding selfies into or onto images captured by mobile or wearable devices and method for implementing same
US11205075B2 (en) 2018-01-10 2021-12-21 Quantum Interface, Llc Interfaces, systems and apparatuses for constructing 3D AR environment overlays, and methods for making and using same
US11663820B2 (en) 2018-01-10 2023-05-30 Quantum Interface Llc Interfaces, systems and apparatuses for constructing 3D AR environment overlays, and methods for making and using same
US11972609B2 (en) 2018-01-10 2024-04-30 Quantum Interface Llc Interfaces, systems and apparatuses for constructing 3D AR environment overlays, and methods for making and using same
US11226714B2 (en) 2018-03-07 2022-01-18 Quantum Interface, Llc Systems, apparatuses, interfaces and implementing methods for displaying and manipulating temporal or sequential objects
US11550444B2 (en) 2018-03-07 2023-01-10 Quantum Interface Llc Systems, apparatuses, interfaces and implementing methods for displaying and manipulating temporal or sequential objects

Also Published As

Publication number Publication date
TW200821904A (en) 2008-05-16
US20070256029A1 (en) 2007-11-01

Similar Documents

Publication Publication Date Title
US20070256029A1 (en) Systems And Methods For Interfacing A User With A Touch-Screen
JP7575435B2 (ja) Handwriting input on an electronic device
US10928993B2 (en) Device, method, and graphical user interface for manipulating workspace views
US20200192568A1 (en) Touch screen electronic device and associated user interface
US8264471B2 (en) Miniature character input mechanism
US9619139B2 (en) Device, method, and storage medium storing program
US9256366B2 (en) Systems and methods for touch-based two-stage text input
US10379626B2 (en) Portable computing device
WO2011158641A1 (fr) Information processing terminal and operation control method for same
US20110175826A1 (en) Automatically Displaying and Hiding an On-screen Keyboard
US20160132119A1 (en) Multidirectional button, key, and keyboard
US10387033B2 (en) Size reduction and utilization of software keyboards
WO2012101711A1 (fr) Input device, input method, and computer program
US11221756B2 (en) Data entry systems
US20130300664A1 (en) Providing a vertical candidate bar with an on-screen keyboard
WO2010089918A1 (fr) Electronic device and electronic device program
Billah et al. Accessible gesture typing for non-visual text entry on smartphones
JP5963291B2 (ja) Method and apparatus for entering symbols from a touch-sensitive screen
US20120287048A1 (en) Data input method and apparatus for mobile terminal having touchscreen
JP5395819B2 (ja) Input device, input method, and computer program
US20220129146A1 (en) Method for controlling a computer device for entering a personal code
KR20130008740A (ko) Mobile terminal and control method thereof
US10042543B2 (en) Indicating a word length using an input device
KR20090029551A (ko) Navigation apparatus and method for a mobile Internet browser using a touchpad
KR101207086B1 (ko) Fisheye-effect-based touchscreen Hangul input method and apparatus, and electronic device using the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 07718811
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 07718811
    Country of ref document: EP
    Kind code of ref document: A1