WO2017162493A1 - Control method for a touch-sensitive interface

Control method for a touch-sensitive interface

Info

Publication number
WO2017162493A1
WO2017162493A1 (PCT/EP2017/056060; EP2017056060W)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
user
touch sensitive
sensitive interface
input
Prior art date
Application number
PCT/EP2017/056060
Other languages
English (en)
Inventor
Rufus Eugen Deodatus DRIESSEN
Daan Anton van den Ende
Jurrien Alexander BROUWER
Bjorn Weggelaar
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2017162493A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05GCONTROL DEVICES OR SYSTEMS INSOFAR AS CHARACTERISED BY MECHANICAL FEATURES ONLY
    • G05G9/00Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously
    • G05G9/02Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only
    • G05G9/04Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously
    • G05G9/047Manually-actuated control mechanisms provided with one single controlling member co-operating with two or more controlled members, e.g. selectively, simultaneously the controlling member being movable in different independent ways, movement in each individual way actuating one controlled member only in which movement in two or more ways can occur simultaneously the controlling member being movable by hand about orthogonal axes, e.g. joysticks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • This invention relates to the field of control methods, and in particular to a control method for a touch sensitive interface.
  • Input devices for computing systems or other electronic apparatus are well known in the prior art.
  • Such input devices comprise an input interface (for example a button or light detector) so as to allow a user to interact with the computing system.
  • Known input devices include keyboards, touchpads and optical mice.
  • Touch sensitive interfaces typically comprise a touch sensitive screen having a protective layer.
  • The touch sensitive screen is adapted to detect a user's touch, as well as a location of the user's touch, using a capacitive or resistive touch sensing scheme.
  • Other touch sensitive methods will be well known by the person skilled in the art.
  • The touch sensitive interface usually comprises a display adapted to visually display an input area to a user, the input area being indicative of a functionality of the touch sensitive interface.
  • The user may touch a respective location of the touch sensitive interface (i.e. at the location of the displayed input area) so as to indicate a desired functionality. That is to say, an area of the touch sensitive interface is designated a functionality, and a user touches the area to interact with the functionality.
  • A touch sensitive interface acts as an input device for a computing system or other electronic apparatus, such as a mobile phone, automated teller machine or personal computer.
  • There is provided a control method for a touch sensitive interface, the control method comprising: determining a touch location of a user's touch of the touch sensitive interface; selecting a portion of the touch sensitive interface, the selected portion including the touch location; defining two or more input areas of the touch sensitive interface, each input area being contained in the selected portion; and assigning an input functionality to each of the two or more input areas.
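The four claimed steps can be sketched in code. This is a minimal illustrative sketch, not taken from the patent: the names (`InputArea`, `handle_touch`), the portion size, and the example functionalities are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class InputArea:
    x: float            # top-left corner, in interface coordinates (assumed)
    y: float
    width: float
    height: float
    functionality: str = ""   # assigned after the area is defined

def handle_touch(touch_x, touch_y, portion_size=120.0):
    """Select a portion around the touch and define two input areas in it."""
    # Steps 1-2: select a portion of the interface that contains the touch location.
    half = portion_size / 2
    px, py = touch_x - half, touch_y - half

    # Step 3: define two or more input areas contained in the selected portion.
    areas = [
        InputArea(px, py, half, portion_size),          # left half of the portion
        InputArea(px + half, py, half, portion_size),   # right half of the portion
    ]

    # Step 4: assign an input functionality to each input area.
    for area, func in zip(areas, ["volume_up", "volume_down"]):
        area.functionality = func
    return areas
```

Before any touch is determined, no `InputArea` exists; the areas and their functionalities come into being only when `handle_touch` runs, mirroring the claim that the selected portion carries no functionality prior to the touch.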
  • This provides a control method allowing a touch of a user to define the location of areas or zones of input functionality of a touch sensitive interface.
  • The user's touch thereby defines two or more areas of input functionality of the touch sensitive interface.
  • Functionalities may be assigned to a portion of the touch sensitive interface.
  • The portion of the touch sensitive interface to which the functionalities are assigned includes at least the location of the user's touch at the touch sensitive interface.
  • The assigned functionality of an input area may include inducing the touch sensitive interface to perform an action in response to a detected further user touch, a continued touch and/or the initial touch at the input area.
  • The action performed may be based on a characteristic of the initial, further and/or continued touch(es) at the input area.
  • The selection of a portion and the assigning of functionalities to defined input areas of the selected portion are only performed in response to a touch location being determined. It may therefore be understood that, prior to a touch location being determined, the portion of the touch sensitive interface that is subsequently selected (in response to the touch location) is not associated with any functionality, such that there are no input areas defined within the selected portion.
  • A user's touch thereby defines a position of the selected portion, as well as the input areas and their associated functionalities.
  • A particular functionality is thereby not limited to a single predetermined area of the touch sensitive interface, but rather may be dynamically positioned on the touch sensitive interface based on a location of the user's touch.
  • A user is not required to search for and find a suitable area of functionality (e.g. an input area) of the touch sensitive interface, but may rather arbitrarily touch the touch sensitive interface and have functionalities assigned to suitable areas of the input interface based on their touch.
  • The user may touch the touch sensitive interface at a preferred position, that is, a position most suitable for the user, and have functionalities automatically assigned to areas in the vicinity of at least the touched position.
  • Functionalities may be assigned to input areas of the touch sensitive interface that are in the vicinity of a location being touched by a user, thus negating a need for the user to relocate/reposition their touch in order to make use of the functionalities.
  • An assigned functionality may comprise the touch sensitive interface generating an input signal in response to a characteristic of a user's touch of the selected portion of the touch sensitive interface.
  • Such an input signal may be provided, for example, to external circuitry so as to allow control of that circuitry.
  • The input signal may be based on a specific characteristic of the touch at the input area.
  • The characteristic of a user's touch may comprise at least one of: a frequency of a user's touch of the input area; a number of user touches of the input area; a number of user touches of the input area in a predetermined time period; a time period between consecutive user touches of the input area; an amount of force applied by the user to the input area; a temperature of the user's touch of the input area; a location of at least one further touch in the input area; a movement of the user's touch of the input area; and a size of the user's touch of the input area.
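One of the listed characteristics, the number of user touches within a predetermined time period, can be derived from a stream of touch timestamps. This is an illustrative sketch under assumed names; the 0.5 s window is not from the patent.

```python
def taps_in_window(timestamps, window=0.5):
    """Count consecutive touches whose inter-touch gap stays below `window` seconds.

    `timestamps` is an ascending list of touch times (seconds) at one input area.
    """
    if not timestamps:
        return 0
    count = 1
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev <= window:
            count += 1        # touch belongs to the current tap sequence
        else:
            count = 1         # gap too long: start a new tap sequence
    return count
```

A functionality could then branch on the result, e.g. treating a count of 2 as a double tap.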
  • The control method may comprise removing the assigned functionality from an input area, or all assigned functionalities within a selected portion.
  • The assigned functionalities can be removed from each input area of the selected portion, such that the selected portion is no longer associated with any functionality.
  • The selected portion may then be considered as deselected.
  • The touch sensitive interface comprises an array of touch sensitive elements, each touch sensitive element being adapted to detect a user's touch.
  • The selecting of a portion of the touch sensitive interface may comprise selecting a set of two or more touch sensitive elements, the set comprising at least two touch sensitive elements, at least one of which has detected a user's touch.
  • One or more of the at least two touch sensitive elements may define an input area contained by the selected portion.
  • The touch sensitive interface may be associated with a set of predetermined functionalities.
  • The assigning of the functionality may comprise selecting at least two functionalities from the set of predetermined functionalities and assigning a selected functionality to each input area of the selected portion of the touch sensitive interface.
  • The assigning may comprise assigning one of a set of known functionalities to a respective input area of the selected portion of the touch sensitive interface.
  • The control method may further comprise determining a size of the user's touch, and selecting the portion based on the determined size of the user's touch.
  • The number and/or size of the input areas contained in the selected portion may depend upon the determined size of the user's touch.
  • The control method may, in embodiments, comprise: detecting a change in the touch location of the user's touch of the touch sensitive interface; reselecting a portion of the touch sensitive interface, the reselected portion including the changed touch location; redefining the two or more input areas, wherein the redefined input areas are contained by the reselected portion; and assigning the same functionality to each redefined input area.
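Reselecting the portion when the touch moves amounts to shifting every input area, with its assigned functionality, by the displacement of the touch. A hypothetical sketch; the tuple layout `(x, y, w, h, functionality)` is an assumption:

```python
def relocate_areas(areas, old_touch, new_touch):
    """Shift each input area by the displacement of the user's touch.

    `areas` is a list of (x, y, width, height, functionality) tuples;
    the same functionality string stays attached to each redefined area.
    """
    dx = new_touch[0] - old_touch[0]
    dy = new_touch[1] - old_touch[1]
    return [(x + dx, y + dy, w, h, f) for (x, y, w, h, f) in areas]
```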
  • The control method may further comprise: detecting a further touch location of a user's further touch of the touch sensitive interface, the further touch not being concurrent with the first touch; selecting a further portion of the touch sensitive interface, the selected further portion including the further touch location; defining two or more further input areas of the touch sensitive interface, each further input area being contained in the selected further portion; and assigning an input functionality to each of the two or more further input areas.
  • The selected portion of the touch sensitive interface does not include the further touch location; the selected further portion of the touch sensitive interface does not include the touch location; and the selected portion of the touch sensitive interface and the selected further portion do not overlap.
  • The touch sensitive interface may provide no visual indication to the user of the functionality of the two or more input areas.
  • The control method may comprise controlling the touch sensitive interface so as to show no visual indication to the user of the functionality of an input area, or of the location of the input areas with respect to the touch sensitive interface.
  • No visual indication is given prior to a user's touch, but a visual indication may be given after a user's touch, once the portion has been selected.
  • There may be provided a computer program product comprising computer program code means adapted to perform all of the steps of the control method disclosed herein when said program is run on a computing device having a processor.
  • There may also be provided a touch sensitive interface arrangement comprising such a computing system and a touch sensitive interface having a plurality of tactile output elements.
  • In optional embodiments, there may be provided a handheld device.
  • Embodiments may therefore be employed in devices that include touch sensitive user interfaces, such as mobile telephony devices, smart phones, tablet PCs, laptops, personal digital assistants, and the like.
  • Proposed embodiments may also be employed in user input devices/arrangements that are adapted to be contacted or manipulated by a user (for example by a user's hand) in order to provide an input.
  • Embodiments may be implemented in a hand-held joystick, steering wheel, keyboard, number pad, PIN-entry device, etc.
  • Figures 1A and 1B illustrate a touch sensitive interface according to a first embodiment
  • Figure 2 schematically depicts a touch sensitive interface according to a second embodiment
  • Figure 3 illustrates a touch sensitive interface according to a third embodiment
  • Figure 4 is a schematic illustration of a touch sensitive interface according to a fourth embodiment
  • Figure 5 diagrammatically depicts a control system for a touch sensitive interface according to an embodiment
  • Figure 6 illustrates a handheld device having a touch sensitive interface according to an embodiment
  • Figure 7 illustrates a user input device having a touch sensitive interface according to an embodiment
  • Figure 8 illustrates a control method for a touch sensitive interface according to an embodiment.
  • The invention provides concepts for controlling a touch sensitive interface.
  • Proposed methods include, for example, detecting a location of a user's touch, and selecting a portion or region of the touch sensitive interface including at least the location of the user's touch. Input areas of the selected portion are defined, and a functionality can then be assigned to each input area of the selected portion, allowing a user to interact with at least the defined input areas of a selected portion of the touch sensitive interface.
  • Embodiments may therefore provide a touch sensitive interface that dynamically adapts to a user's touch, thereby catering for different user-touch positions, orientations, or sizes.
  • Illustrative embodiments may therefore provide concepts for adapting a touch sensitive interface to provide functionality at appropriate locations/positions based on the location(s)/position(s) at which a user touches the touch sensitive surface. Dynamic user-based adaptation or optimization may therefore be provided by proposed embodiments.
  • FIG. 1A illustrates a touch sensitive interface 100 according to a first embodiment of the invention.
  • The touch sensitive interface comprises a touch location sensor 120 adapted to detect a location 950 of a user's touch 900.
  • The touch sensitive interface 100 comprises an optional protective screen 110.
  • Protective screens are well known in the prior art and may be formed of, for example, SiO2 or plastics.
  • The touch location sensor 120 may be adapted to detect a location of a user's touch using any known method of touch detection so as to determine a touch location.
  • The touch location sensor 120 may be adapted to detect a user's touch using a resistive touch screen panel, a capacitive panel, surface acoustic wave technology, acoustic pulse recognition and the like. Other such techniques for detecting a location of a user's touch are well known in the prior art.
  • A touch location 950 of a user's touch 900 is determined.
  • The touch location is the location of a user's touch relative to the touch sensitive interface 100. This may be determined, for example, through use of the touch location sensor 120 adapted to detect a user's touch.
  • The user's touch 900 may be performed using a digit (e.g. a finger or thumb), a user's palm, another part of the human anatomy, a stylus or any other known touching methodology.
  • A portion 130 of the touch sensitive interface 100 is selected, the selected portion including the touch location.
  • A portion or segment of the touch sensitive interface containing at least that touch location is selected. In this way, the touch location defines the selected portion.
  • Two or more input areas 141, 142 of the selected portion are defined. Thus two or more areas or regions contained in the selected portion of the touch sensitive interface are selected. Defining the two or more input areas may comprise selecting two or more regions included within the selected portion based on, for example, a size of the selected portion, a location of the user's touch within the selected portion, a functionality to be assigned to the input area, an identified digit of the user's touch and so on. The two or more input areas need not be of the same size and/or shape.
  • A selected portion may be divided into two or more areas or regions, each of which may be considered as a different input area of the selected portion.
  • Input areas of a particular size may be positioned within a selected portion according to a predetermined pattern.
  • The predetermined pattern may be designed, for example, to simulate the normal or conventional spread of a user's fingers, such that each input area may be automatically positioned beneath a user's fingers or in predictable locations (relative to the touch location).
  • A user may touch the input interface, and have input areas automatically defined based on a position of the touch location.
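A predetermined pattern simulating the spread of a user's fingers can be expressed as fixed offsets relative to the touch location. The offsets and area size below are illustrative assumptions, not values from the patent:

```python
# Assumed fingertip offsets (dx, dy) relative to a detected thumb touch,
# roughly sketching a natural finger spread above and to the side of the thumb.
FINGER_PATTERN = [(-30, -80), (0, -95), (30, -90), (60, -70)]

def place_areas(touch_x, touch_y, size=40):
    """Position one square input area per pattern offset, centred on each offset point."""
    return [(touch_x + dx - size / 2, touch_y + dy - size / 2, size, size)
            for dx, dy in FINGER_PATTERN]
```

Wherever the user touches, the same predictable arrangement of input areas appears around that point.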
  • An input functionality is assigned or otherwise designated to each identified or selected input area 141, 142 included in the selected portion 130 of the touch sensitive interface 100.
  • Two or more areas or regions of the touch sensitive interface, which are defined by the touch location, are thus assigned a functionality.
  • A first input area 141 may be assigned a first functionality, and a second input area 142 may be assigned a second, different functionality.
  • A functionality may be thought of as an action performed by the touch sensitive interface in response to a characteristic of a touch 900 at the input area of the touch sensitive interface.
  • A characteristic of a touch (e.g. of at least one further touch or a continued touch) may thereby trigger the assigned functionality.
  • The touch location sensor 120 may be adapted to detect a number of further touches of the user within an input area 141 of the selected portion 130 of the touch sensitive interface 100.
  • In response to a single further touch, the touch sensitive interface may generate a first signal; in response to multiple further touches, the touch sensitive interface may generate a second, different signal.
  • The functionality may be one of controlling a characteristic of the touch sensitive interface or a characteristic of an electric/electronic device controlled by the touch sensitive interface. In this way, a characteristic of a touch at the selected portion may thereby define or adjust a characteristic of the touch sensitive interface and/or an associated electric/electronic device.
  • A functionality may be one of providing a certain input symbol or string to a processing device, such that an input area may act, for example, as a key on a keyboard.
  • The selected portion may be associated with a virtual keyboard, such that a plurality of input areas may be defined, each associated with a different input symbol or key.
  • The location and position of the two or more input areas may be defined by a touch location of a user's touch 900.
  • Functionality may thereby be dynamically provided to a user based on an arbitrarily positioned touch location of a user.
  • The location of an input functionality of the touch sensitive interface is thereby not restricted to any one location of the input interface, but may be positioned relative to the location of a user's touch.
  • A single touch (e.g. a single touch of a single finger) of the touch-sensitive interface may define a location of more than one input functionality.
  • Two or more functionalities may thereby be assigned in the vicinity of the user's touch.
  • The position of the input areas relative to the touch location may follow a predetermined pattern (i.e. be predictable).
  • A user is therefore not required to search for or identify a location of input functionality, but may rather arbitrarily touch the input interface and have a functionality assigned in a predictable or intuitive manner.
  • The assigned functionality need not depend on the touch location, such that a functionality may be assigned to an input area positioned in any portion of the touch sensitive interface, without regard to where the touch is positioned relative to the touch sensitive interface.
  • The assigned functionality may be removed from the two or more input areas, for example, in response to a user action, based on a sensed characteristic of the touch, following a release of the user's touch, or after a predetermined period of time without detecting a user touch in the selected portion.
  • The selected portion may be subsequently or simultaneously deselected.
  • Another functionality may then be assigned to the two or more input areas of the selected portion (e.g. a functionality may change over time, or in response to a user action).
  • The selected portion (and thereby the input areas) may be relocated with respect to the touch sensitive interface, such that the location of the input areas, defining areas of functionality, may be relocated. This may be due, for example, to a further touch location being detected (e.g. within or outside of the selected portion), causing the selected portion to relocate to the position of the further touch location.
  • The selected portion may be 're-centred' around the position of the further touch location.
  • The input interface 100 may track the location of a user's touch 900, such that as a user's touch 900 moves about the input interface 100, so the positions of the input areas and their assigned functionalities move about the input interface 100.
  • A location of the user's touch 900, even if moved, may define the position of the input areas.
  • Embodiments provide for more than one functionality to be assigned to a portion of the user interface, the portion being selected based on a location of a user's touch.
  • A selected portion may be assigned, from a plurality of assignable input functionalities, two or more input functionalities or areas of input capability.
  • More than two input areas may be defined in the selected portion, for example three or more input areas, or five or more input areas.
  • Each input area is assigned a respective functionality.
  • The touch sensitive interface 100 is adapted to detect a second touch location 960 associated with a user's second, concurrent touch 910 of the touch sensitive interface 100.
  • The second touch 910 may be understood to occur at substantially the same time as the first touch, or whilst the first touch is ongoing (e.g. whilst the user continues to touch the input interface).
  • The selecting of the portion of the input interface may comprise selecting a portion that includes both the touch location 950 and the second, concurrent touch location 960.
  • The size and/or shape of the selected portion may be defined by the touch location 950 and the second touch location 960.
  • The size and/or position of input areas within the selected portion may also be based on the touch location 950 and the second touch location 960.
  • The configuration of a selected portion, and the distribution of the input areas within the selected portion, may be dependent upon touch locations associated with more than one of the user's touches.
  • The distance between the two touch locations may define the dimensions, a size or a shape of the selected portion.
  • A distance between the two touch locations may define a distribution of input areas within the selected portion, and/or a size or shape of input areas within the selected portion.
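One simple way to let two touch locations define the selected portion is a padded bounding box, so that the portion's dimensions scale with the distance between the touches. An illustrative sketch; the margin value is an assumption:

```python
def portion_from_two_touches(t1, t2, margin=20.0):
    """Return (x, y, width, height) of a portion containing both touches.

    t1 and t2 are (x, y) touch locations; the box is their bounding
    rectangle padded by `margin` on every side.
    """
    x0 = min(t1[0], t2[0]) - margin
    y0 = min(t1[1], t2[1]) - margin
    x1 = max(t1[0], t2[0]) + margin
    y1 = max(t1[1], t2[1]) + margin
    return (x0, y0, x1 - x0, y1 - y0)
```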
  • A characteristic (e.g. pressure or temperature) of the user's touch or the user's second touch may define a distribution, shape and/or size of the input areas within the selected portion. In other or further embodiments, a characteristic of the user's touch or the user's second touch may define a size and/or shape of the selected portion containing the input areas.
  • Touch locations may define a position of particular input areas contained in the selected portion.
  • An input area may be provided at each respective touch location.
  • A selected portion may be associated with a virtual or soft keyboard, and a first touch location may define the position of a first input area representing an 'F' key and the second touch location may define the position of a second input area representing a 'J' key of the virtual keyboard.
  • Remaining input areas associated with the soft keyboard (e.g. the other keys) may then be positioned relative to these two input areas.
  • A size of the input areas and/or the selected portion may depend upon the distance between the input area representing the 'F' key and the input area representing the 'J' key (e.g. more proximate 'F' and 'J' keys may provide smaller input areas).
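The 'F'/'J' anchoring can be sketched for one row of a soft keyboard. The layout assumption below (that 'F' and 'J' are three key-pitches apart on a QWERTY home row, so the key pitch is a third of the F-J distance) is illustrative and not stated in the patent:

```python
HOME_ROW = "ASDFGHJKL"

def home_row_positions(f_touch, j_touch):
    """Place each home-row key centre from the 'F' and 'J' anchor touches.

    f_touch and j_touch are (x, y) touch locations. 'F' is index 3 and
    'J' is index 6 in the row, so the horizontal key pitch is a third of
    the F-J distance; smaller F-J distances yield a smaller keyboard.
    """
    pitch = (j_touch[0] - f_touch[0]) / 3.0
    y = (f_touch[1] + j_touch[1]) / 2.0
    x0 = f_touch[0] - 3 * pitch            # x of the 'A' key
    return {key: (x0 + i * pitch, y) for i, key in enumerate(HOME_ROW)}
```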
  • Appropriately distributed input areas may thereby be readily provided to a user.
  • Additional input areas may be positioned based on an estimated location of a user's other fingers.
  • each input area of the selected portion defines a button or soft key of the input interface, such that buttons may be arbitrarily positioned about the input interface based on at least a touch location of a user's (first) touch.
  • the location of a button need not be limited to a predefined location, but may rather be positioned based on a location of a user's touch.
  • a user need not search and look for a position of a button of the input interface, but may rather touch the input interface, and have two or more buttons provided at a suitable location. This may increase the convenience and comfort of a user operating the touch sensitive interface.
  • a second portion of the touch sensitive interface 100 may be selected in response to a second touch (concurrent or not), and two or more additional input areas within the selected second portion may be defined.
  • a functionality is assigned to each identified additional input area in the selected second portion.
  • If a distance between a first touch location and a second touch location is below a predetermined threshold value, a single portion containing both the first touch location and the second touch location may be selected. If the distance is greater than the predetermined threshold value, two different portions may be selected.
  • If a first touch is received more than a predetermined period of time before a second touch, two different portions are selected. If a second touch is received less than a predetermined period of time after a first touch, a single portion may be selected.
  • the pressure threshold may be determined by an absolute pressure in one or both touch locations or a relative pressure between the two touch locations.
  • If a pressure of the first and/or second touch is greater than a predetermined amount, then two different portions may be selected.
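The distance-based and time-based decision rules above can be combined in a single check. The sketch below is hypothetical: the threshold values and argument names are assumptions, not values from the patent.

```python
import math

# Sketch of the one-portion-vs-two-portions decision: touches that are
# close together in both space and time share a single selected portion.

DIST_THRESHOLD = 50.0   # assumed units, e.g. pixels
TIME_THRESHOLD = 0.5    # assumed units, e.g. seconds

def count_portions(loc1, loc2, t1, t2):
    """Return 1 if both touches should share a single selected portion,
    2 if two separate portions should be selected."""
    if math.dist(loc1, loc2) > DIST_THRESHOLD:
        return 2          # touches far apart: separate portions
    if abs(t2 - t1) > TIME_THRESHOLD:
        return 2          # second touch arrived too late: separate portions
    return 1              # close in space and time: one shared portion

count_portions((10, 10), (20, 15), 0.0, 0.1)   # → 1
count_portions((10, 10), (200, 15), 0.0, 0.1)  # → 2
```

A pressure-based rule, as described above, could be added as a third condition in the same style.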
  • the second selected portion does not overlap with the (initial) selected portion 130, such that there may be considered to be two separate selected portions of the touch sensitive interface, each having its own plurality of input areas associated with a respective plurality of assigned functionalities.
  • the functionalities associated with the input areas of a respective selected portion are different from the functionalities associated with the input areas of another respective selected portion. In this way, a user may control more than one plurality of functionalities.
  • the touch sensitive interface may be adapted to detect any number of further touch locations, and select a respective number of further portions.
  • Respective input areas may be defined in each further selected portion, and may be assigned their own respective functionalities, which are preferably unique to that selected portion.
  • FIG. 2 illustrates a touch sensitive interface 200 according to a second embodiment of the invention.
  • the touch sensitive interface comprises a touch location sensor 120 and a plurality of touch characteristic sensors 240.
  • the touch sensitive interface 200 comprises a protective screen 110.
  • the touch location sensor 120 may operate in the same manner as described with reference to Figures 1A and 1B; that description shall not be repeated here for brevity.
  • the selected portion 130 of the touch sensitive element may comprise at least one touch characteristic sensor 240 adapted to detect a characteristic of the user's touch 900.
  • the touch characteristic sensor may comprise a thermometer adapted to detect a temperature of a user's touch, or a pressure sensor adapted to detect a pressure applied by a user in touching the touch sensitive interface.
  • selecting the two or more input areas of a portion of the touch sensitive interface may comprise selecting two or more touch characteristic sensors 240 associated with the touch location 950 of the user's touch 900.
  • For the purposes of explanation, only a single touch characteristic sensor 240, corresponding to a single input area, is illustrated.
  • a touch characteristic sensor 240 corresponding to a touch location may be assigned a functionality. This allows the touch sensitive interface to perform an action based on at least a touch characteristic sensed by the touch characteristic sensor.
  • the touch characteristic sensor 240 is adapted to detect a pressure applied by the user touch.
  • the touch sensitive interface may be adapted to generate signals of varying magnitude.
  • the touch location sensor and the touch characteristic sensor may be used independently or in combination to detect a number of different characteristics of a user's touch.
  • the touch location sensor may be used to detect a frequency of a user's touch (e.g. how often the touch location sensor senses a touch in a particular input area).
  • other detectable characteristics of a user's touch of an input area include: a number of user touches of the input area (for example, a cumulative number of user touches); a number of user touches of the input area in a predetermined time period; a time period between consecutive user touches of the input area; an amount of force applied by the user to the input area; a temperature of the user's touch of the input area; a location of further touches in the input area; a movement of the user's touch of the input area; and a size of the user's touch of the input area.
  • an action performed (or an action to be performed) by the touch sensitive interface may be determined based on one or more such detected characteristics.
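Several of the listed characteristics (cumulative touch count, touches in a time window, interval between consecutive touches) can be derived from a simple log of touch timestamps. The representation below is an assumption for illustration; the patent does not prescribe one.

```python
# Sketch: deriving some listed touch characteristics from a list of
# timestamped touches of a single input area (times in seconds).

def touch_stats(timestamps, window):
    """timestamps: sorted touch times; window: length of the recent
    period, measured back from the most recent touch."""
    total = len(timestamps)
    recent = [t for t in timestamps if t >= timestamps[-1] - window]
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return {
        'cumulative_touches': total,
        'touches_in_window': len(recent),
        'mean_interval': sum(gaps) / len(gaps) if gaps else None,
    }

touch_stats([0.0, 0.4, 0.8, 3.0], window=1.0)
# 4 touches in total, 1 within the last second, mean interval ≈ 1.0 s
```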
  • FIG. 3 schematically depicts a touch sensitive interface 300 according to a third embodiment of the invention.
  • the touch sensitive interface comprises a touch location sensor 120 adapted to detect a location 950 of a user's touch 900 and an optional protective screen 110.
  • the touch location sensor 120 is formed of a plurality of touch sensitive elements 121 formed in an array. Each touch sensitive element is adapted to detect a user's touch.
  • the touch location 950 is defined by the at least one touch sensitive element 121a that detects or responds to the user's touch, such that the position of the user's touch is determined with respect to the array of touch sensitive elements.
  • the touch sensitive element(s) 121a which detect(s) a user's touch, defines the position of the touch location within the touch sensitive interface 120.
  • the array of touch sensitive elements 121 forming the touch location sensor 120 may be a single row of touch sensitive elements (one-dimensional), a plurality of rows of touch sensitive elements forming a matrix of touch sensitive elements (two-dimensional) or even distributed across three dimensions (three-dimensional).
  • the touch sensitive elements may be evenly distributed with respect to the array (i.e. there is a same distance between elements) or be distributed in a predetermined pattern (i.e. there need not be a same distance between touch sensitive elements).
  • each touch sensitive element 121 may comprise a respective capacitive or piezoresistive sensor, such that the touch location sensor comprises an array of capacitive or piezoresistive sensors.
  • Other touch sensitive elements 121 would be readily apparent to the skilled person.
  • each touch sensitive element 121 is an electroactive polymer (EAP).
  • Such EAPs may be adapted to simultaneously act as a touch characteristic sensor, for example, detecting a level of pressure applied by the user to the touch sensitive interface (i.e. at the touch sensitive element 121).
  • the touch sensitive element may simultaneously act as a touch characteristic sensor.
  • the selecting of the portion of the touch sensitive interface may comprise selecting at least the touch sensitive elements that detected a user's touch. In such an embodiment, it will be readily apparent that the selected portion will comprise the touch location 950 of the user's touch 900 (as the touch location may be defined by the touch sensitive element that detected the touch).
  • the selecting of the portion of the touch sensitive interface comprises selecting the touch sensitive element 121a that detected a user's touch and at least adjacent touch sensitive elements 121b, 121c.
  • Different touch sensitive elements may, for example, be associated with different input areas in the selected portion.
  • the selecting of the portion of the touch sensitive interface comprises selecting a plurality of touch sensitive elements 121, and associating different touch sensitive elements 121 with different input areas.
  • the selected plurality of touch sensitive elements may form a predetermined pattern.
  • the predetermined pattern may form any shape or selection of touch sensitive elements in the array.
  • the pattern may be substantially circular, such that the selected portion may radiate out from the touch location.
  • the pattern is substantially rectangular or square.
  • the pattern comprises a plurality of touch sensitive elements selected from a single row of touch sensitive elements in the array.
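The substantially circular pattern described above can be expressed as an index selection over a two-dimensional element array. This sketch works in array indices rather than physical units, and its names are illustrative only.

```python
# Sketch: selecting a roughly circular pattern of touch sensitive
# elements around the element (centre) that detected the user's touch.

def circular_selection(rows, cols, centre, radius):
    """Return the (row, col) indices of all elements in a rows x cols
    array whose index distance from centre is at most radius."""
    cr, cc = centre
    return [
        (r, c)
        for r in range(rows)
        for c in range(cols)
        if (r - cr) ** 2 + (c - cc) ** 2 <= radius ** 2
    ]

circular_selection(5, 5, centre=(2, 2), radius=1)
# → [(1, 2), (2, 1), (2, 2), (2, 3), (3, 2)]
```

A rectangular or single-row pattern, also mentioned above, would simply use a different index predicate.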
  • the touch sensitive interface may comprise an array of touch sensitive elements 121 adapted to detect a user's touch.
  • a portion 130 of the touch sensitive interface is selected comprising at least a set of touch sensitive elements 121a, 121b, 121c.
  • the set of touch sensitive elements comprises at least the touch sensitive element(s) that detected a user's touch; in the scenario depicted in Figure 3, this is touch sensitive element 121a.
  • Each touch sensitive element 121 may be associated with a different input area of the selected portion.
  • Figure 4 schematically depicts a fourth embodiment of a touch sensitive interface 400 according to an embodiment of the invention.
  • the touch sensitive interface 400 comprises the same elements as the touch sensitive interface described with reference to Figure 3, namely an array of touch sensitive elements 121 adapted to detect a touch location 950 of a user's touch 900 and an optional protective screen 110.
  • Each touch sensitive element 121 is formed of a touch sensor 122 and a touch characteristic sensor 123.
  • the touch sensor 122 is adapted to detect a user's touch.
  • the touch characteristic sensor is adapted to sense a characteristic of a user's touch 900 at the touch sensitive element 121.
  • the touch characteristic sensor 123 may be adapted to detect a temperature, a pressure and/or a movement (e.g. whether the touch is moved).
  • the touch sensitive interface 400 may comprise additional characteristic sensors which are shared between two or more touch sensitive elements.
  • a shared characteristic element 124 may be shared between at least two touch sensitive elements.
  • a shared characteristic element 124 allows for sensors which may be larger than the footprint of a single touch sensitive element to be provided, and sense a characteristic of a user touch. This may allow for characteristic elements having a higher degree of accuracy to sense touch characteristics attributable to a user's touch 900.
  • the selected portion of the touch sensitive interface 400 comprises any additional characteristic sensors associated with a touch sensitive element detecting the user's touch 900.
  • the additional characteristic sensors 124 may be associated with an area (or volume) of the touch sensitive interface in which the touch location 950 is positioned.
  • each defined input area of the selected portion of the touch sensitive interface may comprise, for example, at least one touch sensitive element and any additional characteristic sensors associated with the at least one touch sensitive element.
  • FIG. 5 schematically illustrates a control system 500 for a touch sensitive interface according to an embodiment of the invention.
  • the touch sensitive interface 120 comprises an array of touch sensitive elements 121 each adapted to output a signal.
  • a touch location determiner 510 is adapted to receive signals output by each touch sensitive element 121 and determine whether a touch sensitive element is receiving a user's touch, and where the user's touch is located so as to determine a touch location.
  • the touch location determiner 510 may determine which touch sensitive element 121 is sensing a user's touch and, based on a known position of that touch sensitive element in the array, determine a touch location.
  • the touch sensitive interface 120 also comprises a plurality of characteristic elements 124 adapted to detect a value of a characteristic of the user's touch (e.g. "a touch characteristic"). It may be understood that each characteristic element detects a characteristic at a respective location of the touch sensitive interface. In other words, each characteristic element generates a signal indicative of at least a value of a touch characteristic associated with the respective characteristic element.
  • the control system 500 comprises a portion selector 520 and a multiplexor arrangement 530.
  • the portion selector is adapted to determine a portion of the touch sensitive interface to be selected based on the received signals from the touch sensitive elements, and is further adapted to identify at least two input areas contained by the selected portion. Based on the determined portion, the portion selector generates at least one control signal indicative of at least information associated with the selected portion and the at least two input areas. For example, the portion selector may generate a first control signal 525a indicative of information of a first input area, and a second control signal 525b indicative of information of a second input area.
  • the multiplexor arrangement 530 is adapted to receive an input signal from each characteristic element 124, each input signal representing a detected characteristic of a user's touch by the respective characteristic element 124.
  • the multiplexor arrangement 530 selects which input signals are to be passed or forwarded (by the multiplexor arrangement 530) based on the control signal 525.
  • the multiplexor arrangement 530 selects input signals representing a characteristic of a user's touch within a particular input area of the selected portion.
  • the multiplexor arrangement 530 is adapted to output at least two multiplexed signals 535a, 535b, the multiplexed signals representing input signals associated with the two or more input areas of the selected portion.
  • control signals 525a, 525b may be indicative of which characteristic elements 124 are in which input area of the selected portion, such that the multiplexor forwards signals associated with the indicated characteristic elements.
  • control system 500 selects a portion of the touch sensitive interface 100 based on a received touch location, and identifies two or more input areas within the selected portion. Based on the two or more input areas, a multiplexor arrangement 530 forwards two or more input signals (indicative of a value of a touch characteristic within each input area), such that input signals associated with the two or more input areas of the selected portion are forwarded.
  • a processing unit 540 of the control system 500 receives the forwarded input signals (i.e. the multiplexed signals 535a, 535b), and determines an action to be performed based on the forwarded input signals.
  • values of a touch characteristic detected by characteristic elements 124 in the selected portion are used in the determination of an action to be performed.
  • the determining of an action to be performed based on at least a characteristic of a touch in the selected portion provides a functionality to each input area of the selected portion.
  • each touch sensitive element 121 in the array of touch sensitive elements 120 provides an input signal to the multiplexor arrangement 530, such that the multiplexor arrangement 530 may further forward signals output by the touch sensitive elements 121.
  • Such signals may, for example, be forwarded to the processing unit 540, such that the input may be generated further on the basis of signals received from the touch sensitive elements 121.
  • information associated with a presence of a touch in the input areas of the selected portion may be forwarded by the multiplexor arrangement 530.
  • the multiplexor arrangement 530 does not receive input from the characteristic elements 124, but rather only receives input from the array of touch sensitive elements 121. In such embodiments, only received inputs associated with the identified input areas in the selected portion of the touch sensitive arrangement are forwarded by the multiplexor arrangement 530.
  • the control system 500 thereby generates an input for a processing unit 540, where the processing unit 540 determines an action to be performed based on the generated input.
  • the input comprises at least multiplexed signal 535, which is representative of at least one detected touch characteristic within an input area of a selected portion of a touch sensitive interface.
  • the detected touch characteristic may be based on a presence of a user's touch within the touch location (e.g. a frequency of a user's touch or a time between consecutive user touches).
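The forwarding role of the multiplexor arrangement can be illustrated with a small sketch. The data structures (element-id strings, sets of ids per input area) are assumptions for illustration, not the patent's representation.

```python
# Sketch of the multiplexor arrangement's behaviour: from all
# characteristic-element signals, forward only those whose elements
# fall within one of the identified input areas of the selected portion.

def multiplex(signals, input_areas):
    """signals: {element_id: sensed value}; input_areas: a list of sets
    of element ids, one set per identified input area. Returns one
    multiplexed signal (a dict of forwarded values) per input area."""
    return [
        {eid: signals[eid] for eid in area if eid in signals}
        for area in input_areas
    ]

signals = {'e1': 0.2, 'e2': 0.9, 'e3': 0.1, 'e4': 0.0}
areas = [{'e1', 'e2'}, {'e4'}]
multiplex(signals, areas)
# forwards {'e1': 0.2, 'e2': 0.9} for the first area and {'e4': 0.0}
# for the second; 'e3' lies outside both input areas and is dropped
```

The forwarded dicts correspond to the multiplexed signals 535a, 535b that the processing unit 540 would consume.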
  • control system may be formed in a single processor, such that each element of the control system represents an action performed by the single processor or a collection of instructions in computer programmable code.
  • control system 500 may be readily provided to any known touch sensitive interface.
  • the size and/or shape of the selected portion 130 and/or input areas may be determined based on a characteristic of the initial touch 900.
  • the touch sensitive interface 100 and/or control system 500 may be adapted to detect a size of a user's touch 900 at the touch location 950 and select a portion 130 based on the detected size, such that the size and/or shape of the selected portion is dependent upon the size of the user's touch at the touch location.
  • the selected portion preferably does not comprise the entirety of the touch sensitive interface, such that only a part, section or segment of the touch sensitive interface is selected as a portion.
  • the two or more input areas need not span the entirety of the selected portion, but may instead each comprise only a portion or section of the selected portion.
  • the selected portion 130 is dynamic, such that the position and/or size of the selected portion 130 (relative to the touch sensitive interface) may change over time.
  • the change in the selected portion may, for example, be due to a detected change in the user's touch location, such that the selected portion may follow a movement in a user's touch.
  • the size, shape, and/or position of the input areas of the selected portion may change accordingly.
  • each input area remains in a same location relative to the selected portion if the selected portion changes position relative to the input interface.
  • the present invention allows for functionalities to not be limited to only a single predetermined area of the touch sensitive interface, but rather may be dynamically positioned on the touch sensitive interface based on a location of the user's touch.
  • a user is not required to search and find a suitable area of functionality (e.g. an input area) of the touch sensitive interface, but rather functionality may be assigned to input areas of a portion selected based on a user's touch in any arbitrary location of the interface.
  • the touch sensitive interface 100 and/or control system 500 is adapted to determine a size and/or shape of the user's touch 900. This may, for example, be determined based on a number of touch sensitive elements 121 of a touch sensitive array that respond to a user's touch. In other embodiments, this may be performed by a dedicated touch size sensor (not shown) adapted to determine a size of a user's touch.
  • the touch sensitive interface 100 and/or control system 500 may be adapted to perform the selecting a portion based on the detected size and/or shape of the user's touch. In further embodiments, the touch sensitive interface 100 and/or control system 500 may be adapted to select the two or more input areas of the selected portion based on the detected size and/or shape of the user's touch.
  • the touch sensitive interface 100 and/or control system 500 is adapted to predict which digit (e.g. finger or thumb) of a user performed a touch. This may, for example, be performed by comparing a size of a touch to a set of predetermined sizes (each associated with a known digit). In other examples, this is performed by comparing a shape of a touch to a set of predetermined shapes (each associated with a respective known digit).
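The size-comparison variant of the digit prediction described above amounts to a nearest-value lookup. The sketch below is hypothetical; the digit names and contact-area values are invented for illustration and are not from the patent.

```python
# Sketch: predicting which digit performed a touch by comparing the
# measured touch size to a set of predetermined sizes, each associated
# with a known digit. All numeric values here are assumed.

DIGIT_SIZES = {            # typical contact area, assumed units (mm^2)
    'little finger': 40,
    'index finger': 70,
    'thumb': 120,
}

def predict_digit(touch_size):
    """Return the digit whose predetermined size is closest."""
    return min(DIGIT_SIZES, key=lambda d: abs(DIGIT_SIZES[d] - touch_size))

predict_digit(110)  # → 'thumb'
predict_digit(45)   # → 'little finger'
```

The shape-based variant mentioned above would work the same way, with a shape-similarity measure in place of the absolute size difference.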
  • an assigned functionality may be a selected functionality from a set of predetermined functionalities.
  • the touch sensitive interface 100 and/or control system 500 may be associated with a plurality of functionalities from which a functionality is selected and assigned to each of a plurality of input areas of a selected portion.
  • the functionality may be assigned based on an order of touch, for example, a first input area associated with a first touch is assigned the first functionality in an ordered set of functionalities whereas a second input area associated with a second touch is assigned the second functionality in an ordered set of functionalities.
  • functionalities are assigned based on a characteristic of the touch (e.g. a shape, a determined digit, an initial pressure, an initial temperature and so on).
  • a control system may determine which functionality (from a set of functionalities) to assign to an input area based on the identified characteristic.
  • if it is determined that the first touch is of a first pressure, the functionalities assigned to the input areas may be a first set of functionalities, whereas if it is determined that the first touch is of a second (e.g. greater) pressure, the functionalities assigned to the input areas may be a second, different set of functionalities.
  • the assigned functionalities are based on a characteristic of the touch sensitive interface or other connected apparatus (e.g. an electronic device for which the touch sensitive interface is an input).
  • the assigned functionalities are assigned based on other criteria such as a time, a battery level, a weather condition, traffic information, environmental factors and so on.
  • the touch sensitive interface 200 is an input interface for further apparatus such as an electric/electronic device (e.g. a mobile phone or an electric shaver).
  • the functionalities assigned to the input areas of the selected portion may each be one of adjusting a characteristic of such an electric/electronic device (for example, adjusting a volume of a mobile phone or a vibrating frequency of an electric shaver). Characteristics of the electric/electronic device may be adjusted based on characteristics of respective touches at the two or more input areas of the selected portion (e.g. number of touches).
  • for example, a "double-touch" (i.e. two touches received within a predetermined time period) and a "triple-touch" (i.e. three touches received within a predetermined time period) may each cause a different adjustment.
  • a touch sensitive interface 100 may operate as an input interface or input device for further circuitry or additional apparatus.
  • the additional apparatus may comprise one or more of: a mobile phone; an electric shaver; automobile/automotive computers; a personal computer; a laptop; domestic electronic appliances; personal care product; personal hygiene electronic goods; cameras; video cameras and so on.
  • Other ways in which a touch sensitive interface may be used as an input device will be readily apparent to the person skilled in the art.
  • the assigned functionalities may each be one of altering or defining a characteristic of such further circuitry or additional apparatus.
  • the selected portion does not comprise the entirety of the touch sensitive interface, such that only a part, section or segment of the touch sensitive interface is selected as a portion.
  • the touch sensitive interface may be a touch sensitive interface 600 of a handheld device, such as a mobile or cellular phone 6.
  • a portion 630 of the touch sensitive interface may be selected based on a touch location 650 of a user's touch.
  • a shape or pattern of the portion 630 is elongate, such that the portion extends above and below the touch location 650.
  • the functionalities assigned to input areas of the selected portion 630 of the touch sensitive interface 600 may be one of altering a volume level of the mobile device (i.e. a characteristic of the mobile device).
  • the volume level may be altered depending upon a location of where the user touches within the selected portion 630, that is, depending upon which input area a user touches within the selected portion.
  • the volume level may be increased.
  • the volume level may be decreased.
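The elongate volume-control portion described above maps the touch position within the portion to a volume change: touching above the original touch location raises the volume, touching below lowers it. A minimal sketch, in which the step size and the 0 to 100 volume range are assumptions:

```python
# Sketch: adjusting a volume level depending on where the user touches
# within the elongate selected portion, relative to the original touch.

def adjust_volume(volume, touch_y, origin_y, step=5):
    """Smaller y means higher on screen. Returns a volume in [0, 100]."""
    if touch_y < origin_y:        # touched above the original location
        volume += step
    elif touch_y > origin_y:      # touched below the original location
        volume -= step
    return max(0, min(100, volume))

adjust_volume(50, touch_y=100, origin_y=200)  # → 55 (touched above)
adjust_volume(50, touch_y=300, origin_y=200)  # → 45 (touched below)
```

Replacing the fixed step with one proportional to the distance from the origin would approximate the slider-like behaviour mentioned later in the text.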
  • the assigned functionalities to the input areas of the selected portion may be removed, such that the selected portion is no longer associated with the functionalities, and/or the selected portion may be deselected.
  • the assigned functionalities may be removed and/or the selected portion may be deselected after a predetermined period of time has elapsed without detecting a user's touch (e.g. after two seconds since a user last touched the selected portion).
  • the selected portion may be deselected if a user touches a predetermined position of the touch sensitive interface, or based on a characteristic of the user touch (e.g. detecting a touch for more than 5 seconds causes the assigned functionalities to be removed and/or selected portion to be deselected).
  • a user may perform a second touch (using a different finger for example) of the touch sensitive interface.
  • the second touch may be made together with the first touch, whilst the selected portion is selected (e.g. active and responding to a characteristic of a user's touch).
  • a second touch location 680 associated with the user's second touch may be determined, and a second portion 670 of the touch sensitive interface may be selected.
  • a respective functionality may be assigned to each defined input area of the second portion.
  • assigned functionalities may each define a particular brightness of the mobile device 6 (e.g. a brightness of the touch sensitive interface 600).
  • a plurality of input areas may be circularly arranged around the touch location 680, each input area being associated with a defined brightness of the touch sensitive interface.
  • a brightness of the touch sensitive interface may be adjusted depending upon the location of a user's touch within the selected portion about the touch location (e.g. an angle from the vertical).
  • the input areas of a selected portion may be configured to substantially simulate or replicate a slider or continuous input adjuster.
  • the brightness may be altered based on a pressure applied by a user at an input area of the second selected portion 670. For example, if a user applies a large pressure, brightness may be increased; if a user applies a lower pressure, brightness may be decreased.
  • Other input areas of the second selected portion may be associated with different characteristics of the input interface, for example, a temperature or color of light output by the mobile device 6.
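The circular brightness control described above, where the angle from the vertical about the touch location selects a brightness, can be sketched as follows. The linear angle-to-brightness mapping and the screen coordinate convention (y growing downwards) are assumptions for illustration.

```python
import math

# Sketch: mapping the clockwise angle from vertical, measured around
# the original touch location, to a brightness level of 0..100%.

def brightness_from_angle(centre, touch):
    """centre: original touch location; touch: current touch position."""
    dx = touch[0] - centre[0]
    dy = centre[1] - touch[1]          # screen y grows downwards
    angle = math.degrees(math.atan2(dx, dy)) % 360.0
    return angle / 360.0 * 100.0       # full turn spans 0..100%

brightness_from_angle((100, 100), (100, 50))   # straight up → 0%
brightness_from_angle((100, 100), (150, 100))  # to the right → 25%
```

This is one way the circularly arranged input areas could approximate a continuous (dial-like) input adjuster, as the following bullet suggests.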
  • the touch sensitive interface is adapted to determine a second touch location 680 of a user's second touch of the touch sensitive interface.
  • the second touch should be understood to be an additional touch, an auxiliary touch or a supplementary touch, such that the user's touch and the user's second touch may be performed simultaneously at different locations of the touch sensitive interface, as previously described.
  • a second portion may be selected based on the second touch location.
  • only a single portion is selected based on a (first) touch location and a second touch location.
  • the size and/or shape of the single portion may be based on the first and/or second touch location, and/or characteristics of the first and/or second touch.
  • the control system may be adapted to select a second portion 670 of the touch sensitive interface 600 (i.e. an additional portion of the touch sensitive interface), the second portion including the second touch location 680.
  • Input areas are defined within the second portion.
  • a second set of functionalities preferably different to a first set of functionalities assigned to input areas of a first portion, may be assigned to the input areas of the second portion 670 of the touch sensitive interface.
  • the touch sensitive interface 700 may be an input interface for a user input device 7, for example, a joystick.
  • the input device 7 comprises a handle 70 around which the touch sensitive interface is mounted.
  • the touch sensitive interface 700 may be three-dimensional, such that it may extend in more than two dimensions.
  • Input areas contained or delimited by a selected portion of the touch sensitive interface 700 are each assigned a functionality, where the functionality may be one of generating an input signal to be provided by the input device 7.
  • an input signal e.g. for a computer or aircraft computing system
  • the user's touch detected by the touch sensitive interface is the touch of a user's palm.
  • the user may grip or wrap around the handle upon which the touch sensitive interface is provided, and input areas are defined based on the position of the user's palm.
  • a user may be provided with functionalities at input areas defined at positions of the touch sensitive interface associated with their fingers (based on a determined location of the palm).
  • Other embodiments of input devices having a touch sensitive interface according to an aspect of the invention will be readily apparent to the skilled person.
  • an input device may be a steering wheel of an automobile, where the assigned functionality of each input area of the selected portion is one of controlling a parameter of the car (e.g. gear, speed and so on).
  • the input device is a handle for a personal care product (e.g. a hairbrush, a razor, an epilator, a toothbrush and so on), where the assigned functionality of each input area of the selected portion is one of controlling a parameter of the personal care product (e.g. a vibrating frequency, vibrating intensity and so on).
  • Figure 8 illustrates a control method 8 for a touch sensitive interface according to an embodiment of the invention.
  • the control method 8 comprises: determining 810 a first touch location of a user's first touch of the touch sensitive interface; selecting 820 a portion of the touch sensitive interface, the selected portion including the first touch location; defining 830 two or more input areas of the touch sensitive interface, each input area being contained in the selected portion; and assigning 840 a functionality to each of the two or more input areas.
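The four steps of control method 8 can be sketched end to end. This is a minimal illustration only: the square portion shape, the two-area split and the rectangle representation all stand in for the implementation choices discussed throughout the description.

```python
# Sketch of control method 8 (steps 810-840). Rectangles are given as
# (left, top, right, bottom) tuples; the geometry here is assumed.

def control_method(touch_location, functionalities, half_width=30):
    x, y = touch_location                     # 810: determine touch location
    # 820: select a portion including the touch location (a square here)
    portion = (x - half_width, y - half_width, x + half_width, y + half_width)
    # 830: define two input areas contained in the selected portion
    upper = (portion[0], portion[1], portion[2], y)   # upper half
    lower = (portion[0], y, portion[2], portion[3])   # lower half
    # 840: assign a functionality to each input area
    return {upper: functionalities[0], lower: functionalities[1]}

areas = control_method((100, 100), ['volume_up', 'volume_down'])
# the upper input area (70, 70, 130, 100) is assigned 'volume_up',
# the lower input area (70, 100, 130, 130) is assigned 'volume_down'
```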
  • a computer program product comprising computer program code means adapted to perform all of the following steps when said program is run on a computing device having a processor: determining a touch location of a user's touch of a touch sensitive interface; selecting a portion of the touch sensitive interface, the selected portion including the touch location; defining two or more input areas of the touch sensitive interface, each input area being contained in the selected portion; and assigning a functionality to each of the two or more input areas.
  • the touch sensitive interface may comprise a plurality of tactile output elements.
  • the control method may further include generating a control signal based on a user action, the control signal being adapted to control a pattern of an output characteristic of at least one tactile output element included in the selected portion.
  • the control signal may be adapted to control a temporal pattern of a single tactile output element in the selected portion, such that a characteristic of a tactile output of the tactile output element changes over time.
  • the control signal may be adapted to control a spatial pattern of a plurality of tactile output elements in the selected portion. In other words, the control signal may control which two or more tactile output elements in a selected portion are active at a given moment.
	• the control method may comprise detecting a characteristic of a user action; and generating the control signal for the tactile output elements based on the detected characteristic.
	• generating the control signal may comprise comparing the detected characteristic to a set of predetermined characteristic values; and generating the control signal based on a result of the comparison.
  • the tactile output elements may be adapted to provide feedback about a particular user characteristic to a user.
	• the user characteristic may relate to, for example, a characteristic of the user's touch or another characteristic of the user (e.g. how a user holds the input interface or connected device).
  • a touch sensitive interface may determine a characteristic of a user's touch and generate a control signal based on the characteristic, the control signal controlling an output of at least one tactile output element.
  • Possible concepts may comprise a touch sensitive interface adapted to provide feedback about a user characteristic, without providing input areas.
  • a touch sensitive interface having a plurality of tactile output elements may be positioned on or in a handle of a toothbrush.
  • a toothbrush may comprise a user action sensor adapted to detect a user-caused movement of the toothbrush.
  • a detected movement of a toothbrush may be compared to a set of known movements, and a most similar known movement (to the detected movement) may be selected.
	• a pattern of an output characteristic of the plurality of tactile output elements may be controlled. In this way, a correct or incorrect movement may be indicated to the user (i.e. the selected known movement is associated with a correct or incorrect movement) via tactile feedback.
  • a movement of a device coupled to an input interface according to a possible concept may be detected, and corresponding feedback may be provided to a user through a plurality of tactile output elements.
  • the touch sensitive interface 100 comprises a visual display (not shown) adapted to provide a visual output to a user.
  • the visual display provides no visual indication of the functionality of the input areas of the selected portion to the user.
  • the visual display provides no indication of a functionality to a user prior to a touch location being determined, or a user touch being detected.
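As a rough illustration only (not part of the application), the four claimed steps — determining a first touch location, selecting a portion around it, defining input areas within that portion, and assigning a functionality to each area — could be sketched as follows for a hypothetical grid of touch cells. The grid size, portion radius, function names and the decrease/increase assignment are all assumptions made for the sketch:

```python
# Minimal sketch of steps 810-840 over an assumed grid of touch cells.
GRID_W, GRID_H = 10, 4          # hypothetical touch-cell grid on the handle
PORTION_RADIUS = 1              # cells around the first touch to reserve

def select_portion(touch, radius=PORTION_RADIUS):
    """Step 820: select the cells around the first touch location."""
    tx, ty = touch
    return {(x, y)
            for x in range(max(0, tx - radius), min(GRID_W, tx + radius + 1))
            for y in range(max(0, ty - radius), min(GRID_H, ty + radius + 1))}

def define_input_areas(portion, touch):
    """Steps 830-840: split the portion into two areas with functions."""
    tx, _ = touch
    left = {c for c in portion if c[0] < tx}     # e.g. "decrease intensity"
    right = {c for c in portion if c[0] >= tx}   # e.g. "increase intensity"
    return {"decrease": left, "increase": right}

touch = (5, 2)                          # step 810: first touch location
portion = select_portion(touch)
areas = define_input_areas(portion, touch)
assert touch in portion                 # selected portion includes the touch
assert areas["decrease"].isdisjoint(areas["increase"])
```

Since the display provides no visual indication of the areas, the split is anchored to wherever the user first touches, which is the point of selecting the portion around the detected location.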
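The comparison-based signal generation described above (detect a characteristic of the user action, compare it with a set of predetermined values, generate the control signal from the result) could be sketched as follows. The use of pressure as the characteristic, the threshold values and the pulse patterns are illustrative assumptions, not taken from the application:

```python
# Hedged sketch: compare a detected touch characteristic (here, pressure)
# against predetermined values and derive a temporal pattern for one
# tactile output element. Thresholds and patterns are assumptions.
PREDETERMINED = {"light": 0.2, "firm": 0.6}   # pressure thresholds (arbitrary units)

def control_signal(pressure):
    """Return a temporal pattern as (element_on, duration_ms) segments."""
    if pressure < PREDETERMINED["light"]:
        return [(True, 50), (False, 450)]     # short blips: touch too light
    if pressure < PREDETERMINED["firm"]:
        return [(True, 200), (False, 200)]    # steady pulse: touch is fine
    return [(True, 500)]                      # continuous: touch too firm
```

A spatial pattern could be produced the same way, with the comparison result selecting which elements of the selected portion are active rather than when a single element is active.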
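The toothbrush example — compare a detected movement with a set of known movements, select the most similar one, and drive a pattern of the tactile output elements accordingly — could be sketched like this. The two-dimensional feature vectors, movement names and four-element layout are hypothetical; a real device would derive the features from the user action sensor (e.g. an accelerometer):

```python
# Hedged sketch: nearest-match selection of a known movement, then a
# spatial feedback pattern indicating correct/incorrect. All values assumed.
import math

# known movements: name -> (feature vector, is_correct)
KNOWN_MOVEMENTS = {
    "circular":  ((1.0, 0.0), True),    # assumed (stroke frequency, tilt)
    "scrubbing": ((3.0, 0.5), False),
}

def closest_movement(detected):
    """Select the known movement nearest (Euclidean) to the detected one."""
    return min(KNOWN_MOVEMENTS,
               key=lambda name: math.dist(detected, KNOWN_MOVEMENTS[name][0]))

def feedback_pattern(name):
    """Spatial pattern over four tactile elements: confirm or warn."""
    if KNOWN_MOVEMENTS[name][1]:
        return [True, True, True, True]    # all elements on: movement correct
    return [True, False, True, False]      # alternating: movement incorrect
```

The pattern returned here would then be applied to the tactile output elements in the selected portion, giving the user feedback without any visual indication.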

Abstract

The present invention relates to a method for controlling a touch sensitive interface (100). The method comprises detecting a location (950) of a user's touch (900) and selecting a portion (130) of the touch sensitive interface comprising at least the location of the user's touch. Input areas (141, 142) of the selected portion are defined, and a functionality is assigned to each input area contained in the selected portion, thereby enabling a user to interact with at least the selected portion of the touch sensitive interface.
PCT/EP2017/056060 2016-03-23 2017-03-15 Control method for a touch sensitive interface WO2017162493A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
EP16162034.9 2016-03-23
EP16162023 2016-03-23
EP16162034 2016-03-23
EP16162023.2 2016-03-23

Publications (1)

Publication Number Publication Date
WO2017162493A1 (fr) 2017-09-28

Family

ID=58360978

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/056060 WO2017162493A1 (fr) Control method for a touch sensitive interface

Country Status (1)

Country Link
WO (1) WO2017162493A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090195959A1 (en) * 2008-01-31 2009-08-06 Research In Motion Limited Electronic device and method for controlling same
US20110115711A1 (en) * 2009-11-19 2011-05-19 Suwinto Gunawan Method and Apparatus for Replicating Physical Key Function with Soft Keys in an Electronic Device
DE202012101741U1 (de) * 2011-12-19 2012-05-29 Atmel Corporation Touch sensor device with multiple surfaces and detection of user activities
US20120188174A1 (en) * 2011-01-24 2012-07-26 Migos Charles J Device, Method, and Graphical User Interface for Navigating and Annotating an Electronic Document
US20130149964A1 (en) * 2011-12-07 2013-06-13 At&T Intellectual Property I, L.P. Extending the Functionality of a Mobile Device
US20160357412A1 (en) * 2015-06-07 2016-12-08 Apple Inc. Device, Method, and Graphical User Interface for Providing and Interacting with a Virtual Drawing Aid


Similar Documents

Publication Publication Date Title
Whitmire et al. Digitouch: Reconfigurable thumb-to-finger input and text entry on head-mounted displays
KR101535320B1 (ko) Method for generating gestures tailored to a hand placed on a surface
Cheng et al. iGrasp: grasp-based adaptive keyboard for mobile devices
US8830181B1 (en) Gesture recognition system for a touch-sensing surface
US8188985B2 (en) Method and apparatus continuing action of user gestures performed upon a touch sensitive interactive display in simulation of inertia
JP6249316B2 (ja) Information processing method, apparatus, and device
US7352365B2 (en) Flexible computer input
EP2817693B1 (fr) Dispositif de reconnaissance de gestes
US8144129B2 (en) Flexible touch sensing circuits
US10296091B2 (en) Contextual pressure sensing haptic responses
US5808605A (en) Virtual pointing device for touchscreens
US20110248927A1 (en) Multi-mode touchscreen user interface for a multi-state touchscreen device
US20110109577A1 (en) Method and apparatus with proximity touch detection
US20120068946A1 (en) Touch display device and control method thereof
GB2470654A (en) Data input on a virtual device using a set of objects.
US20110248946A1 (en) Multi-mode prosthetic device to facilitate multi-state touch screen detection
EP2474890A1 (fr) Configuration de clavier virtuel par positionnement des doigts au repos sur un écran multi-tactile, calibrant ainsi la position des touches
US20120075202A1 (en) Extending the touchable area of a touch screen beyond the borders of the screen
KR20120016015A (ko) 디스플레이 장치 및 그의 오브젝트 이동 방법
KR20100121183A (ko) 휴대용 단말기 및 그 사용자 인터페이스 제어 방법
KR20160098752A (ko) 디스플레이 장치 및 디스플레이 방법 및 컴퓨터 판독가능 기록매체
US20140210739A1 (en) Operation receiver
WO2017162493A1 (fr) Control method for a touch sensitive interface
KR20230010734A (ko) Operating unit with a touch-sensitive operating surface
CN102455848A (zh) Input control method for an on-screen touch keyboard, and electronic device

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17712058

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17712058

Country of ref document: EP

Kind code of ref document: A1