WO2012098361A1 - Apparatus and method for improved user interaction in electronic devices
- Publication number
- WO2012098361A1 (PCT/GB2012/000054)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- region
- gesture
- touch sensitive
- gesture input
- boundary
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- the present invention relates to apparatus, methods and computer programs providing improved user interaction with portable electronic devices.
- many electronic devices have touch sensitive areas that accept user input, and this provides ease of use by enabling intuitive user interaction with the device.
- Touch-sensitive areas are often implemented to allow users to select an item by finger touch or other similar input such as a stylus, and can initiate an operation by gestures, avoiding the need for a separate data input apparatus such as a mouse or trackpad.
- Various technologies have been used to implement touch-sensitivity, including capacitive touch sensors which measure a change in capacitance resulting from the effect of touching the screen, resistive touch sensitive areas that measure a change in electrical current resulting from pressure on the touch sensitive areas reducing the gap between conductive layers, and other technologies.
- a phone may provide touch sensitive areas on a display screen which displays graphical user interface objects, such as virtual buttons, that may change depending on the application that is running on the phone.
- the functionality associated with the virtual buttons can be invoked through direct manipulation of the buttons by performing a certain type of gesture on the location of the touch sensitive area where the virtual button is located.
- One type of gesture is a user tapping the location of the virtual button on the screen thereby activating the virtual button.
- many current touch sensitive areas are capable of accepting and recognizing particular gestures in addition to a tap gesture.
- WO 2009137419 relates to a touch-sensitive display screen that is enhanced by a touch-sensitive control area that extends beyond the edges of the display screen.
- the touch-sensitive area outside the display screen is referred to as a "gesture area".
- the present invention provides an electronic device for receiving gesture input, the device comprising: a first touch sensitive region for receiving user gesture input and a second touch sensitive region for receiving user gesture input; a processor for interpreting the user gesture input; a boundary region located between the first touch sensitive region and second touch sensitive region, wherein if a start of the gesture input is inside the first or second touch sensitive region and the completion of the gesture input is inside the boundary region, the processor is adapted to interpret that the gesture has been completed in the first or second touch sensitive region from which the gesture entered the boundary region, and if the start of the gesture input is inside the boundary region and the completion of the gesture input is inside the first or second touch sensitive region, the processor is adapted to interpret that the gesture input has started in the first or second touch sensitive region in which the gesture enters after exiting the boundary region.
- if the start of the gesture input is inside the first touch sensitive region, crosses the boundary region, and is completed in the second touch sensitive region, the processor is adapted to interpret that the gesture started in the first touch sensitive region and completed in the second touch sensitive region, and if the start of the gesture input is inside the second touch sensitive region, crosses the boundary region, and is completed in the first touch sensitive region, the processor is adapted to interpret that the gesture started in the second touch sensitive region and completed in the first touch sensitive region.
- the first touch sensitive region may have a different function to the second touch sensitive region.
- the boundary region may be a variable number of pixels, based on the gesture input to the first or second touch sensitive region.
- the boundary region has a height that is a predetermined number of pixels.
- the boundary region may have a first boundary threshold and a second boundary threshold, and the processor can be adapted to determine where the gesture input is to be regarded as occurring on the basis of which boundary threshold is crossed by the gesture input.
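- Purely as an illustration of this threshold logic, the Java sketch below classifies a gesture's start and end points against two boundary thresholds and re-attributes any point falling between them to the zone at the gesture's other end. The type and field names (BoundaryInterpreter, upperThresholdY, lowerThresholdY) are assumptions for illustration, not taken from the patent.

```java
// Illustrative sketch only: hypothetical names, not the patent's implementation.
enum Region { DISPLAY, GESTURE_AREA, BOUNDARY }

final class BoundaryInterpreter {
    private final int upperThresholdY; // threshold on the display-screen side
    private final int lowerThresholdY; // threshold on the gesture-control side

    BoundaryInterpreter(int upperThresholdY, int lowerThresholdY) {
        this.upperThresholdY = upperThresholdY;
        this.lowerThresholdY = lowerThresholdY;
    }

    // y grows downwards: display screen above the region, control area below.
    Region classify(int y) {
        if (y < upperThresholdY) return Region.DISPLAY;
        if (y > lowerThresholdY) return Region.GESTURE_AREA;
        return Region.BOUNDARY;
    }

    // Resolve the effective start/end regions of a gesture: a point lying in
    // the boundary region is attributed to the zone at the gesture's other end.
    Region[] interpret(int startY, int endY) {
        Region start = classify(startY);
        Region end = classify(endY);
        if (start == Region.BOUNDARY && end != Region.BOUNDARY) start = end;
        if (end == Region.BOUNDARY && start != Region.BOUNDARY) end = start;
        return new Region[] { start, end };
    }
}
```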
- the first touch sensitive region is a display screen.
- the second touch sensitive region may be adapted to receive gesture inputs which are interpreted by the processor to carry out commands on the device that are different from those carried out if the same gesture inputs are performed in the first touch sensitive region.
- the second touch sensitive region may have a number of hard coded buttons that perform predetermined functions when they are activated by a button tap gesture, but the second touch sensitive region is also able to detect a number of other gestures from a user.
- the buttons may be touch sensitive or physical buttons.
- the present invention provides a method of interpreting gesture input from a user of an electronic device, the method comprising: detecting a gesture input in a first or second touch sensitive region of an electronic device; determining if the gesture input starts or ends in a boundary region that is located between the first and second touch sensitive regions, wherein if a start of the gesture input is inside the first or second region and the completion of the gesture input is inside the boundary region, the gesture input is interpreted as being completed in the first region if the gesture input entered the boundary region from the first region, and is interpreted as being completed in the second region if the gesture input entered the boundary region from the second region, and if the start of the gesture input is inside the boundary region and the completion of the gesture input is inside the first or second region, the gesture input is interpreted as starting in the first region if the gesture enters the first region after exiting the boundary region or is interpreted as starting in the second region if the gesture enters the second region after exiting the boundary region; and performing a function in response to the interpreted gesture.
- the method may further comprise displaying an output in the first touch sensitive region of the electronic device.
- the present invention provides a computer program product carrying a computer program embodied in a computer readable medium adapted to perform the aforementioned method.
- Fig. 1 is a schematic representation of a mobile telephone, as a first example of an electronic device in which the invention may be implemented;
- Fig. 2 is an architecture diagram of the Android operating system.
- Fig. 3 is a simplified schematic representation of a mobile telephone according to a first embodiment of the invention showing different regions on a front surface of the phone in a simplified manner;
- Figs. 4a and 4b show the predefined region of Fig. 3 in more detail;
- Figs. 5a to 5e are schematic representations of the phone in Fig. 3 showing the different types of gesture input and how these are interpreted by a processor;
- Figs. 6a to 6f show the different types of gesture that may be performed in the gesture control area of the phone in Fig. 3.
- Figs. 7a and 7b show diagrams which illustrate the distinction between horizontal and vertical gestures in the gesture control area 14 in the first embodiment.
- the mobile telephone has evolved significantly over recent years to include more advanced computing ability and functionality in addition to standard telephony, and such phones are known as "smartphones".
- many phones are used for text messaging, Internet browsing and/or email as well as gaming.
- Touchscreen technology is useful in phones since screen size is limited and touch screen input provides direct manipulation of the items on the display screen such that the area normally required by separate keyboards or numerical keypads is saved and taken up by the touch screen instead.
- the invention is equally applicable to other touch input controlled electronic devices such as handheld computers without telephony processors, e-reader devices, tablet PCs and PDAs.
- Fig. 1 shows an exemplary mobile telephone handset, comprising a wireless communication unit having an antenna 101, a radio signal transceiver 102 for two-way communications, such as for GSM and UMTS telephony, and a wireless module 103 for other wireless communication protocols such as Wi-Fi.
- An input unit includes a microphone 104 and a touchscreen 105, which provide an input mechanism.
- An output unit includes a speaker 106 and a display 107 for presenting iconic or textual representations of the phone's functions.
- Electronic control circuitry includes amplifiers 108, a number of dedicated chips providing ADC/DAC signal conversion 109, compression/decompression 110, and encoding and modulation functions 111, circuitry providing connections between these various components, and a microprocessor 112 for handling command and control signalling.
- memory is generally shown as memory unit 113, and includes random access memory (in some cases SDRAM) as well as ROM and Flash memory for storing the phone's operating system and other instructions to be executed by each processor.
- a power supply 114 in the form of a rechargeable battery provides power to the phone's functions.
- the touchscreen 105 is coupled to the microprocessor 112 such that input on the touchscreen can be interpreted by the processor.
- SIM card: Subscriber Identity Module
- IMSI: the user's service-subscriber key
- GSM: Global System for Mobile Communications
- the SIM card typically stores the user's phone contacts and can store additional data specified by the user, as well as an identification of the user's permitted services and network information.
- the functions of a mobile telephone are implemented using a combination of hardware and software. In many cases, the decision on whether to implement a particular functionality using electronic hardware or software is a commercial one relating to the ease with which new product versions can be made commercially available and updates can be provided (e.g.
- a smartphone typically runs an operating system and a large number of applications can run on top of the operating system.
- the software architecture on a smartphone using the Android operating system (owned by Google Inc.), for example, comprises object oriented (Java and some C and C++) applications 200 running on a Java-based application framework 210 and supported by a set of libraries 220 (including Java core libraries 230) and the register-based Dalvik virtual machine 240.
- the Dalvik Virtual Machine is optimized for resource-constrained devices - i.e. battery powered devices with limited memory and processor speed.
- Java class files are converted into the compact Dalvik Executable (.dex) format before execution by an instance of the Dalvik virtual machine.
- the Dalvik VM relies on the Linux operating system kernel for underlying functionality, such as threading and low level memory management.
- the Android operating system provides support for various hardware such as that described in relation to Fig. 1; the same reference numerals are used for the same hardware appearing in Figs. 1 and 2. Support can be provided for touchscreens.
- Android supports various connectivity technologies (CDMA, WiFi, UMTS, Bluetooth, WiMax, etc.) and SMS text messaging and MMS messaging, as well as the Android Cloud to Device Messaging (C2DM) framework, and support is provided for various media formats.
- a first embodiment is shown of a front surface of a smartphone 10 having a housing comprising a touch sensitive area 11.
- the touch sensitive area 11 is provided along substantially the entire front surface of the handset 10 except for a boundary around the perimeter of the housing edges.
- the touch sensitive area 11 has a first zone 12 and a second zone 14.
- the functionality of the first zone 12 may be different to the functionality of the second zone 14.
- the first zone is a display screen 12 that is provided on the majority of the front surface and the second zone is a gesture control area 14.
- the display screen 12 is an area for applications on the phone to use as output but can also receive user input through user interaction with the touch sensitive area 11.
- buttons 13 are provided in a gesture control area 14 and the buttons 13 can be activated by a tapping user input.
- the gesture control area 14 is considered herein as an area that receives user input through user interaction with the touch sensitive area 11.
- the display screen 12 is larger than the gesture control area 14 and the gesture control area 14 does not have a display screen in this embodiment.
- the buttons are "fixed" in the sense that they are printed on the front surface of the phone and perform predetermined functions. It will be appreciated to the skilled person that a single button may be provided if required instead of a number of buttons.
- the button can be located anywhere in the gesture control area 14.
- the user interaction with the touch screen is through the use of gestures that can be executed by a user's finger or any other type of input device that can be detected by the touch sensitive area, such as a stylus for example.
- the gestures can be any type of touch input or user interaction that is detectable by a touch sensitive area and capable of being interpreted by the microprocessor of the phone.
- the gesture can be a discrete touch or multiple touches of the touch sensitive area, continuous motion along the touch sensitive area (e.g. touch-and-drag operations that move an icon) or a combination thereof.
- the gestures may be distinguished by determining the location of the input on touch sensitive area and the direction and/or time the gesture is input on the touch sensitive area.
- a tap gesture can be distinguished from a long press gesture on the basis of the time that an input is held on the touch sensitive area.
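- As a minimal illustration of such a time-based distinction, the sketch below separates a tap from a press by hold time; the 1.5 second figure matches the example values given later for this embodiment, and the class and method names are hypothetical.

```java
// Sketch only: threshold value taken from the example figures given
// later in this description; names are illustrative.
final class TapOrPress {
    static final long LONG_PRESS_THRESHOLD_MS = 1500; // e.g. 1.5 seconds

    // Returns "tap" if the finger lifts before the threshold, else "press".
    static String classify(long touchDownMs, long touchUpMs) {
        long heldFor = touchUpMs - touchDownMs;
        return heldFor < LONG_PRESS_THRESHOLD_MS ? "tap" : "press";
    }
}
```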
- the display screen 12 displays information to the user of the phone which could be a plurality of graphical user interface objects (not shown) such as icons that can be tapped to activate programs related to the icons.
- a web browser icon could be activated in the display screen 12 to activate a web page.
- the display screen 12 is capable of receiving a number of other gesture inputs which can invoke functionality on the phone. The gesture inputs can take place wholly within the display screen 12 or begin or end in the display screen 12.
- the gesture control area 14 is capable of receiving a number of gesture inputs which can invoke predefined functionality on the phone.
- the gesture inputs can take place wholly within the gesture control area 14 or begin or end in the gesture control area 14.
- the hard coded buttons 13 are distinguished from each other by icons 13a, 13b, 13c and the icons 13a, 13b, 13c are printed in the various locations of the gesture control area 14 to represent the various keys.
- Icon 13a relates to a Menu key
- icon 13b relates to a Home key
- icon 13c relates to a Back key.
- the icons may not extend the entire height H of the gesture control area 14, but tapping above the icons 13a, 13b, 13c, in respective extension areas 13ai, 13bi, 13ci still within the gesture control area 14, will cause the phone to carry out the functionality associated with the respective icon 13a, 13b, 13c.
- tapping the buttons 13 will cause the phone to perform predetermined functions, such as go to the "Menu" screen, go to the "Home" screen, or go "Back" one screen or one space in a text entry screen.
- the gestures could be a gesture occurring wholly within one of the gesture input areas, a gesture from one input area to the other, or a gesture that is carried out simultaneously in both gesture input areas, for example a two finger flick with the first finger in the display screen 12 and the second finger in the gesture control area 14.
- a long press on the display screen 12 could fix a point on the display screen and another type of gesture such as a drag in the gesture control area 14 may invoke particular functionality about the point on the display screen such as a zoom or rotate.
- the inventors have realised that there may be problems in distinguishing between the two different gesture input areas when gestures are performed very close to the boundary between the display screen 12 and the gesture control area 14. For example, a user may wish the phone to activate functionality that is associated with a sliding gesture being performed wholly within the gesture control area 14. If a user performs the gesture very near the display screen 12, it could be possible for the microprocessor of the phone to interpret the gesture as a function associated with a sliding gesture from the gesture control area 14 to the display screen 12 i.e. the command associated with a boundary crossing gesture.
- the touch sensitive area is provided with a predefined "demilitarised" boundary region 15, which has the effect of disregarding movement that is part of a gesture input if that movement begins or ends in the region 15. It is a boundary or threshold separating the display screen 12 and gesture control area 14 that must be crossed for an action associated with a gesture beginning in the gesture control area 14 and ending in the display screen 12 (or vice versa) to be carried out.
- the screen 12 and gesture control area 14 are adjacent each other and the boundary region 15 is between the screen 12 and the gesture control area 14 such that the screen 12 and gesture control area 14 could be considered as separated by this region 15. If the gesture does not cross the region 15, it is recognised as a gesture occurring wholly within the zone in which the gesture was started.
- the region 15 is made up of a number of rows of pixels, n, of the touch sensitive area 11.
- the region 15 has an upper threshold 15a and lower threshold 15b. If a gesture is started or finished between these thresholds or extremities it can be considered to fall within the region 15.
- a possible user finger touch T1 is shown in Fig. 4a for a gesture that is considered to start within the region 15, since the centre of the finger touch falls just below the upper threshold 15a even though not all of the finger touch (represented by the larger circle) is within the region 15. This can still be interpreted as a touch falling within the region 15. Since the upper threshold 15a is crossed for a gesture in direction A, the gesture is considered to start in the display screen 12, which is located above the region 15.
- in Fig. 4b, the entire finger touch T2 falls within the region 15 and starts within this region.
- a gesture is performed in direction B that crosses the lower threshold 15b and therefore the gesture is considered to start in the gesture control area 14 which is located below the lower threshold 15b. Accordingly, the start of the gesture in the region 15 is disregarded.
- a completion of a gesture is considered to occur in the display screen 12 or gesture control area 14 depending on which threshold 15a, 15b of the region is crossed when entering the region 15.
- the microprocessor of the phone (as mentioned hereinbefore in relation to Fig. 1) will determine the location of the starting position or end position of the gesture and, if this falls within the location of the region 15, the part of the gesture which is in the region 15 will be ignored. In particular, in this embodiment, it is determined if the start or end of the gesture is inside the region 15. If it is inside, this is interpreted as the entire gesture not being in the region 15 at all. The start or end point of the gesture that is in the region 15 is positioned just outside the region 15, in the display screen 12 or gesture control area 14, depending on the path of the gesture and where the gesture entered or exited the region 15. The gesture is "rewritten" as being entirely outside the DMZ, and the appropriate functionality associated with the rewritten gesture is carried out.
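- A minimal sketch of this rewriting step, assuming y coordinates that grow downwards (display screen 12 above the region 15, gesture control area 14 below) and a hypothetical Gesture type holding only the start and end rows; neither is taken from the patent.

```java
// Sketch only: names and coordinate convention are illustrative assumptions.
final class Gesture {
    int startY, endY; // y grows downwards: screen 12 above region 15, area 14 below
    Gesture(int startY, int endY) { this.startY = startY; this.endY = endY; }
}

final class DmzRewriter {
    private final int topY, bottomY; // inclusive pixel rows of region 15

    DmzRewriter(int topY, int bottomY) { this.topY = topY; this.bottomY = bottomY; }

    private boolean inDmz(int y) { return y >= topY && y <= bottomY; }

    // Move any start or end point lying in region 15 just outside it, on the
    // side from which the gesture entered (or towards which it exits), so the
    // rewritten gesture lies entirely outside the DMZ.
    Gesture rewrite(Gesture g) {
        if (inDmz(g.endY) && !inDmz(g.startY)) {
            g.endY = (g.startY < topY) ? topY - 1 : bottomY + 1;
        } else if (inDmz(g.startY) && !inDmz(g.endY)) {
            g.startY = (g.endY < topY) ? topY - 1 : bottomY + 1;
        }
        return g;
    }
}
```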
- a variable size boundary region is provided based on a preliminary gesture recognition.
- the region can be a variable number of pixels, or a single row.
- the size of the boundary region 15 may depend on the type of gesture being interpreted. Further, the number of rows of pixels for the boundary region may vary across the width of the touch sensitive area 11 if different levels of sensitivity are desired for the region.
- for example, the height of the region would be smaller in the middle and larger near the edges of the region.
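- One way such a profile could be realised, sketched here under the assumption of a simple linear interpolation between a minimum height at the centre and a maximum at the edges; the patent does not prescribe a particular formula, and the names are illustrative.

```java
// Sketch only: linear profile chosen purely for illustration.
final class VariableBoundary {
    // Returns the boundary height in pixel rows at horizontal position x:
    // minRows at the centre of the panel, rising to maxRows at either edge.
    static int rowsAt(int x, int panelWidth, int minRows, int maxRows) {
        double centre = panelWidth / 2.0;
        double edgeness = Math.abs(x - centre) / centre; // 0 at centre, 1 at edges
        return (int) Math.round(minRows + edgeness * (maxRows - minRows));
    }
}
```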
- the region 15 may or may not be visible to a user of the phone as a distinct boundary extending across the width, or any distance, of the touch sensitive area 11 between the display screen 12 and the gesture control area 14.
- Fig. 5a shows, on the left side, a first type of gesture that may be performed on the phone shown in Fig. 3.
- This gesture is a diagonal drag where the fingertip of a user is moved along the surface of the touch sensitive area 11 from the display screen 12 to the gesture control area 14 without losing contact with the screen, and this gesture is indicated by dashed arrow C.
- As shown on the right side, if a gesture begins within display screen 12, crosses region 15 and ends in gesture control area 14, or vice versa, the functionality associated with the gesture of moving from one area to the other will be carried out since the region 15 has been positively crossed. It will be appreciated that this could be any number of functions and will be dependent on the particular functionality that is required.
- Fig. 5b shows, on the left side, a second type of gesture. This is a drag upward where the fingertip of a user is moved along the surface of the touch sensitive area 11 from the gesture control area 14 to the display screen 12 without losing contact with the touch sensitive area 11, and this gesture is indicated by dashed arrow D. As shown on the right side, the functionality associated with the gesture of moving up from one area 14 to the other area 12 will be carried out since the region 15 has been positively crossed.
- Fig. 5c shows, on the left side, a diagonal drag gesture similar to the gesture in Fig. 5a in that it starts in the display screen 12. However, this is different to Fig. 5a in that the drag (indicated by arrow E) ends in the region 15.
- the processor determines that the input has commenced in the display screen 12 but has ended in the region 15, and ignores the part of the gesture that has been performed in the region 15.
- the gesture is judged as a diagonally downwards drag gesture taking place wholly within the display screen 12 as shown on the right side of the figure.
- the function associated with a diagonally downwards drag gesture in the display screen is performed by the phone and this could be a different function to that which is performed in response to the gesture in Fig. 5a.
- Fig. 5d shows, on the left side, a drag upward gesture similar to the gesture in Fig. 5b in that it starts in the gesture control area 14. However, this is different to Fig. 5b in that the drag (indicated by arrow F) ends in the region 15.
- the processor determines that the input has commenced in the gesture control area 14 but has ended in the region 15 and ignores the part of the gesture that has been performed in the region 15.
- the gesture is judged as an upward drag gesture taking place wholly within the gesture control area 14 as shown on the right side of the figure.
- the function associated with an upward drag gesture in the gesture control area 14 is performed by the phone, and this could be a different function to that which is performed in response to the gesture in Fig. 5b. For example, the crossing of the region 15 from the gesture control area could bring up a keyboard for the user to enter text, whereas the upward drag wholly within the gesture control area could invoke another function, such as a phone lock screen, or no function at all.
- Fig. 5e shows, on the left side, a drag upward gesture similar to the direction of the gesture in Fig. 5d.
- the drag (indicated by arrow G) starts in the region 15 and ends in the display screen 12.
- the processor determines that the input has commenced in the region 15 and has ended in the display screen 12 and ignores the part of the gesture that has been performed in the region 15.
- the gesture is judged as an upward drag gesture taking place wholly within the display screen 12 as shown on the right side of the figure.
- the function associated with an upward drag gesture in the display screen 12 is performed by the phone and this could be a different function to that which is performed in response to the gesture in Fig. 5d.
- a boundary region 15 is useful to ensure that the phone does not inadvertently perform a function associated with a command gesture of Fig. 5b, for example, when the intention of the user is for the phone to carry out a function associated with a command gesture shown in Fig. 5d, for example.
- the boundary region 15 is useful where there are two touch screen areas and gestures are being performed near the boundary between the two touch screen areas.
- the boundary region 15 provides a tolerance for touch input gestures in a touch sensitive device with two distinct touch input areas and results in an improved and more reliable user interaction with the electronic device.
- gestures can be performed in the display screen 12 and the gesture control area 14.
- the following gestures can be recognised when input with a finger or other input means which can provide the appropriate functionality: tap; press; drag (single and double finger); flick (single and double finger); pinch; and spread.
- the various types of gesture that can be performed in the gesture control area 14 are explained further below with reference to Figs. 6 and 7.
- the processor will compare the gesture that is carried out with the predetermined types of gesture that are recognisable and execute the relevant functionality associated with the gesture if the gesture is recognised.
- Tap (Fig. 6a): Briefly touch the surface of the gesture control area 14 with a fingertip. Recognised as a tap as long as the finger is removed from the surface before a predetermined time (e.g. 1.49 seconds). Movement is tolerated up to a predetermined diameter (e.g. 1 cm) from the original recognition point of the touch input.
- Press (Fig. 6b): Touch the surface of the gesture control area 14 with a fingertip for an extended period of time. Recognised as a press as long as the finger is held on the surface for at least a predetermined time (e.g. 1.5 seconds). Movement is tolerated up to a predetermined diameter (e.g. 1 cm) from the original recognition point of the touch input.
- Drag (Fig. 6c): Move a fingertip over the gesture control area 14 without losing contact. Can be performed in directions left, right and up. Horizontal movement is recognised staying between 50 degrees to 130 degrees and 230 degrees to 310 degrees. Vertical movement is recognised between 315 degrees and 45 degrees. Movement should be greater than the tolerance for tap and press movement, i.e. greater than 1 cm.
- Flick (Fig. 6d): Quickly brush the gesture control area 14 with a fingertip. Can be performed in directions left, right and up. Horizontal movement is recognised staying between 50 degrees to 130 degrees and 230 degrees to 310 degrees. Vertical movement is recognised between 315 degrees and 45 degrees. Movement should be greater than the tolerance for tap and press movement, i.e. greater than 1 cm.
- Two finger drag (Fig. 6e): Move two fingertips over the gesture control area 14 without losing contact. Can be performed in the up direction. Two fingers are detected with movements between 315 degrees and 45 degrees.
- Two finger flick (Fig. 6f): Quickly brush the gesture control area 14 with two fingertips. Can be performed in the up direction. Two fingers are detected with movements between 315 degrees and 45 degrees.
- Fig. 7a shows that a drag gesture indicated by solid arrow G, which is at an angle of 20 degrees, would be interpreted as a vertical movement upwards since the movement falls between 315 degrees and 45 degrees.
- Fig. 7b shows that a drag gesture indicated by solid arrow H which is at an angle of 80 degrees would be interpreted as a horizontal movement right since the movement falls between 50 degrees and 130 degrees. Although not shown, it will be understood that a movement at an angle of between 230 degrees and 310 degrees would be considered a horizontal movement left.
- the thresholds for the tolerance can be varied depending on the tolerance required. However, it is found that for the gesture control area of this embodiment, the angles set as tolerances are appropriate for producing desirable results. Since the gesture control area 14 is of a small size such as a strip having the height of a few pixels and is only used for input rather than display of graphical objects that continuously change and may need to be moved, tolerances can be larger compared to the display screen 12 where fine gesture control may be required and the tolerances would be lower.
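- To make the angle bands concrete, the sketch below applies them, assuming (consistently with Figs. 7a and 7b) that angles are measured clockwise from straight up, so that 20 degrees reads as an upward movement and 80 degrees as a rightward one; the method name and the handling of angles outside the tolerance bands are illustrative assumptions.

```java
// Sketch only: angle convention inferred from Figs. 7a and 7b.
final class DragDirection {
    static String classify(double angleDeg) {
        double a = ((angleDeg % 360) + 360) % 360; // normalise to [0, 360)
        if (a >= 315 || a <= 45) return "vertical-up";      // e.g. arrow G at 20 degrees
        if (a >= 50 && a <= 130) return "horizontal-right"; // e.g. arrow H at 80 degrees
        if (a >= 230 && a <= 310) return "horizontal-left";
        return "unrecognised"; // falls in the dead band between tolerance ranges
    }
}
```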
- the two adjoining distinct zones need not be part of the same touch sensitive area or panel as each could still function in the desired manner if each is associated with a different touch sensitive area or panel and the appropriate modifications are made in order to recognise gesture inputs from each zone.
- a boundary region is provided in the adjoining boundary of the zones in which part of a gesture that starts or ends in the region is not recognised as part of the overall gesture when interpreted.
- the boundary region will comprise part of the first zone that is near the second zone and part of the second zone that is near the first zone. This could be a variable number of pixels of each zone which would be considered the boundary region.
- the processor referred to herein may comprise a data processing unit and associated program code to control the performance of operations by the processor.
- buttons 13 in the gesture control area may be dynamic and change with context rather than being fixed in functionality as in the first embodiment.
- the buttons in the gesture control area 14 may be a graphical object representing a virtual button that is activated with a finger tap that may or may not provide haptic feedback to a user when the button is pressed and/or may be a physical button.
- the second zone may be positioned in a different location on the touch sensitive area with respect to the first zone of the electronic device.
- the second zone may be positioned above the first zone.
- a boundary region can still be provided between the first and second zones in such a configuration or other configurations.
Abstract
The present invention provides an electronic device for receiving gesture input. The device comprises: a first touch sensitive region for receiving user gesture input and a second touch sensitive region for receiving user gesture input; a processor for interpreting the user gesture input; and a boundary region located between the first touch sensitive region and the second touch sensitive region. If a start of the gesture input is inside the first or second touch sensitive region and the completion of the gesture input is inside the boundary region, the processor is adapted to interpret that the gesture has been completed in the first or second touch sensitive region from which the gesture entered the boundary region, and if the start of the gesture input is inside the boundary region and the completion of the gesture input is inside the first or second touch sensitive region, the processor is adapted to interpret that the gesture input has started in the first or second touch sensitive region which the gesture enters after exiting the boundary region. The present invention also provides a method of interpreting gesture input from a user of an electronic device, and a computer program product carrying a computer program embodied in a computer readable medium adapted to perform the method.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1101133.5 | 2011-01-21 | ||
GB201101133A GB2487425A (en) | 2011-01-21 | 2011-01-21 | Gesture input on a device a first and second touch sensitive area and a boundary region |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012098361A1 (fr) | 2012-07-26 |
Family
ID=43769474
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2012/000054 WO2012098361A1 (fr) | 2011-01-21 | 2012-01-20 | Appareil et procédé pour interaction d'utilisateur améliorée dans des dispositifs électroniques |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB2487425A (fr) |
WO (1) | WO2012098361A1 (fr) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108279834A (zh) * | 2014-09-29 | 2018-07-13 | 联想(北京)有限公司 | 一种控制方法和装置 |
KR102411283B1 (ko) * | 2017-08-23 | 2022-06-21 | 삼성전자주식회사 | 사용자 인터페이스의 입력 검출 영역을 설정하기 위한 방법 및 그 전자 장치 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5877766A (en) * | 1997-08-15 | 1999-03-02 | International Business Machines Corporation | Multi-node user interface component and method thereof for use in accessing a plurality of linked records |
US6295049B1 (en) * | 1999-03-03 | 2001-09-25 | Richard T. Minner | Computer system utilizing graphical user interface with hysteresis to inhibit accidental selection of a region due to unintended cursor motion and method |
WO2009137419A2 (fr) | 2008-05-06 | 2009-11-12 | Palm, Inc. | Zone de commande tactile étendue pour dispositif électronique |
US20090322689A1 (en) * | 2008-06-30 | 2009-12-31 | Wah Yiu Kwong | Touch input across touch-sensitive display devices |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02112013A (ja) * | 1988-10-21 | 1990-04-24 | Toshiba Corp | タッチパネル式入力装置 |
KR20100078295A (ko) * | 2008-12-30 | 2010-07-08 | 삼성전자주식회사 | 이종의 터치영역을 이용한 휴대단말의 동작 제어 방법 및 장치 |
- 2011-01-21: GB GB201101133A patent/GB2487425A/en not_active Withdrawn
- 2012-01-20: WO PCT/GB2012/000054 patent/WO2012098361A1/fr active Application Filing
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103809894A (zh) * | 2012-11-15 | 2014-05-21 | 华为终端有限公司 | 一种手势的识别方法及电子设备 |
CN103809894B (zh) * | 2012-11-15 | 2017-06-27 | 华为终端有限公司 | 一种手势的识别方法及电子设备 |
CN107690619A (zh) * | 2015-06-05 | 2018-02-13 | 苹果公司 | 用于处理触敏表面多个区域上触摸输入的设备和方法 |
US10474350B2 (en) | 2015-06-05 | 2019-11-12 | Apple Inc. | Devices and methods for processing touch inputs over multiple regions of a touch-sensitive surface |
CN110568965A (zh) * | 2015-06-05 | 2019-12-13 | 苹果公司 | 用于处理触敏表面多个区域上触摸输入的设备和方法 |
CN110568965B (zh) * | 2015-06-05 | 2023-05-30 | 苹果公司 | 用于处理触敏表面多个区域上触摸输入的设备和方法 |
Also Published As
Publication number | Publication date |
---|---|
GB201101133D0 (en) | 2011-03-09 |
GB2487425A (en) | 2012-07-25 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12704100; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 12704100; Country of ref document: EP; Kind code of ref document: A1