WO2007082290A2 - User interface for a touch-screen based computing device and method therefor - Google Patents

User interface for a touch-screen based computing device and method therefor

Info

Publication number
WO2007082290A2
Authority
WO
WIPO (PCT)
Prior art keywords
screen
icons
impact zone
touch
determining
Prior art date
Application number
PCT/US2007/060435
Other languages
English (en)
Other versions
WO2007082290A3 (fr)
Inventor
Yaroslav Novak
Original Assignee
Motorola, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Motorola, Inc. filed Critical Motorola, Inc.
Publication of WO2007082290A2 publication Critical patent/WO2007082290A2/fr
Publication of WO2007082290A3 publication Critical patent/WO2007082290A3/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/453Help systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to a method for improving the usability of a user interface (UI) of a touch-screen based computing device.
  • UI user interface
  • the invention is applicable to, but not limited to, user interfaces for mobile computing devices utilising a touch-screen.
  • Touch-screen (sometimes referred to as touch-panel) displays are known for use in electronic devices for various applications. Such displays show an image of a number of character buttons or functional buttons. If a user touches the panel where one of the buttons is shown an internal sensor detects that the user has selected that particular button and indicates the selection to an internal electronic controller that executes a corresponding function.
  • Touch-screen displays have been used mainly in applications such as automatic teller machines, in which the users are the general public who may not be used to computer operations.
  • applications are emerging in which touch panel displays are used to provide operations and functions equivalent to those obtained with a personal computer keyboard or mouse pointer.
  • touch screen displays are known which display icons, option keys or the like that are used, for example, to indicate software programs or other functional applications which may be selected to run on a digital processor of the device.
  • Touch-screen displays are predominantly used in computing devices or microprocessor control devices comprising a user interface. Smaller versions of these computing devices have found particular use in applications where the use of a full- sized keypad or keyboard may be impractical or impossible. Touch-screen based computing devices have also been used to improve the User Interface (UI) of devices in which the keypad is small and/or has a limited functionality. Touch-screen based computing devices have also found great use in applications that require user input from a menu-based format, as the user is able to quickly select menu options from a range of displayed icons.
  • UI User Interface
  • the screens of mobile/hand-portable computing devices are typically limited in size by the nature of the device, e.g. a Personal Digital Assistant (PDA).
  • the screen size may be relatively large, as is found in some laptop personal computer (PC) or PC tablet devices.
  • the touch-screen is used to display a number of icons, control elements or text fields that are associated with some device functionality, such as a text editor or a calendar application, for example.
  • the term 'icon' will be used to encompass icons, control elements, text fields, etc. of the image displayed to represent a particular functionality of the touch-screen based device.
  • An icon is thus a "shortcut" to an application or function or further screen displaying further options. For example, a user may activate the icon by touching an area of the screen displaying the icon, which causes the associated application to execute. Icons may also be associated with some functionality of the computing device itself.
  • text fields may also be associated with a specific application, such as a text editor or browser.
  • the text fields may activate a specific functionality of an application that is already executing on the device.
  • the text field could be an item in a drop down menu, for example.
  • the touch screen display may be provided on a handset, e.g. for wireless communications.
  • a stylus or like device is often employed to assist the user to navigate through the program applications or options available, and to enter data and provide control instructions via the touch panel.
  • working with a stylus is not ideal.
  • the stylus has first of all to be retrieved to begin operation using it. It may easily be mislaid or lost. Operation of the stylus on the touch screen of the touch panel display has to be gentle to avoid damage to the touch screen.
  • buttons or icons are selected by a user's finger in at least some applications of the device.
  • displayed items for selection may need to be close together, so selection of one item may overlap with selection of another, unwanted item. This problem may be noticed particularly if the user is wearing gloves, e.g. because the user is performing a specialist job such as police work or fire-fighting. This problem of producing an overlap in the selected items has been recognised in the prior art.
  • the solution which has been proposed is based on a predictive algorithm.
  • the algorithm detects what selections have already been made by the user and predicts from such selections what is the selection currently likely to be intended by the user. Unfortunately, such an algorithm is of little use if no prior selections have already been made by the user.
  • a touchscreen based computing device as claimed in Claim 10.
  • a method for determining a command provided via activation of an on-screen icon of a touch-screen based computing device comprises detecting activation by a user of the touch-screen, determining a position and/or dimension of an impact zone, and determining whether multiple on-screen icons are associated with the impact zone. If only one on-screen icon is associated with the area, that particular icon is activated. However, if multiple on-screen icons are associated with the impact zone, the process displays a helper-screen comprising the multiple icons, and a step of determining a position and/or dimension of a subsequent impact zone relating to the icons displayed on the helper screen is performed.
  • the process of displaying helper-screens is subsequently repeated until a single desired icon is determined as being activated, or the process cancelled.
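The detect, match, and helper-screen loop described above can be sketched as follows. This is an illustrative reconstruction, not code from the patent: the icon names, the `(x0, y0, x1, y1)` rectangle representation, and the single-row helper layout are all assumptions.

```python
def icons_in_zone(zone, icons):
    """Return the names of icons whose on-screen area intersects the impact zone."""
    zx0, zy0, zx1, zy1 = zone
    hits = []
    for name, (x0, y0, x1, y1) in icons.items():
        if x0 < zx1 and zx0 < x1 and y0 < zy1 and zy0 < y1:
            hits.append(name)
    return hits

def helper_layout(hits, icon_size=40, gap=20):
    """Lay the overlapped icons out in a row on the helper screen, enlarged and spaced."""
    layout = {}
    for i, name in enumerate(hits):
        x0 = i * (icon_size + gap)
        layout[name] = (x0, 0, x0 + icon_size, icon_size)
    return layout

def resolve_command(zone, icons, read_next_zone):
    """Repeat the helper-screen step until a single icon is determined (or nothing hit)."""
    while True:
        hits = icons_in_zone(zone, icons)
        if not hits:
            return None                 # no icon touched: wait for a new activation
        if len(hits) == 1:
            return hits[0]              # single icon: activate it
        icons = helper_layout(hits)     # multiple icons: show them on a helper screen
        zone = read_next_zone()         # read the user's next touch against that layout
```

`read_next_zone` stands in for the touch-screen driver supplying the next impact zone; in a real device this would be event-driven rather than a blocking loop.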
  • UI user interface
  • a dimension of each icon displayed in the helper-screen is scaled, to improve the ability of a microprocessor within the computing device to determine a single icon that the user desires to activate.
  • the multiple icons may be scaled to one or more similar dimension(s) of the initial impact zone.
  • the size of the icons being displayed on the helper screen can be scaled, such that the icons are substantially the same size as the tool being used to activate them on the screen, e.g. a finger end or a pen for example.
  • the user may readily select a desired icon from the helper screen, with a reduced risk of selecting multiple icons, as the icon size is similar to the impact caused by the activating mechanism.
  • the scaling of one or more dimension(s) of each of the multiple icons displayed in the helper-screen is performed such that one or more dimension(s) of the multiple icons is/are larger than one or more of the dimension(s) of the impact zone.
  • the multiple icons are larger than the tool being used to activate the screen, such as a finger end or pen.
  • the user has little difficulty selecting the desired icon, even when, say, using a gloved finger.
  • This beneficial feature may also be automatically activated when the impact zone increases above a certain size, implying that a large object is being used to activate applications via the touch-screen.
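One way the scaling rule described above might be realised is to enlarge each helper-screen icon until every dimension is at least as large as the corresponding dimension of the impact zone. The margin factor and the width/height convention below are hypothetical choices for illustration, not specified by the patent.

```python
def scaled_icon_size(icon_w, icon_h, zone_w, zone_h, margin=1.25):
    """Scale an icon (preserving aspect ratio) so each dimension is at least
    the impact zone's, times a margin, per the 'larger than the impact zone'
    variant; never shrink below the original size."""
    scale = max(zone_w * margin / icon_w, zone_h * margin / icon_h, 1.0)
    return icon_w * scale, icon_h * scale
```

With a 40 x 40 pixel impact zone (e.g. a gloved finger) a 10 x 10 icon would be enlarged fivefold, while an icon already larger than the zone is left at its original size.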
  • a degree of overlap between an impact zone and areas associated with each of the on-screen icons is determined. Any icon for which the determined degree of overlap is less than a threshold level is ignored.
  • This 'filter' mechanism advantageously reduces the number of icons displayed on the helper screen, by determining and eliminating applications that it identifies were most probably activated in error by the user.
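The 'filter' mechanism could be implemented, for instance, as an area-ratio test: compute what fraction of each icon lies inside the impact zone and discard icons below a threshold. The 25% threshold and the `(x0, y0, x1, y1)` rectangle convention are illustrative assumptions.

```python
def overlap_fraction(icon, zone):
    """Fraction of the icon's area that lies inside the impact zone."""
    ix0, iy0, ix1, iy1 = icon
    zx0, zy0, zx1, zy1 = zone
    w = min(ix1, zx1) - max(ix0, zx0)
    h = min(iy1, zy1) - max(iy0, zy0)
    if w <= 0 or h <= 0:
        return 0.0                      # no intersection at all
    return (w * h) / ((ix1 - ix0) * (iy1 - iy0))

def filter_accidental(icons, zone, threshold=0.25):
    """Keep only icons whose overlap with the zone meets the threshold;
    the rest are deemed to have been touched by accident."""
    return [name for name, rect in icons.items()
            if overlap_fraction(rect, zone) >= threshold]
```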
  • a touchscreen based computing device comprising a user interface having a touch-screen input device and a microprocessor, operably coupled to the touch screen input device, arranged to detect an activation of an on-screen icon; and determine an impact zone associated with the activation.
  • the microprocessor determines whether a single on-screen icon or multiple on-screen icons is/are associated with the impact zone; and displays on the touch-screen, in response to determining multiple on-screen icons, a helper-screen comprising multiple icons associated with the multiple on-screen icons of the impact zone.
  • the helper-screen occupies only a fraction of an entire touch-screen area, thus allowing other information relevant to the user to continue to be displayed on the main screen. In this manner, the user is able to select the required application from the helper-screen, whilst simultaneously monitoring the main screen.
  • the computing device advantageously supports at least two modes of operation, a regular mode in which the command determination feature is disabled, and an overlap mode in which the feature is enabled.
  • a regular mode in which the command determination feature is disabled
  • an overlap mode in which the feature is enabled.
  • the method comprises determining those icons that are selected and subsequently displaying these icons to the user in a further helper screen. The user is then able to start the application or command via pressing the relevant icon on the helper-screen.
  • FIG. 1 illustrates a computing device adapted in accordance with one embodiment of the present invention showing a number of icons and impact zones;
  • FIG. 2 illustrates a computing device adapted in accordance with one embodiment of the present invention, showing the helper-screen and four icons corresponding to elements selected by one of the impact zones of FIG. 1;
  • FIG. 3 illustrates a flow chart describing a method in accordance with one embodiment of the present invention.
  • any touch screen device such as a personal digital assistant (PDA), MP-3 player or mobile telephone.
  • PDA personal digital assistant
  • any reference to computing device is meant to encompass any user-interface equipment that is capable of using a touch-screen as a means of user interaction.
  • selection of an 'icon' is meant to encompass any selection where a choice exists on a user interface, irrespective of whether a 'button' or 'menu' presentation mechanism is used, e.g. the selection of an active document may encompass a choice between two overlapping displayed documents.
  • FIG. 1 shows a computing device 100 with touch screen display 160, and icons 122, 124, 126, etc.
  • the icons are graphical icons representing software applications or functions stored in memory on the computing device 100.
  • the graphical icons may represent software applications or functions accessible by the computing device 100 via a wireless network, for example.
  • a microprocessor 170 on the touch-screen based computing device 100 interprets the inputs entered by the touch screen and determines how to proceed, e.g. activating a function, operation or element or generating and displaying a helper screen.
  • the touch screen and the microprocessor 170 may, for example, be operably coupled to a sensor (not shown), such that the sensor senses a local change or variation in an electrical property caused by touch by a user's finger.
  • the property changed may be electrical capacitance, conductivity or resistivity.
  • the touch screen may for example be a screen which senses a local change or variation in an electrical property, such as conductivity or resistivity, due to a pressure change when a selected area corresponding to a displayed icon is touched by a user, to indicate selection of the corresponding function. Any other known form of touch screen may alternatively be used.
  • the applications or functions are executed by a user physically touching an area of the screen or display occupied by a respective icon.
  • the display hardware is configured to detect any impact (touch), determine a position of the touch (impact zone) on the screen, and make this information available to applications or firmware running on, say, the microprocessor 170 of the computing device 100.
  • the icons themselves are generated, or read out of memory, by the microprocessor 170 of the computing device 100 as required, and displayed on the touch screen 160.
  • the whole screen and its contents define the user interface (UI) of the computing device 100.
  • Touch-sensitive elements and circuitry are known in the art. A skilled artisan appreciates how such elements and circuitry can be applied in implementing the inventive concept herein described and therefore the structure of touch-sensitive elements and circuitry will not be described further here.
  • the circular areas 110, 130, 150 shown in FIG. 1 represent somewhat idealised impact zones on the touch-screen 160 of the computing device 100.
  • Each of these impact zones corresponds to an example of a physical touch on the screen by an object such as a user's finger end. In reality these areas will have a more irregular form and are shown as circular for clarity purposes only.
  • the impact zone represents an attempt by the user of the computing device 100 to execute a command, or start an application, or otherwise access a functionality of the computing device 100 by means of an icon on the screen.
  • the mechanism used to touch the screen may well have been larger than the icon corresponding to the required application or function.
  • a user's impact zone 150 has wholly overlapped icon 154, substantially overlapped icon 152 and partially overlapped icon 156. In this case, it is possible that an incorrect application may be started unless the required application can be identified.
  • the user's impact zone 110 has substantially overlapped icons 112, 114 and partially overlapped icons 116, 118. Again, in this case, it is possible that an incorrect application may be started unless the required application can be identified.
  • the user's impact zone 130 has wholly overlapped a single icon 132. In this case, it is likely that the correct application corresponding to icon 132 will be started.
  • firmware, say executing on a microprocessor 170 of the computing device 100, receives data identifying an impact zone (i.e. a user activation area on the touch screen).
  • the indication may be in the form of one or more dimension(s) of the impact zone, for example a list of pixels on the touch screen that have been activated, or length and width data of the impact zone, or a central location of the impact zone, with a radius of the impact area (in a circular form).
  • the microprocessor 170 compares a position of the impact zone, say impact zone 110, with the known position of the icons 112, 114, 116 and 118. For example, in the context of the present invention, a location area of the impact zone may be identified and compared with the known location areas of the icons in the vicinity of the impact zone.
  • the microprocessor 170 then ascertains whether more than one icon has been selected by the impact.
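Where the impact zone is reported in the centre-plus-radius form mentioned above, comparing it with an icon's known rectangular location area reduces to a standard circle-rectangle intersection test. A minimal sketch, with an assumed `(x0, y0, x1, y1)` rectangle convention:

```python
def circle_hits_rect(cx, cy, r, rect):
    """True if a circular impact zone (centre cx,cy and radius r)
    intersects a rectangular icon area."""
    x0, y0, x1, y1 = rect
    nx = min(max(cx, x0), x1)   # point of the rectangle closest to the centre
    ny = min(max(cy, y0), y1)
    return (cx - nx) ** 2 + (cy - ny) ** 2 <= r * r
```

Running this test against every icon in the vicinity of the impact zone yields the candidate list that the microprocessor then inspects for multiple selection.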
  • buttons may also be displayed, such as soft function keys, allowing the user to cancel a command or access some other function of the computing device 100, according to the applications or functions being supported.
  • FIG. 2 corresponds to the second example of FIG. 1, whereby four icons 112, 114, 116, 118 corresponding to impact zone 110 were selected.
  • the helper-screen 230 is generated by a microprocessor 170 of the computing device 100 and is displayed on either a subsection of the touch screen 160 or the whole screen.
  • the area of the screen used to display the helper-screen 230 is user-selectable/user-definable.
  • a user may be able to define the area to be made proportional to the number of overlapping elements.
  • the respective size of the icons displayed on the helper screen may be configured as dependent upon the level of overlap identified when comparing the impact zone with the icon area.
  • some other criterion could be used.
  • the size of the icons to be displayed on the helper screen 230 may be user-selectable/user-definable. In this manner, the user is able to better manage his or her UI. For example, a user whose hands tremble may decide to utilise a large area or the whole area of the screen 160 to display the helper screen 230, to more readily select the desired icon. In contrast, a user with better hand control may desire to limit the display area of the helper screen 230 so that (s)he can view other functions/applications that are running on other sections of the touch screen 160.
  • helper screen 230 is shown that covers the whole of the screen area of the computing device 100.
  • This helper screen 230 displays four icons, corresponding to the icons 112, 114, 116, 118 that were determined as overlapping with the original impact zone 110.
  • the user can now confirm the selection of the required icon, say icon 114, by touching the helper-screen display in the area of that element.
  • the function associated with that icon 114 is executed or activated.
  • the helper screen 230 is then configured to disappear and the device then returns to its normal operation.
  • the computing device 100 may display information associated with the function activated by the icon, if any.
  • the impact zone overlaps with a number of icons 112, 114, 116 and 118, with each overlapping to varying degrees.
  • a microprocessor 170 within, say, the firmware of the computing device 100 calculates a degree of overlap between the impact zone 110 and the individual icons 112, 114, 116 and 118.
  • the microprocessor 170 may do this by interfacing directly with the hardware of the touch screen.
  • the interface is performed via a device driver.
  • the microprocessor 170 reads data from a memory element within the computing device 100, where the data corresponds to the impact zone and/or the icon positions and sizes (e.g. area of screen occupied).
  • the microprocessor 170 determines a degree of overlap of each icon 112, 114, 116 and 118 with the impact zone 110, and then compares each degree of overlap with a threshold level. The microprocessor 170 is then able to determine those icons to display in the helper-screen 230 where the degree of overlap exceeded the threshold level.
  • the threshold level is set such that icon 118 is deemed to have been activated by accident, i.e. only a small fraction of the total area occupied by icon 118 lies within the impact zone 110.
  • the microprocessor 170 determines that icon 118 is not to be displayed on the helper screen 230, thus further simplifying the selection process by allowing more space to display the remaining three icons 112, 114 and 116.
  • the sizes of the icons 112, 114 and 116 to be displayed on the helper screen 230 are also scaled so as to be proportional to the original impact zone 110.
  • the user of the computing device 100 may set the scaling factor.
  • the scaling factor may be generated automatically by the computing device itself, for example according to a pre-defined function or rule.
  • the icons displayed in the helper screen 230 can thus be made significantly larger than the original icons 112, 114 and 116, thereby allowing a single icon, i.e. desired icon 114, to be more readily selected.
  • the icons 112, 114 and 116 may also be scaled such that they occupy a similar area of screen, as did the initial impact zone 110.
  • the icons 112, 114 and 116 may be scaled, such that they occupy a larger area than did the initial impact zone 110.
  • the icons of the helper screen 230 may be scaled such that they are as large as, or larger than, the initial impact zone 110 made by the gloved finger.
  • the microprocessor 170 retains the size of the icons 112, 114 and 116, from the first activation. However, in order to improve the likelihood of the user activating the desired icon (e.g. icon 114) upon the next activation, the microprocessor 170 enlarges the spacing between the icons. For example, the microprocessor 170 may, in one embodiment, arrange the spacing between the icons such that no two icons are displayed within an area as small as, or smaller than, the initial impact zone 110. Of course, the number of icons selected for subsequent selection via the helper screen, and the absolute size of the helper screen, limit the size and spacing of the icons.
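The spacing rule in the embodiment above (no two icons displayed within an area as small as the impact zone) could be approximated by placing icons on a pitch of at least the icon size plus the impact-zone diameter, so that a single touch cannot straddle two icons. The single-row layout and parameter names are hypothetical:

```python
def spaced_positions(n, icon_size, zone_diameter, origin=(0, 0)):
    """Place n same-size icons in a row so that the gap between adjacent
    icons equals the impact-zone diameter; no two icons then fit under
    one impact-zone-sized touch."""
    pitch = icon_size + zone_diameter
    ox, oy = origin
    return [(ox + i * pitch, oy) for i in range(n)]
```

With 40-pixel icons and a 50-pixel impact zone this yields positions 90 pixels apart, leaving a 50-pixel gap between icon edges.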
  • upon activation of multiple icons displayed on a helper screen 230, i.e. upon a second or further activation, the aforementioned method of the invention is carried out again, in order to further reduce the number of icons displayed.
  • a further help-screen is generated, with a further reduction in the number of icons displayed. The user then selects the desired icon 114, from this reduced number of icons.
  • a flow chart 300 describes a method in accordance with one embodiment of the present invention.
  • the method commences when the screen is activated, in step 305.
  • a microprocessor within the computing device determines a position and dimensions of the activated impact zone, as shown in step 310.
  • the microprocessor determines whether one or more on-screen icon(s) is/are located within the impact zone, as in step 315.
  • if no on-screen icon is located within the impact zone in step 315, the process returns to step 305 to determine whether the screen is activated. If the microprocessor determines that one or more on-screen icon(s) is/are located within the impact zone, in step 315, the microprocessor then determines whether multiple on-screen icons are located within the impact zone, as in step 320. If multiple on-screen icons are not located within the impact zone, in step 320, the single selected icon is activated, as shown in step 330.
  • the microprocessor generates and displays a number of the multiple on-screen icons on a helper screen, as shown in step 325.
  • the microprocessor uses larger icons.
  • the microprocessor uses a greater spacing between the icons used to represent the multiple icons, as in step 325. The process then returns to step 305, waiting for a further user input on the helper screen to select the desired icon or further narrow down the selection.
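The decision logic of flow chart 300 (steps 305 to 330) maps naturally onto a single dispatch function per screen activation. The sketch below is an assumed reading of the flow chart, with icons again represented as named `(x0, y0, x1, y1)` rectangles:

```python
def screen_activation_step(zone, icons):
    """One pass of flow chart 300: decide what to do for a screen activation.
    Returns an (action, payload) pair."""
    zx0, zy0, zx1, zy1 = zone
    # steps 310/315: locate the on-screen icons within the impact zone
    hits = [name for name, (x0, y0, x1, y1) in icons.items()
            if x0 < zx1 and zx0 < x1 and y0 < zy1 and zy0 < y1]
    if not hits:
        return ("wait", None)            # back to step 305: await next activation
    if len(hits) == 1:
        return ("activate", hits[0])     # step 330: activate the single icon
    return ("helper_screen", hits)       # step 325: display hits on a helper screen
```

The caller would loop: on `"helper_screen"` it redraws the hits (enlarged and/or more widely spaced) and feeds the next impact zone back into the same function.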
  • the present invention is described in terms of a UI for a computing device. However, it will be appreciated by a skilled artisan that the inventive concept herein described may be embodied in any type of UI for a touch-screen device.
  • inventive concept can be applied by a semiconductor manufacturer to any user interface. It is further envisaged that, for example, a semiconductor manufacturer may employ the inventive concept in a design of a stand-alone user interface for a computing device or application-specific integrated circuit (ASIC) and/or any other sub-system element.
  • ASIC application-specific integrated circuit
  • a computing device having a touch screen is configured such that an impact on the touch screen, which overlaps with a number of icons, is detected by a microprocessor 170 in the device.
  • the microprocessor 170 arranges for a number of the multiple icons to be subsequently displayed on a helper screen, the user then being able to select the desired icon, from the number of the multiple icons, by the user touching the helper screen.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention concerns a method for determining a command provided via a touch-screen input device of a computing device, comprising detecting (300) an activation of an on-screen icon; and determining (310) an impact zone associated with said activation. The method also comprises the steps of determining (320) whether a single on-screen icon or multiple on-screen icons (112, 114, 116, 118) is/are associated with said impact zone (110); and displaying, in response to determining multiple on-screen icons (112, 114, 116, 118), a helper-screen (230) comprising said multiple icons (220) associated with said multiple on-screen icons (112, 114, 116, 118) of said impact zone.
PCT/US2007/060435 2006-01-12 2007-01-12 User interface for a touch-screen based computing device and method therefor WO2007082290A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0600548.2 2006-01-12
GB0600548A GB2434286B (en) 2006-01-12 2006-01-12 User interface for a touch-screen based computing device and method therefor

Publications (2)

Publication Number Publication Date
WO2007082290A2 true WO2007082290A2 (fr) 2007-07-19
WO2007082290A3 WO2007082290A3 (fr) 2008-04-10

Family

ID=35997889

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/060435 WO2007082290A2 (fr) 2007-01-12 User interface for a touch-screen based computing device and method therefor

Country Status (2)

Country Link
GB (1) GB2434286B (fr)
WO (1) WO2007082290A2 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101472591B1 (ko) * 2008-11-14 2014-12-17 Samsung Electronics Co., Ltd. Method for selecting an area of content for enlargement, and apparatus and system for providing content
US9141280B2 (en) 2011-11-09 2015-09-22 Blackberry Limited Touch-sensitive display method and apparatus
US9274698B2 (en) 2007-10-26 2016-03-01 Blackberry Limited Electronic device and method of controlling same
US9501168B2 (en) 2011-08-10 2016-11-22 Cypress Semiconductor Corporation Methods and apparatus to detect a presence of a conductive object

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8405627B2 (en) * 2010-12-07 2013-03-26 Sony Mobile Communications Ab Touch input disambiguation
KR20150073354A (ko) * 2013-12-23 2015-07-01 Samsung Electronics Co., Ltd. Method and apparatus for processing an object provided through a display

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6211856B1 (en) * 1998-04-17 2001-04-03 Sung M. Choi Graphical user interface touch screen with an auto zoom feature
US6259436B1 (en) * 1998-12-22 2001-07-10 Ericsson Inc. Apparatus and method for determining selection of touchable items on a computer touchscreen by an imprecise touch
US20020122029A1 (en) * 1999-05-20 2002-09-05 Murphy Stephen C. Computer touch screen adapted to facilitate selection of features at edge of screen
US20050190970A1 (en) * 2004-02-27 2005-09-01 Research In Motion Limited Text input system for a mobile electronic device and methods thereof

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5119079A (en) * 1990-09-17 1992-06-02 Xerox Corporation Touch screen user interface with expanding touch locations for a reprographic machine
US6073036A (en) * 1997-04-28 2000-06-06 Nokia Mobile Phones Limited Mobile station with touch input having automatic symbol magnification function

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6211856B1 (en) * 1998-04-17 2001-04-03 Sung M. Choi Graphical user interface touch screen with an auto zoom feature
US6259436B1 (en) * 1998-12-22 2001-07-10 Ericsson Inc. Apparatus and method for determining selection of touchable items on a computer touchscreen by an imprecise touch
US20020122029A1 (en) * 1999-05-20 2002-09-05 Murphy Stephen C. Computer touch screen adapted to facilitate selection of features at edge of screen
US20050190970A1 (en) * 2004-02-27 2005-09-01 Research In Motion Limited Text input system for a mobile electronic device and methods thereof

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9274698B2 (en) 2007-10-26 2016-03-01 Blackberry Limited Electronic device and method of controlling same
US10423311B2 (en) 2007-10-26 2019-09-24 Blackberry Limited Text selection using a touch sensitive screen of a handheld mobile communication device
US11029827B2 (en) 2007-10-26 2021-06-08 Blackberry Limited Text selection using a touch sensitive screen of a handheld mobile communication device
KR101472591B1 (ko) * 2008-11-14 2014-12-17 Samsung Electronics Co., Ltd. Method for selecting an area of content for enlargement, and apparatus and system for providing content
US8930848B2 (en) 2008-11-14 2015-01-06 Samsung Electronics Co., Ltd. Method for selecting area of content for enlargement, and apparatus and system for providing content
US9501168B2 (en) 2011-08-10 2016-11-22 Cypress Semiconductor Corporation Methods and apparatus to detect a presence of a conductive object
US10338739B1 (en) 2011-08-10 2019-07-02 Cypress Semiconductor Corporation Methods and apparatus to detect a presence of a conductive object
US9141280B2 (en) 2011-11-09 2015-09-22 Blackberry Limited Touch-sensitive display method and apparatus
US9383921B2 (en) 2011-11-09 2016-07-05 Blackberry Limited Touch-sensitive display method and apparatus
US9588680B2 (en) 2011-11-09 2017-03-07 Blackberry Limited Touch-sensitive display method and apparatus

Also Published As

Publication number Publication date
WO2007082290A3 (fr) 2008-04-10
GB0600548D0 (en) 2006-02-22
GB2434286A (en) 2007-07-18
GB2434286B (en) 2008-05-28

Similar Documents

Publication Publication Date Title
US10866724B2 (en) Input and output method in touch screen terminal and apparatus therefor
EP2502136B1 (fr) Procédé et appareil permettant de répliquer une fonction de touche physique avec des touches programmables dans un dispositif électronique
US9740321B2 (en) Method for operating application program and mobile electronic device using the same
US8775966B2 (en) Electronic device and method with dual mode rear TouchPad
KR101012598B1 (ko) Method and computer-readable medium for generating a display on a touch screen of a computer
TWI428812B (zh) Method for operating an application program, electronic device, storage medium, and computer program product using the method
US9875005B2 (en) Method of unlocking electronic device by displaying unlocking objects at randomized/user-defined locations and related computer readable medium thereof
US8456433B2 (en) Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel
TW201329835A (zh) Display control apparatus, display control method and computer program
JP5556398B2 (ja) Information processing apparatus, information processing method, and program
US8558806B2 (en) Information processing apparatus, information processing method, and program
US20110148776A1 (en) Overlay Handling
GB2516029A (en) Touchscreen keyboard
WO2007082290A2 (fr) User interface for a touch-screen based computing device and method therefor
WO2011152335A1 (fr) Electronic device using touch panel input, and method of operating the same
JP6217633B2 (ja) Mobile terminal device, control method for mobile terminal device, and program
KR20090056469A (ko) Apparatus and method for driving touch input on a touch screen
KR101678213B1 (ko) User interface apparatus based on detecting increase/decrease of the touch area, and control method thereof
KR102296968B1 (ko) Method for operating a favourites mode, and device including a touch screen performing the same
KR100859882B1 (ko) Method and apparatus for recognising a dual-point user input on a touch-based user input device
JP5165624B2 (ja) Information input device, object display method, and computer-executable program
US11893229B2 (en) Portable electronic device and one-hand touch operation method thereof
EP2743812B1 (fr) Procédé de sélection de plusieurs entrées sur une interface utilisateur
JP7019992B2 (ja) Display input device and image forming apparatus including the same
USRE46020E1 (en) Method of controlling pointer in mobile terminal having pointing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07710082

Country of ref document: EP

Kind code of ref document: A2