GB2355086A - Selection of displayed options, for self-service terminals - Google Patents
Selection of displayed options, for self-service terminals
- Publication number
- GB2355086A (application GB9923660A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- probe
- screen
- area
- contact
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A self-service terminal comprises a touch-sensitive screen (14) for displaying a plurality of selectable options, and touch sensing means for detecting an area of the screen that is in contact with a probe, e.g. a finger (62). The terminal further comprises control means responsive to the touch sensing means for displaying (70) on the screen an enlarged image of at least the area in contact with the probe (62) to assist selection of a selectable option (78,80,82) by a user. A displayed pointer (64) gives the user an indication of the exact area at which the probe (62) is pointing.
Description
SELF-SERVICE TERMINAL

The present invention relates to a self-service terminal (SST). In particular, the invention relates to an SST having a touch-sensitive screen.
SSTs are public access terminals that provide services for members of the public. SSTs that process cash are referred to as ATMs (automated teller machines), whereas SSTs that do not process cash are referred to as information kiosks.
SSTs are commonly sited in public areas, such as shopping centres, retail outlets, and such like. Typical services provided by SSTs include dispensing cash and providing users with information. With the increased use of the Internet, some SSTs are now Web-enabled, that is, they allow a user to browse the Internet.
A common user interface provided by SSTs is a touchscreen. As is well known in the art, a touchscreen enables a user to select areas on the display by pointing at the area using a probe, such as one of their fingers or a stylus. SST owners prefer not to provide a stylus because of the possibility of theft or vandalism of the stylus. Thus, users generally have to make selections using one of their fingers.
One problem associated with browsing the Internet at an SST is that Web pages are designed for use at a home or office computer having an accurate pointing device such as a mouse; Web pages generally include small hypertext links and other active areas not designed for touchscreen use. For a novice user of an SST, or for an SST user who is not used to selecting areas on the screen using his finger, it is very difficult to select the small hypertext links accurately. Partially-sighted users also experience problems in selecting small hypertext links.
Another problem associated with using some touchscreens is that there tends to be a difference between a point on the touchscreen and the corresponding point displayed on the screen. This difference is termed the drift. This means that the area on the screen that is pointed at by a user does not correspond exactly to the area sensed by the touchscreen. If two hypertext links are located in close proximity, a user may select one of the links but, because of drift in the touchscreen, the other link may be selected by the touchscreen.
It is among the objects of one or more embodiments of the present invention to obviate or mitigate one or more of the above disadvantages, or of other disadvantages associated with the prior art.
According to a first aspect of the present invention there is provided a self-service terminal comprising a screen for displaying a plurality of selectable options, and touch sensing means for detecting an area of the screen that is in contact with a probe, characterised in that the terminal further comprises control means responsive to the touch sensing means for displaying on the screen an enlarged image of at least the area in contact with the probe to assist selection of a selectable option by a user.
By virtue of this aspect of the invention, a user is able to see a magnified image of exactly what area of the screen the user's finger is touching, thereby enabling the user to move his finger slightly, if necessary, to select the desired selectable option. This avoids the user accidentally selecting an undesired selectable option due to, for example, the size of his finger, the angle of view, or touchscreen drift.
By providing a magnified image of at least the area in contact with the probe, the size of any hypertext links is effectively increased.
Preferably, the enlarged image includes the area in contact with the probe and the area in the immediate vicinity of the area in contact with the probe.
Preferably, the control means provides indication means for indicating what part of the enlarged image is in contact with the probe. The indication means may be in the form of a pointer. Alternatively, the indication means may be implemented by the control means highlighting the part of the enlarged image that is in contact with the probe.
This allows a user to modify the position of the probe so that the probe is located exactly where the user desires. This gives the user fine control over the placement of the probe on the screen, thereby minimising the possibility of accidental selection of an undesired option.
The control means may be operative to cease displaying the enlarged image on the screen when the probe is removed from contact with the screen. Alternatively, when the probe is removed from contact with the screen the control means may continue to display the enlarged image until the probe is re-applied to the screen and a new enlarged image is shown.
Preferably, the control means are operative to display the enlarged image on an area of the screen that is not obscured by a user's hand. Alternatively, the control means are operative to display the enlarged image on a fixed area of the screen, so that the enlarged image always appears in the same place.
Preferably, the touch sensing means are operable to select an option on removal of the probe from contact with the screen. Alternatively, an additional contact may be required to select an option.
The enlarged image may be displayed as a graphical window that is configurable by a user so that the user is able to resize the window. The window may also allow the user to select the desired magnification.
This has the advantage that the user can determine the size of the image to suit their preference or requirement.
For example, a partially-sighted person may require a larger enlarged image than a person having perfect eyesight.
The control means may be implemented in software.
The SST may be an ATM.
According to a second aspect of the invention there is provided a method of assisting a user select options at a self-service terminal having a touch sensitive screen, the method comprising the steps of: detecting an area of the screen that is in contact with a probe, and displaying on the screen an enlarged image of at least the area in contact with the probe.
Preferably, the step of displaying on the screen an enlarged image of at least the area in contact with the probe includes displaying on the screen an area in the immediate vicinity of the area in contact with the probe.
Preferably, the method includes the further step of indicating on the enlarged image what part of the image is in contact with the probe.
According to a third aspect of the invention there is provided a touch sensitive screen for displaying a plurality of selectable options, the screen including touch sensing means for detecting an area of the screen that is in contact with a probe, and control means responsive to the touch sensing means for displaying on the screen an enlarged image of at least the area in contact with the probe for assisting selection of a selectable option by a user.
According to a fourth aspect of the invention there is provided a computer program product for use with a computer having a touch sensitive screen responsive to a probe, the product comprising computer program code means, when the program is loaded, for responding to touch sensing means and for displaying on the screen an enlarged image of at least an area in contact with the probe, and for displaying indication means for indicating what part of the enlarged image is in contact with the probe, for assisting selection of a selectable option by a user.
These and other aspects of the invention will become apparent from the following specific description, given by way of example, with reference to the accompanying drawings, in which:
Fig 1 is a simplified block diagram of a self-service terminal according to one embodiment of the present invention; Fig 2 is a simplified block diagram of part of the terminal of Fig 1; Fig 3 is a pictorial representation of a user interacting with the display of Fig 1; and Fig 4 is a flowchart illustrating the steps used by the terminal of Fig 1.
Referring to Fig 1, there is shown a self-service terminal 10 having a user interface 12 comprising a screen 14 for displaying a plurality of selectable options (such as hypertext links), touch sensing means 16 in the form of a touch panel aligned with and located adjacent to the screen 14, a card receiving slot 18 and a printer slot 20.
The screen 14 has an associated display driver 22, the touch panel 16 has a touch panel driver 24, the card receiving slot 18 has an associated motorised card reader module 26, and the printer slot 20 has an associated printer module 30. All of the drivers 22, 24 and the modules 26, 30 are connected to a terminal controller 32 that controls the operation of the terminal 10. The terminal controller 32 is also connected to a network connection 34 for communicating with an IP (Internet Protocol) network 36, such as the Internet, an Intranet, or an Extranet.
Referring to Fig 2, which is a block diagram of the terminal of Fig 1 excluding the user interface 12, the terminal controller 32 includes a processor 40, associated memory 42, and storage space 44 in the form of a hard disk drive. The hard disk 44 stores the operating system for the terminal 10, the application program that controls the terminal 10, and control means in the form of a zoom-in program. The zoom-in program may be based on a conventional zoom-in tool such as ZoomIn version 3.1 available from Microsoft (trade mark) Corporation. The zoom-in program is responsive to the touch panel driver 24 for displaying on the screen 14 an enlarged image of the area in contact with a probe.
Referring to Figs 2, 3 and 4, on power-up of the SST 10, the operating system kernel 50 (Fig 2), the application program 52 (Fig 2) for controlling the terminal, and the zoom-in program 54 (Fig 2) are loaded into memory 42 (step 100 of Fig 4).
When a user 60 places a probe 62, such as his finger, in contact with the touch panel 16, the touch panel 16 senses this (step 102) and the panel driver 24 conveys co-ordinate data to the controller 32 for enabling the controller 32 to determine what pixel of the screen 14 has been touched by the user 60. The co-ordinate data is generally a single Cartesian (x, y) co-ordinate representing the pixel touched by the finger 62. As is known in the art, the touch panel driver 24 performs calculations to determine, from the plurality of pixels touched by the finger 62, what the centre pixel is; the co-ordinates of this centre pixel are sent to the controller 32.
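The centre-pixel calculation described above can be sketched as a simple integer centroid over the cluster of touched pixels. This is a hypothetical illustration, not the driver's actual algorithm, which the specification does not disclose:

```python
def centre_pixel(touched_pixels):
    """Return the centre (x, y) pixel of a set of simultaneously
    touched pixels, taken as the integer centroid of their
    co-ordinates. `touched_pixels` is a list of (x, y) tuples."""
    if not touched_pixels:
        raise ValueError("no pixels touched")
    n = len(touched_pixels)
    cx = round(sum(x for x, _ in touched_pixels) / n)
    cy = round(sum(y for _, y in touched_pixels) / n)
    return (cx, cy)

# A fingertip typically covers a small cluster of pixels:
print(centre_pixel([(10, 20), (11, 20), (12, 20), (11, 21)]))  # → (11, 20)
```

Only this single centre co-ordinate, not the full cluster, would then be conveyed to the controller.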
The zoom-in program 54 uses this single co-ordinate to instruct the display driver 22 to display a magnification window 70 (step 104) having:
(1) an enlarged image of the area of the screen in the vicinity of the pixel represented by the single co-ordinate, and (2) indication means 64, in the form of a pointer shaped like an arrow, pointing at the pixel represented by the single co-ordinate.
It will be appreciated that a conventional zoom-in application such as ZoomIn version 3.1 can be easily modified to include a pointer that points at the pixel represented by the single co-ordinate. The magnification window 70 overlies part of the main window 72 and is located so that it is not obscured by the user's hand.
The zoom-in program 54 instructs the display driver to display all of the pixels within a certain (x,y) distance of the pixel represented by the single co-ordinate so that a magnified image in the vicinity of the single co-ordinate is displayed on window 70.
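The region selection described above — all pixels within a certain (x, y) distance of the touched pixel — can be sketched as a bounds computation, clamped so the window never samples pixels outside the display. The function name and the clamping behaviour are illustrative assumptions, not details given in the specification:

```python
def magnified_region(cx, cy, radius, screen_w, screen_h):
    """Return the (left, top, right, bottom) bounds of the screen
    region around the touched pixel (cx, cy) to be enlarged in the
    magnification window, clamped to the screen edges."""
    left = max(0, cx - radius)
    top = max(0, cy - radius)
    right = min(screen_w - 1, cx + radius)
    bottom = min(screen_h - 1, cy + radius)
    return (left, top, right, bottom)

# A touch near the top-left corner of a 640x480 display:
print(magnified_region(5, 5, 20, 640, 480))  # → (0, 0, 25, 25)
```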
Thus, the zoom-in program 54 enlarges the area around the finger 62 and adds a pointer 64 that points to the pixel represented by the single co-ordinate; it does not affect the size of any touch zones within the touch panel 16, so the operation of the touch panel 16 is unaffected by the zoom-in program 54. The pointer 64 gives the user 60 an indication of the exact area of the screen 14 he is pointing at.
As for conventional "windows" on screens, the magnification window 70 can be moved by the user to any desired position on the screen 14. As the technology used for creating and manipulating graphical windows on a display is well known in the art, it will not be described in detail herein.
As shown in Fig 3, three small hypertext links 78,80,82 are displayed on screen 14. Initially, the user 60 places his finger 62 near to the desired hypertext link 80. When the user 60 touches the screen 14, the magnification window 70 is opened (step 104) and the user 60 is able to move his finger 62 to guide the pointer 64 to the desired hypertext link 80. The touch panel 16 continually monitors the position of the finger 62 to detect removal (step 106) or movement (step 108) of the finger 62.
If the pointer is moved, the touch panel driver 24 sends the new single co-ordinate, and the zoom-in program 54 uses this co-ordinate to instruct the display driver 22 to update (step 110) the contents of the magnification window 70. When the user has moved his finger 62 so that the pointer 64 is located directly above the desired hypertext link 80, the user 60 then removes his finger 62 from the touch panel 16, and the hypertext link 80 is selected (step 112) by the touch panel driver 24. The magnification window 70 is then closed by the zoom-in program 54 until the touch panel 16 is touched again (step 102).
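The flow of steps 102 to 112 can be sketched as a small event loop: contact opens the magnification window, movement refreshes it, and lifting the finger selects whatever the pointer was last over. The event representation and callback names are illustrative assumptions, not part of the specification:

```python
def run_touch_loop(events, open_window, update_window, close_window, select):
    """Drive the Fig 4 flow over a stream of touch events. Each event
    is a (kind, (x, y)) pair with kind in {'down', 'move', 'up'}."""
    last = None
    for kind, coord in events:
        if kind == "down":        # step 102: contact detected
            last = coord
            open_window(coord)    # step 104: open magnification window
        elif kind == "move":      # step 108: finger moved
            last = coord
            update_window(coord)  # step 110: refresh the enlarged image
        elif kind == "up":        # step 106: finger removed
            close_window()
            select(last)          # step 112: select option under pointer
            last = None

# Example: touch, adjust, then lift to select at the adjusted position.
calls = []
run_touch_loop(
    [("down", (10, 10)), ("move", (12, 11)), ("up", (12, 11))],
    open_window=lambda c: calls.append(("open", c)),
    update_window=lambda c: calls.append(("update", c)),
    close_window=lambda: calls.append(("close",)),
    select=lambda c: calls.append(("select", c)),
)
```

Note that selection on release, rather than on first contact, is what lets the user correct for finger size and touchscreen drift before committing.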
It will be appreciated that the above embodiment has the advantage that any drift in the touch panel 16 is corrected because the user 60 can identify exactly what point on the screen 14 the touch panel 16 is sensing. The above embodiment also has the advantage that a user 60 is provided with a magnified view of what point he is touching.
It will now be appreciated that conventional web-enabled SSTs can be updated by adding a software module (the control means) to provide an SST according to one embodiment of the invention; this enables SSTs to be upgraded in the field, that is, existing SSTs may be retro-fitted with the control means to provide an embodiment of the invention.
Various modifications may be made to the above described embodiment within the scope of the invention, for example, the screen 14 and touch panel 16 may be incorporated into a single integral touchscreen unit. In other embodiments the probe may be a stylus. In other embodiments, the control means may be implemented in hardware or firmware.
Claims (10)
1. A self-service terminal (10) comprising a screen (14) for displaying a plurality of selectable options, and touch sensing means (16) for detecting an area of the screen (14) that is in contact with a probe (62), characterised in that the terminal (10) further comprises control means (52) responsive to the touch sensing means (16) for displaying on the screen an enlarged image of at least the area in contact with the probe (62) to assist selection of a selectable option (78,80,82) by a user.
2. A terminal according to claim 1, wherein the enlarged image includes the area in contact with the probe (62) and the area in the immediate vicinity of the area in contact with the probe (62).
3. A terminal according to claim 1 or 2, wherein the control means (52) provides indication means (64) for indicating what part of the enlarged image is in contact with the probe (62).
4. A terminal according to any preceding claim, wherein the control means (52) are operative to display the enlarged image on an area of the screen that is not obscured by a user's hand.
5. A terminal according to any preceding claim, wherein the enlarged image is displayed as a graphical window that is configurable by a user so that the user is able to resize the window and to select a desired magnification.
6. A method of assisting a user select options at a self-service terminal having a touch sensitive screen, the method comprising the steps of: detecting (step 102) an area of the screen that is in contact with a probe, and displaying (step 104) on the screen an enlarged image of at least the area in contact with the probe.
7. The method of claim 6, wherein the step of displaying on the screen an enlarged image of at least the area in contact with the probe includes displaying on the screen an area in the immediate vicinity of the area in contact with the probe.
8. The method of claim 6 or 7, wherein the method includes the further step of indicating on the enlarged image what part of the image is in contact with the probe.
9. A touch sensitive screen for displaying a plurality of selectable options, the screen including touch sensing means (16) for detecting an area of the screen that is in contact with a probe, and control means responsive to the touch sensing means for displaying on the screen an enlarged image of at least the area in contact with the probe for assisting selection of a selectable option by a user.
10. A computer program product (52) for use with a computer (32) having a touch sensitive screen (14,16) responsive to a probe (62), the product (52) comprising computer program code means, when the program is loaded, for responding to touch sensing means (16) and for displaying on the screen (14) an enlarged image of at least an area in contact with the probe (62), and for displaying indication means (64) for indicating what part of the enlarged image is in contact with the probe (62), for assisting selection of a selectable option by a user.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9923660A GB2355086A (en) | 1999-10-06 | 1999-10-06 | Selection of displayed options, for self-service terminals |
PCT/GB2000/001997 WO2000075766A1 (en) | 1999-06-02 | 2000-05-24 | Self-service terminal |
AU50878/00A AU5087800A (en) | 1999-06-02 | 2000-05-24 | Self-service terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9923660A GB2355086A (en) | 1999-10-06 | 1999-10-06 | Selection of displayed options, for self-service terminals |
Publications (2)
Publication Number | Publication Date |
---|---|
GB9923660D0 GB9923660D0 (en) | 1999-12-08 |
GB2355086A true GB2355086A (en) | 2001-04-11 |
Family
ID=10862244
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9923660A Withdrawn GB2355086A (en) | 1999-06-02 | 1999-10-06 | Selection of displayed options, for self-service terminals |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2355086A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1881396A2 (en) * | 2006-05-23 | 2008-01-23 | LG Electronics Inc. | Controlling pointer movements on a touch sensitive screen of a mobile terminal |
EP2071440A1 (en) * | 2007-12-14 | 2009-06-17 | Siemens Aktiengesellschaft | Dynamic repositioning of a drop down page/window |
EP2234008A1 (en) * | 2009-03-23 | 2010-09-29 | Square Enix Co., Ltd. | Portable game machine with touch panel display |
CN101661346B (en) * | 2008-08-29 | 2012-06-06 | 佛山市顺德区顺达电脑厂有限公司 | Intelligent touch-control device and method |
CN103150054A (en) * | 2013-03-08 | 2013-06-12 | 山西大学 | Touch screen equipment and data input method for combining touch with mouse |
EP3040839A1 (en) * | 2014-12-31 | 2016-07-06 | Dassault Systèmes | Selection of a graphical element |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0476972A2 (en) * | 1990-09-17 | 1992-03-25 | Xerox Corporation | Touch screen user interface with expanding touch locations for a reprographic machine |
WO1998052118A1 (en) * | 1997-05-15 | 1998-11-19 | Sony Electronics, Inc. | Display of menu items on a computer screen |
- 1999-10-06: GB application GB9923660A filed; published as GB2355086A, not active (withdrawn)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0476972A2 (en) * | 1990-09-17 | 1992-03-25 | Xerox Corporation | Touch screen user interface with expanding touch locations for a reprographic machine |
WO1998052118A1 (en) * | 1997-05-15 | 1998-11-19 | Sony Electronics, Inc. | Display of menu items on a computer screen |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1881396A2 (en) * | 2006-05-23 | 2008-01-23 | LG Electronics Inc. | Controlling pointer movements on a touch sensitive screen of a mobile terminal |
EP1881396A3 (en) * | 2006-05-23 | 2009-02-11 | LG Electronics Inc. | Controlling pointer movements on a touch sensitive screen of a mobile terminal |
US8274482B2 (en) | 2006-05-23 | 2012-09-25 | Lg Electronics Inc. | Controlling pointer movements on a touch sensitive screen of a mobile terminal |
EP2071440A1 (en) * | 2007-12-14 | 2009-06-17 | Siemens Aktiengesellschaft | Dynamic repositioning of a drop down page/window |
CN101661346B (en) * | 2008-08-29 | 2012-06-06 | 佛山市顺德区顺达电脑厂有限公司 | Intelligent touch-control device and method |
EP2234008A1 (en) * | 2009-03-23 | 2010-09-29 | Square Enix Co., Ltd. | Portable game machine with touch panel display |
EP2293178A3 (en) * | 2009-03-23 | 2013-01-16 | Square Enix Co., Ltd. | Portable game machine with touch panel display |
CN103150054A (en) * | 2013-03-08 | 2013-06-12 | 山西大学 | Touch screen equipment and data input method for combining touch with mouse |
CN103150054B (en) * | 2013-03-08 | 2016-06-01 | 山西大学 | The data entry device that a kind of touch panel device and touch combine with mouse |
EP3040839A1 (en) * | 2014-12-31 | 2016-07-06 | Dassault Systèmes | Selection of a graphical element |
US11061502B2 (en) | 2014-12-31 | 2021-07-13 | Dassault Systemes | Selection of a graphical element with a cursor in a magnification window |
Also Published As
Publication number | Publication date |
---|---|
GB9923660D0 (en) | 1999-12-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2000075766A1 (en) | Self-service terminal | |
US11481105B2 (en) | Remote interfacing with a networked dialysis system | |
US6727892B1 (en) | Method of facilitating the selection of features at edges of computer touch screens | |
US7245293B2 (en) | Display unit with touch panel and information processing method | |
US6961912B2 (en) | Feedback mechanism for use with visual selection methods | |
US10055046B2 (en) | Touch-sensitive electronic apparatus for media applications, and methods therefor | |
US8797363B2 (en) | Method of controlling touch panel display device and touch panel display device using the same | |
US7304638B2 (en) | Computer touch screen adapted to facilitate selection of features at edge of screen | |
EP2657811B1 (en) | Touch input processing device, information processing device, and touch input control method | |
US10282081B2 (en) | Input and output method in touch screen terminal and apparatus therefor | |
US20020080123A1 (en) | Method for touchscreen data input | |
EP1803056A2 (en) | Methods and systems for converting touchscreen events into application formatted data | |
EP1621989A2 (en) | Touch-sensitive electronic apparatus for media applications, and methods therefor | |
WO2006013485A2 (en) | Pressure-controlled navigating in a touch screen | |
JP2008516335A5 (en) | ||
JP3319647B2 (en) | Character input device | |
WO2003012618A2 (en) | Sensor-based menu for a touch screen panel | |
US20100328232A1 (en) | Touch Screen Cursor Presentation Preview Window | |
WO2011022014A1 (en) | Configuration of additional display devices | |
JP2002328040A (en) | Navigation system, information displaying device, method for changing scale of image, recording medium and program | |
JP2000137564A (en) | Picture operating device and its method | |
GB2355086A (en) | Selection of displayed options, for self-service terminals | |
JP2995719B2 (en) | Pen input / menu display device | |
US20070094614A1 (en) | Data processing device | |
JPH11175212A (en) | Touch operation processing method for touch panel device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |