GB2470418A - Haptic information delivery - Google Patents

Haptic information delivery

Info

Publication number
GB2470418A
GB2470418A
Authority
GB
United Kingdom
Prior art keywords
haptic
user
electronic device
data items
mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0908892A
Other versions
GB0908892D0 (en)
Inventor
Preetam Heeramun
Geoffrey Fisher
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to GB0908892A priority Critical patent/GB2470418A/en
Publication of GB0908892D0 publication Critical patent/GB0908892D0/en
Publication of GB2470418A publication Critical patent/GB2470418A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/014Force feedback applied to GUI

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

An electronic device, e.g. a mobile telephone 10, has a processor, an interface screen 17, detecting means that detect when the interface screen has been touched, and haptic delivery means, e.g. piezo electric crystals, for delivering information to a user of the device by touch. The device may have different modes in which information is provided for a user either optically or haptically. In such cases, the different modes may set out the information differently, and/or the manner of navigating amongst the information may be different. In the visual mode, the amount of information that is navigated between in response to a user provided input depends upon the input provided, whereas in the haptic mode, the amount of information that is navigated between in response to a user provided navigation input is constant irrespective of the navigation input provided and may be a subset of the information provided in the visual mode. For example, during scrolling in the haptic mode, the number of data items (e.g. a mobile telephone's address book entries) that can be scrolled through does not depend on the scrolling instruction.

Description

Information delivery method and apparatus The present invention relates to the delivery of information. The invention has particular, although not exclusive, relevance to the delivery of haptic information to the user of a portable electronic device, for example to the user of a mobile telephone.
Electronic devices such as mobile telephones typically deliver information to their users by way of a viewable screen and possibly also by way of one or more speakers or earphones. However, in a number of circumstances users are unable to use a viewable screen. For example, partially sighted or blind persons may not be able to see such a screen. Also, those involved in surveillance or military operations as well as astronomers and some surgeons can work in conditions in which the presence of an illuminated screen would be highly disadvantageous. Similarly, if light levels are particularly high or variable (such as under stroboscopic lighting or under flashing emergency service lights), it can be difficult or impossible to view such a screen. Also, the use of speakers or earphones for information delivery is not always practicable as the device may be employed in an environment having high levels of ambient noise, or the user may need to be able to hear other noises - such as warning bells or sirens. Furthermore, users of electronic devices may wish to interact with their devices in a discreet manner, for example by interacting with a device in their pocket, so as to avoid drawing attention to themselves - as may be desirable when the user is in a meeting or in an area having a high level of street crime.
According to one aspect, the present invention provides an electronic device, such as a mobile telephone, having a processor and a user interface by which users can interact with the electronic device. The user interface has an interface surface, a haptic generator (such as an array of piezo electric crystals) that can deliver different haptic stimuli to different areas of the interface surface, and a detector that can detect when and where the interface surface has been activated, for example by being touched by a pointer such as a finger. The processor can output a signal that makes the haptic generator deliver a localised haptic response in the area of the interface screen that has been activated. Such an electronic device allows a user, by touch only, to differentiate between different areas of the interface surface and hence to determine which area they would like to activate.
In one embodiment, the processor can monitor whether an activation action, such as a user double tapping the interface surface, has been performed. Once the processor has determined that an activation action has been performed, the processor then executes a function associated with the area of the interface surface that has been touched. For example, an activation action performed in an area associated with an email application may cause the processor to run the email application.
Once an activation action has been performed, the processor may then reconfigure the user interface so that, if the same area were to be touched again, the haptic response that would be delivered would be different from that before reconfiguration.
For example, if the interface is configured to represent a user navigable menu with the interface surface divided into a 3x2 matrix so that each matrix element is associated with a different menu option and a different haptic response, after selection of one of the menu options, the processor could reconfigure the interface surface to represent a sub-menu in which the interface surface is divided into a 3x1 matrix with each matrix element being associated with a different sub-menu option and in which the different sub-menu options are associated with haptic responses that differ from those of the main menu.
In one embodiment, the electronic device further comprises a display, such as a Liquid Crystal Display (LCD), for optically representing data items on the interface screen. Advantageously, this enables the device to be used to provide information to a user either visually, haptically, or both. For example, if a given area of the interface surface is associated with a haptic response indicative of a phonebook application, the display may present a telephone icon in that area so that a user could determine which area of the screen to touch to open the phonebook application either by sight, or by touch, or by a combination of the two.
In one embodiment, the processor receives information from the detector to indicate where the interface screen has been touched. From this information, the processor determines an active area that has been touched and a haptic response associated with that active area before instructing the haptic generator to provide the determined response at the area of the interface screen that has been touched.
According to another aspect, the present invention provides an electronic device having a processor and an interface screen by which users can interact with the device. The interface screen has a display that can visually present information on the screen, a haptic generator that can haptically present information in response to a user interacting with the device, and a navigation input means by which users can control the information presented by the device. The device has two modes, a haptic mode in which information is haptically delivered to users, and a non-haptic mode in which information is visually delivered to users. When in the visual mode, the amount of information that is navigated between in response to a user provided input depends upon the input provided, whereas when in the haptic mode, the amount of information that is navigated between in response to a user provided navigation input is constant irrespective of the navigation input provided.
According to another aspect, the present invention provides an electronic device having a processor and an interface screen by which users can interact with the device. The interface screen has a display that can visually represent information on the screen, and a haptic generator that can deliver haptic responses to a user interacting with the device. The device has two modes, a non-haptic mode in which information is visually delivered to users, and a haptic mode in which a subset of said information is haptically delivered to users.
As those skilled in the art will appreciate, the above aspects can be implemented separately or in any combination in an electronic device. These and various other aspects of the invention will become apparent from the following detailed description of embodiments which are given by way of example only and which are described with reference to the accompanying Figures in which: Figure 1 shows a schematic representation of a mobile telephone having active areas associated with a plurality of icons; Figure 2 is a block diagram illustrating the main components of the mobile telephone shown in Figure 1; Figure 3 is a flow chart illustrating one manner in which the mobile telephone shown in Figure 1 may be operated by a user touching its screen; Figure 4 is a schematic representation of the elements that comprise the screen of the mobile telephone of Figure 1 and their grouping; Figure 5, like Figure 4, is a schematic representation of the elements that comprise the screen of the mobile telephone of Figure 1 however with a different grouping to that of Figure 4; Figure 6 is a schematic representation of the screen of the mobile telephone of Figure 1 when the information is arranged for delivery by sight; Figures 7, 8, and 9 are alternative schematic representations of the screen of the mobile telephone of Figure 1 when the information of Figure 6 is arranged for haptic delivery; Figure 10 shows the screen of Figure 8 following a downward navigation by a user; and Figure 11 shows a schematic representation of the mobile telephone of Figure 1, but having active areas that outline the icons.
Overview Figure 1 shows schematically a mobile telephone 10 having an on/off switch 12 for turning the mobile telephone 10 on and off, a microphone 14 for converting sound waves presented to the microphone 14 into electrical signals, a loudspeaker 16 for converting electrical signals into sound waves, and a user interface 13 comprising a display screen 17 having touch screen capabilities and an array of haptic elements (not shown), for example piezo electric crystals similar to those employed in the Nokia 770 Internet Tablet. The display screen 17 comprises an array of optical elements (not shown), for example LCD pixels, whose optical properties are changeable in response to supplied electrical signals so that the display screen can be used to present viewable information to users in a conventional manner. In this embodiment, the display screen's touch screen capabilities are provided by an array of detector elements (not shown), for example capacitive detectors, that are operable to detect the presence or proximity of a user provided pointer, such as a user's finger or a stylus tip. The display screen's haptic elements are operable to provide a mechanical feedback stimulus that the user can feel and that depends on the drive signal applied to the element.
In use, the display screen 17 displays a number of icons (or graphical representations) 18a-18f, each icon 18a-18f being associated with a different application or feature of the mobile telephone 10. As an example, the mobile telephone 10 may have the following icons: an address book icon 18a, an email icon 18b, a check the weather icon 18c, a clock icon 18d, a call register icon 18e and a camera icon 18f. Each icon 18a to 18f is associated with an active area 20a-20f on the screen. When the mobile telephone 10 is in a first mode (a FIND MODE), if a user touches the display screen 17, for example with their finger 22, the detector elements detect that contact and the mobile telephone 10 determines the position on the display screen 17 at which contact was made. The mobile telephone 10 then determines in which active area 20a-20f of the display screen 17 contact was made and the corresponding icon 18a-18f associated with that active area 20a-20f. A haptic drive signal associated with the determined icon 18a to 18f is then generated and sent to the haptic elements that lie within the determined active area 20 to cause them to provide a desired haptic response. For example, if when in the FIND MODE a user's finger 22 contacts the display screen 17 in the active area 20e associated with the call register icon 18e, a haptic signal associated with the call register icon 18e is generated and provided to the haptic elements underlying the active area 20e associated with the call register icon 18e. Different haptic drive signals are generated for the different displayed icons 18, so that different haptic responses will be felt by the user depending on which icon 18 the user touches. In this way, a user may, by touch only, differentiate between displayed icons and hence determine which icon they would like to activate.
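The FIND MODE behaviour described above can be sketched as a simple hit test from a touch position to an active area and its haptic drive signal. This is an illustrative sketch only, not an implementation from the patent: the area bounds, icon names and pattern identifiers are invented for the example, and touches in the dead zones between areas produce no response.

```python
# Hypothetical mapping of active areas to bounding boxes and haptic
# drive-signal identifiers (all values invented for illustration).
ACTIVE_AREAS = {
    # name: ((x0, y0, x1, y1), haptic_pattern_id)
    "address_book":  ((0, 0, 40, 30), 1),
    "email":         ((50, 0, 90, 30), 2),
    "call_register": ((0, 40, 40, 70), 3),
}

def find_mode_touch(x, y):
    """Return (icon name, haptic pattern) for a touch, or None in a dead zone."""
    for name, ((x0, y0, x1, y1), pattern) in ACTIVE_AREAS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name, pattern
    return None  # dead zone between icons: no haptic response is delivered

print(find_mode_touch(20, 50))  # touch inside the call-register area
print(find_mode_touch(45, 10))  # touch in a dead zone
```

In a real device the returned pattern identifier would select the drive signal sent to the haptic elements under the touched area.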
Once a user has determined which icon they would like to activate, they then perform an ACTIVATION ACTION to instruct the mobile telephone 10 to activate the functionality associated with that icon. Example ACTIVATION ACTIONs include: the user holding their finger 22 over the relevant active area 20a-20f for more than a predetermined time period; the user tapping their finger 22 in the relevant active area 20a-20f twice within a predetermined time period; the user moving their finger 22 away from the relevant active area 20a-20f and then back within a short time period; and the user moving their finger 22 away from the relevant active area 20a-20f and then back twice within a short time period - the latter ACTIVATION ACTION may be effected in order to differentiate between a short press and a long press on an icon 18a-18f. The time periods for the above example ACTIVATION ACTIONs may be user programmable.
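One of the ACTIVATION ACTIONs above - two taps on the same active area within a predetermined time period - can be sketched as follows. The 0.4 s window is an assumed default standing in for the user-programmable time period; the class and method names are invented for the example.

```python
import time

class TapDetector:
    """Detects the double-tap ACTIVATION ACTION on an active area."""

    def __init__(self, double_tap_window=0.4):  # assumed, user-programmable
        self.window = double_tap_window
        self.last_area = None
        self.last_time = None

    def on_tap(self, area, now=None):
        """Return True when this tap completes a double tap on `area`."""
        now = time.monotonic() if now is None else now
        activated = (area == self.last_area
                     and self.last_time is not None
                     and now - self.last_time <= self.window)
        self.last_area, self.last_time = area, now
        return activated

d = TapDetector()
print(d.on_tap("email", now=0.0))  # first tap: just a FIND MODE touch
print(d.on_tap("email", now=0.3))  # second tap within the window: activation
```

A first tap merely records state (and would trigger the FIND MODE haptic response); only a second tap on the same area inside the window reports an activation, at which point the device would enter the EXECUTE MODE.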
Once the user has found the icon that they wish to activate using the FIND MODE and then activated that icon by way of an ACTIVATION ACTION, the mobile telephone 10 enters a second mode (an EXECUTE MODE), in which it executes the action required by activating the feature associated with the icon, before returning to the FIND MODE. For example, if a user activates an address book icon 18a, the mobile telephone may generate a list of contacts and display this on the display screen 17.
Mobile Telephone Figure 2 schematically illustrates the main components of the mobile telephone 10 shown in Figure 1. As shown, the mobile telephone 10 includes a transceiver circuit 24 which is operable to transmit signals to and to receive signals from a communications network (not shown) via one or more antennae 26. As shown, the mobile telephone 10 also includes a controller 28 which controls the operation of the mobile telephone 10 and which is connected to the transceiver circuit 24 and to a loudspeaker 16, a microphone 14, an on/off switch 12, and a user interface 13 comprising optical elements 30, haptic elements 32, and detector elements 34. The controller 28 operates in accordance with software instructions stored within memory 36. As shown, these software instructions include, among other things, an operating system 39 having a find and execute module 40 to implement the above described FIND MODE and EXECUTE MODE, an active area defining module 42 for defining new active area configurations for the display screen 17, and a navigation module 43 for controlling navigation (e.g. scrolling operations). The software instructions also include an optical element interface 44, a detector element interface 46, and a haptic element interface 48, that respectively instruct the controller 28 how to interface with the optical elements 30, the detector elements 34 and the haptic elements 32. Memory 36 further stores a number of software applications 50 (such as an address book application, email application, weather application, clock application, etc.) that are executable by the controller 28, as well as lookup tables 52 for storing the haptic control signals and information associating the optical elements 30, the haptic elements 32, the detector elements 34, and/or the active areas 20a-20f, and a recent action database 54 for storing details of recent user input.
Find and execute modes Figure 3 is a flow chart illustrating the stages through which the mobile telephone 10 progresses when in the FIND MODE or the EXECUTE MODE. At S1, the controller 28 defines the active areas 20a-20f for the current user interface to be displayed.
Figure 4 shows a grid representing the haptic elements 32 of the user interface 13, and in which each haptic element is identifiable by a grid reference. For example, the haptic element 60 marked with a dot has grid reference (d,5), whereas the haptic element 62 marked with a dot and a cross has grid reference (i,4). Each active area 20a-20f is defined with respect to areas of the grid and the haptic elements 32 that lie in each area are grouped together and associated with that active area; this association is then stored in the look up tables 52. Each active area 20 is also associated with an action, for example a phonebook application, and this association is also stored in the look up tables 52. Each action is associated with a haptic response (defined by a haptic control signal) and this association (and the haptic control signal) is also stored in the look up tables 52. To enable the user to also view the active areas 20a-20f, the controller instructs optical elements 30 associated with each active area 20a-20f to display an appropriate icon 18a-18f.
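The grouping of grid-referenced haptic elements into active areas, and the associations stored in the look-up tables 52 at step S1, can be sketched as follows. The grid references follow the (column letter, row number) scheme of Figure 4; the particular column/row spans, action names and haptic signal names are invented for the example.

```python
def define_active_area(name, cols, rows, action, haptic_signal, tables):
    """Group every haptic element in the given column/row span under `name`
    and record the area's action and haptic control signal."""
    for c in cols:
        for r in rows:
            tables["element_to_area"][(c, r)] = name
    tables["area_to_action"][name] = action
    tables["area_to_haptic"][name] = haptic_signal

# Look-up tables as three associations, mirroring the description:
# element -> active area, active area -> action, active area -> haptic signal.
tables = {"element_to_area": {}, "area_to_action": {}, "area_to_haptic": {}}
define_active_area("area_20a", "abcd", range(4, 7), "phonebook", "buzz_short", tables)
define_active_area("area_20b", "hij", range(3, 6), "email", "buzz_long", tables)

print(tables["element_to_area"][("d", 5)])  # element 60 of Figure 4
print(tables["element_to_area"][("i", 4)])  # element 62 of Figure 4
print(tables["area_to_haptic"]["area_20a"])
```

Elements not covered by any span are absent from `element_to_area`, which corresponds to the dead zones between active areas described below.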
In this embodiment, not all of the haptic elements 32 are in defined active areas 20a-20f. Advantageously, by having dead zones in between active areas, users with large fingers are less likely to find themselves simultaneously receiving haptic stimuli from adjacent active areas 20a-20f.
Once the active areas for the current display screen have been defined, at step S2, the mobile telephone 10 enters the FIND MODE and the controller 28 monitors for any signal from a detector element 34. If no signal is received, then the user has not touched the user interface 13 and the controller 28 continues to monitor for a signal from a detector element 34. If a signal is received then the controller 28 determines which detector element 34 the signal was received from. The controller 28 then accesses the look up tables 52 to determine if the determined detector element 34 is associated with an active area 20a-20f. If the determined detector element 34 is not associated with an active area 20a-20f, then the controller 28 returns to S2.
Otherwise, the controller progresses to step S3. At step S3, the controller 28 checks the recent action database 54 to see whether the determined detector element 34 has been activated recently. If the determined detector element 34 has been activated recently, then the controller checks whether the previous activation in combination with the present activation indicate that an ACTIVATION ACTION has been performed.
If no ACTIVATION ACTION has been performed then the controller 28 stays in the FIND MODE and progresses to step S4 in which it accesses the look up tables 52 to determine in which active area 20a-20f the determined detector element 34 is located.
At step S5, the controller 28 determines the haptic response that is associated with the determined active area 20a-20f by retrieving the haptic control signals associated with the determined active area 20 from the lookup tables 52.
At step S6, the controller 28 applies the appropriate haptic drive signal to the haptic elements 32 associated with the desired active area 20a-20f to provide the desired haptic response to the user and records this in the recent action database 54 before returning to step S2.
However, if at step S3 it is determined that an ACTIVATION ACTION has been performed, then the controller 28 enters the EXECUTE MODE and progresses to step S7.
At step S7, the controller 28 determines which detector element 34 the ACTIVATION ACTION was performed at. The controller 28 then accesses the look up tables 52 to determine which active area 20a-20f that the determined detector element 34 is associated with and the action that is associated with that active area 20a-20f, for example the starting of the phonebook application. The controller 28 then performs the determined action at S8.
At step S9, the controller 28 check whether the active areas 20a-20f need to be redefined -as may be the case when a user is navigating between different menus.
If the active areas 20a-20f do need to be redefined, then the active areas are defined as per S1, but with different associations. For example, Figure 5 shows the display screen 17 and grid of Figure 4 having, after redefinition, three active areas 64a-c. As can be seen, although the elements (d,5) and (i,4) 60, 62 were associated with different active areas 20a, 20b in Figure 4, in Figure 5, they are both associated with the same active area 64a.
Active area reconfiguration Figure 6 shows the display screen 17 listing a number of data entries 66a-66u corresponding to entries in a user's address book. In this configuration, although the entries are clearly visible, the density of the displayed data entries 66a-66u is too great for a user to be able to differentiate between the different entries using haptics alone - because (as shown in Figure 6) the user's finger 22 is likely to cover several data entries 66r-66u. Accordingly, when the mobile telephone 10 is switched from a normal mode in which haptic responses are not provided to a haptic mode in which haptic responses are provided in the manner described above, the displayed data entries 66a-66u are preferably reconfigured to have a lower density than that of the normal (non-haptic) mode. This allows users to differentiate more easily between the displayed data entries 66a-66d by touch alone. Figure 7 shows the display screen 17 after such a reconfiguration. In this case, instead of twenty one data entries 66a-66u being present, only active areas 68a-68d corresponding to the first four data entries 66a-66d of the non-haptic mode are present and the area occupied by each active area 68a-68d is much greater than the area covered by the corresponding data entry 66a-66d in the non-haptic mode. In this embodiment, each active area 68a-68d is associated with a different haptic response so that a user can differentiate the data entries 66a-66d by touch alone. In this embodiment, and as shown in Figure 7, as well as having active areas 68a-68d corresponding to data entries 66a-66d, the display screen 17 also displays these data entries 66a-66d and associated photos 67a-67d so that users can use both vision and touch to interact with the mobile telephone 10.
Of course, the display screen 17 does not have to display four entries per page as shown in Figure 7; for example, the display screen 17 could display two entries or just one entry as illustrated respectively in Figures 8 and 9.
Navigation with Haptics When users navigate through visually presented data in a non-haptic mode, for example by scrolling through a list of entries in a mobile telephone's address book, the number of entries that the user navigates past may be determined by the magnitude of the navigation action that the user provides. For example, if a user quickly moves their finger upwards (or downwards) on the screen, then the number of entries that are navigated past may be greater than if the user were to move their finger slowly upwards (or downwards). As the user is able to look at the screen, they can see how many entries have been navigated past. However, when an electronic device is in a haptic mode, the user is unlikely to be able to see how many entries have been navigated past.
Accordingly, when an electronic device, for example the mobile telephone 10, switches from a visual mode in which data entries are identifiable and selectable by sight to a haptic mode in which data entries are individually distinguishable by touch (for example by having different haptic responses), the navigation mode switches from a visual navigation mode in which the amount of data that is navigated past depends upon the navigation actions supplied by a user, to a haptic navigation mode in which the amount of data that is navigated past for each navigation action is constant irrespective of the navigation action. This allows a user who cannot see the screen to reliably navigate through the data items. As one possibility, when in a haptic mode, for each downward scrolling action, a single data entry is removed from the top of a displayed list, the remaining entries are each moved one position up the list, and a new entry is added at the bottom of the displayed list. For example, if a downward scrolling action 72 is applied to the screen 17 of Figure 8, then the entry for Suzanne is removed from the top of the displayed list, the entry for Tony is moved up, and a new entry for Carlos is added to the bottom of the list so that the screen is as shown in Figure 10. As another possibility, for each navigation action, all presently listed entries are removed and replaced by a corresponding number of entries from an adjacent position in the list.
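The contrast between the two navigation modes can be sketched as follows: in the haptic mode every scroll action advances the window by exactly one entry regardless of the gesture's magnitude, whereas in the visual mode the step depends on the gesture. The contact names mirror Figures 8 and 10; the function names and the window size are invented for the example.

```python
CONTACTS = ["Robert", "Suzanne", "Tony", "Carlos", "Diane"]

def scroll(start, direction, gesture_magnitude, haptic_mode=True,
           total=len(CONTACTS)):
    """Return the new start index of the displayed window.

    In the haptic mode the step is constant (one entry per action);
    in the visual mode it scales with the gesture magnitude.
    """
    step = 1 if haptic_mode else gesture_magnitude
    new = start + direction * step
    return max(0, min(new, total - 1))  # clamp to the list bounds

def window(start, size=2):
    """The entries visible on a two-entry-per-page screen (as in Figure 8)."""
    return CONTACTS[start:start + size]

print(window(1))                                    # Figure 8: Suzanne, Tony
print(window(scroll(1, +1, gesture_magnitude=3)))   # Figure 10: Tony, Carlos
```

Note that a fast gesture (`gesture_magnitude=3`) still moves the haptic-mode window by only one entry, so a user who cannot see the screen always knows how far they have navigated.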
Modifications and alternatives A number of detailed embodiments have been described above. As those skilled in the art will appreciate, a number of modifications and alternatives can be made to the above embodiments whilst still benefiting from the inventions embodied therein.
By way of illustration only a number of these alternatives and modifications will now be described.
In one embodiment, instead of or as well as providing a haptic response to a user when in the haptic mode and the user touches an active area, an electronic device may have a HAPTIC OUTLINE mode in which the active areas form outline zones on the display screen 17 about displayed icons 18 so that if a user touches a zone on the screen around the edge of a displayed icon 18, they receive a haptic response associated with the displayed icon 18. Figure 11 shows the mobile telephone 10 of Figure 1, but this time with outline zones 70a-70f about each active area 20a-20f.
As one possibility, a displayed icon 18 may have different haptic responses associated with its main body and its outline.
Although the above has described display screens having gaps (or dead zones) in between adjacent active areas, as another possibility, the active areas may be contiguous and have no dead zones.
Although the above has described using a look up table for associating the optical, detector, and haptic elements with active areas, applications and haptic responses, as another possibility, the association may be parametric, for example the x,y coordinates of an element may be used in conjunction with an equation, such as x2 + y2 < 1, to determine whether a given element lies within a given active area.
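The parametric alternative can be sketched as a membership test over element coordinates, using the circular region x² + y² < 1 mentioned above. The normalisation of the element coordinates to the area's centre and radius is an assumption made for the example, as are the function and parameter names.

```python
def in_circular_area(x, y, cx, cy, radius):
    """True if element (x, y) lies within the circular active area
    centred at (cx, cy) with the given radius."""
    # Normalise so that the boundary is the unit circle, then apply
    # the parametric test x**2 + y**2 < 1 from the description.
    nx, ny = (x - cx) / radius, (y - cy) / radius
    return nx**2 + ny**2 < 1

print(in_circular_area(5, 5, cx=4, cy=5, radius=2))  # inside the area
print(in_circular_area(9, 9, cx=4, cy=5, radius=2))  # outside the area
```

Compared with the look-up table, a parametric definition trades storage for computation: no per-element entry is needed, at the cost of evaluating the inequality on each touch.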
Although the above description has only referred to the delivery of a single haptic response at any given time, if the display screen 17 is simultaneously contacted at a number of positions each corresponding to a different active area, then haptic responses may be simultaneously provided by the haptic elements associated with the different active areas.
Although the above has described a display screen 17 having optical elements 30, a person skilled in the art will appreciate that, in some embodiments, the optical elements may be omitted.
A person skilled in the art will appreciate that, although an electronic device may be switchable between a haptic mode and a non-haptic mode, the electronic device may be configured so as to normally be in a haptic mode. If an electronic device is switchable between a visual mode and a haptic mode, then upon switching to the haptic mode, the optical elements 30 and/or any backlight associated therewith may be disabled. For example, in the haptic mode, instead of activation of a key causing a backlight to illuminate the display screen 17, the backlight may be disabled.
Although the above description refers to capacitive detector elements 34, a person skilled in the art would appreciate that other types of touch screen technologies could alternatively or additionally be employed, for example, inductive, resistive, surface acoustic wave, infra-red, strain gauge, optical imaging, acoustic pulse recognition, and/or dispersive signal technologies.
A person skilled in the art will appreciate that, as an alternative to piezo electric crystals, haptic elements 32 could comprise other elements capable of delivering a haptic response, for example resistive elements operable to deliver a thermal response, or vents operable to deliver a current of air to the user's finger. When the haptic elements 32 are operable to vibrate, they may vibrate at a first frequency and with a first magnitude in order to provide a first stimulus and at a second frequency and with a second magnitude to provide a second stimulus. As one possibility, upon determination that a haptic response is to be delivered, the haptic elements 32 may be activated for a predetermined time period; alternatively the haptic elements 32 may be activated (for example they may vibrate) continuously as long as a pointer remains in contact with the relevant active area. Furthermore, the haptic elements 32 may be activated intermittently in a pattern of different activations, for example the haptic elements 32 may be activated repeatedly as per the letter 's' in Morse code (i.e. in a dit-dit-dit fashion) to provide a first stimulus and activated repeatedly as per the letter 'o' in Morse code (i.e. in a dah-dah-dah fashion) to provide a second stimulus.
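The intermittent activation patterns described above can be sketched as on/off schedules for a haptic element. The pulse durations (0.1 s dit, 0.3 s dah, 0.1 s gap) are assumed values chosen for illustration; the patent does not specify timings.

```python
# One repetition of each pattern as (state, duration) pairs: three short
# pulses for Morse 's' (dit-dit-dit), three long pulses for 'o' (dah-dah-dah).
MORSE_PATTERNS = {
    "s": [("on", 0.1), ("off", 0.1)] * 3,
    "o": [("on", 0.3), ("off", 0.1)] * 3,
}

def pattern_schedule(letter):
    """Return (state, start_time) pairs for one repetition of the pattern,
    suitable for timing a haptic element's drive signal."""
    t, schedule = 0.0, []
    for state, duration in MORSE_PATTERNS[letter]:
        schedule.append((state, round(t, 2)))  # round to avoid float drift
        t += duration
    return schedule

print(pattern_schedule("s"))
```

A driver would walk such a schedule, switching the element on and off at the listed times; distinguishable rhythms rather than distinguishable intensities let a user tell stimuli apart through a pocket.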
A person skilled in the art will appreciate that an electronic device having the above described functionality may be user programmable so that a user can program the electronic device to associate different haptic responses with different active areas or with data entries (e.g. an individual in a telephone book). For example, if the display screen 17 displays the numbers 0 to 9 as per a conventional mobile telephone keypad, the user may program the haptic response associated with key 1 to vibrate repeatedly as per the letter 's' in Morse code and program the haptic response associated with key 2 to vibrate repeatedly as per the letter 'o' in Morse code. Although it is preferable that each active area is associated with a unique haptic response, some active areas may be associated with the same haptic response - in such cases different active areas being associated with the same haptic response may be differentiable by virtue of contextual data, for example their relative position. Additional keys, such as a # key, may also be programmed to have haptic responses associated with them. The device may also be programmable so that a user can configure the arrangement and/or density of active areas and/or data entries on the display screen 17. Furthermore, not every active area need be associated with a haptic response. For example, if the display screen displays the numbers 0 to 9 as per a conventional mobile telephone keypad so that the number 5 is surrounded by other numbers on all sides, then the active area associated with the number 5 may have no haptic response associated therewith. If the electronic device has a HAPTIC OUTLINE mode, then this mode may be capable of being switched on or off in addition to or as an alternative to another haptic mode and may be programmable so that only some active areas have associated outline zones.
If the electronic device has a scrolling facility for scrolling through data entries, then the number of entries displayed per page and the number of entries that are navigated past when in a haptic mode may be user programmable.
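The mode-dependent scrolling behaviour (elaborated in claims 9 and 12 below) can be summarised as: in the non-haptic mode the step size follows the user's scrolling instruction, while in the haptic mode a fixed, user-programmable step is used regardless of the instruction. A minimal sketch, with illustrative parameter names and a default step of one entry assumed:

```python
# Sketch of mode-dependent scrolling: the navigation command's magnitude
# is honoured in the non-haptic mode but ignored in the haptic mode,
# where a fixed (user-programmable) step applies. Defaults are assumptions.

def scroll(index: int, total: int, command_step: int,
           haptic_mode: bool, haptic_step: int = 1) -> int:
    """Return the new entry index after one navigation command."""
    step = haptic_step if haptic_mode else command_step
    return max(0, min(total - 1, index + step))
```

For example, a "page down" command worth five entries would move five entries in the non-haptic mode but only the programmed single entry in the haptic mode, keeping haptic browsing one data item at a time.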
A person skilled in the art will appreciate that, although the above has used the mobile telephone 10 as an example of an electronic device, the inventions described herein could equally be applied to other electronic devices including Personal Data Assistants (PDAs), computers, web browsing devices etc.
A person skilled in the art will appreciate that any of the above embodiments may be employed either alone or in combination.
In the above embodiments, a number of software modules, interfaces and applications were described. As those skilled in the art will appreciate, the software modules, interfaces and applications may be provided in compiled or un-compiled form and may be supplied to the electronic device as a signal over a computer network or on a recording medium. Further, the functionality performed by part or all of this software may be performed instead using one or more dedicated hardware circuits.

Claims (20)

  1. An electronic device comprising: a processor; and a user interface for allowing a user to interact with the electronic device, the user interface comprising: an interface surface; a haptic generator operable to generate different haptic responses in different areas of the interface surface; and a detector operable to detect when a pointer touches the interface surface and to indicate the area of the interface surface that the pointer has touched; wherein the processor is operable to output a control signal to cause the haptic generator to generate a localised haptic response in the area touched by the pointer.
  2. The electronic device of claim 1, wherein the processor is further operable to: determine whether an activation action has been performed by the pointer touching the interface screen; and upon determination that an activation action has been performed, execute a function associated with the area touched by the pointer.
  3. The electronic device of claim 2, wherein the processor is further operable, upon execution of the function, to reconfigure the user interface so that after reconfiguration a pointer touching an area of the interface surface would receive a different haptic response to the haptic response that it would have received if it had touched the same area of the interface surface before reconfiguration.
  4. The electronic device of any of claims 1 to 3, wherein the user interface further comprises a display for optically representing data items on the interface screen.
  5. The electronic device of any of claims 1 to 4, wherein the different haptic responses that the haptic generator is operable to generate correspond to different selectable menu options.
  6. The electronic device of any of claims 1 to 5, wherein the electronic device is a mobile telephone.
  7. The electronic device of any of claims 1 to 6, wherein the haptic generator comprises an array of piezo electric crystals.
  8. The electronic device of any of claims 1 to 7, wherein the processor is operable to: receive one or more signals from the detector indicating the position touched by the pointer; identify, from the received one or more signals, an active area that has been touched; determine a haptic response associated with the identified active area; and output one or more control signals to the haptic generator to cause the haptic generator to generate the determined response in the area touched by the pointer.
  9. An electronic device comprising: a processor; and an interface screen for allowing a user to interact with the processor to control the electronic device, the interface screen comprising: a display for optically representing data items on the interface screen; a haptic generator operable to provide haptic responses to a user interacting with the data items displayed on the display; and a navigation input for allowing a user to input navigation commands to navigate between a plurality of data items so that different data items are displayed on the display; wherein the processor has a non-haptic mode of operation in which said haptic generator does not provide haptic responses to the user and a haptic mode of operation in which the haptic generator does provide said haptic responses to the user; wherein in the non-haptic mode, the processor is responsive to navigation commands received from the navigation input such that the number of data items navigated between depends upon the navigation command that is received; and wherein in the haptic mode, the processor is responsive to navigation commands received from the navigation input such that the number of data items navigated between is independent of the navigation command that is received.
  10. An electronic device comprising: a processor; and an interface screen for allowing a user to interact with the processor to control the electronic device, the interface screen comprising: a display for optically representing data items on the interface screen; and a haptic generator operable to provide haptic responses to a user interacting with the interface screen; wherein the processor has a non-haptic mode of operation in which a plurality of data items are displayed on the display and in which said haptic generator does not provide haptic responses to the user; and a haptic mode of operation in which fewer data items are displayed on the display compared with the non-haptic mode and in which the haptic generator does provide said haptic responses to the user.
  11. An electronic device comprising a user interface that is operable to provide different haptic responses to a user interacting with different parts of the interface.
  12. An electronic device comprising a user interface and having a first mode in which data items are represented optically at the user interface and the number of data items that can be scrolled through depends upon a user supplied scrolling instruction and a second mode in which, in response to user interaction, data items are represented haptically at the user interface and in which the number of data items that can be scrolled through does not depend on the scrolling instruction.
  13. A method of delivering information to a user of an electronic device, the method comprising: detecting that one of a plurality of areas of a surface of the user device has been touched; selecting one of a plurality of haptic responses based upon the touched area; and providing a localised haptic response at the touched area in accordance with the selected haptic response.
  14. The method of claim 13, further comprising: determining that an activation action has been performed; and performing a function associated with the touched area.
  15. The method of claim 14 further comprising reconfiguring the electronic device so that, if the touched area is touched again, a different haptic response will be provided at the touched area.
  16. The method of claim 13, 14, or 15, further comprising optically presenting different data items in different ones of the plurality of areas.
  17. A method of delivering information to a user of an electronic device, the method comprising: in a non-haptic mode: presenting data items optically, receiving a navigation command, and navigating between a number of data items, wherein the number depends on the navigation command; in a haptic mode: representing data items haptically in response to user interaction, receiving a navigation command, and navigating between a number of data items, wherein the number does not depend on the navigation command; and switching between the haptic and non-haptic modes.
  18. A method of delivering information to a user of an electronic device, the method comprising: in a non-haptic mode, optically representing a plurality of data entries on an interface screen of the electronic device; in a haptic mode, in response to user interaction, haptically providing a subset of the plurality of data entries; and switching between the haptic and non-haptic modes.
  19. A method of delivering information to a user of an electronic device, the method comprising, in response to a user touching different parts of the electronic device, providing at the different parts of the device different haptic responses.
  20. A computer implementable instructions product comprising computer implementable instructions for causing a programmable computer device to become configured as the electronic device of any of claims 1 to 12 or perform the method of any of claims 13 to 19.
GB0908892A 2009-05-22 2009-05-22 Haptic information delivery Withdrawn GB2470418A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0908892A GB2470418A (en) 2009-05-22 2009-05-22 Haptic information delivery


Publications (2)

Publication Number Publication Date
GB0908892D0 GB0908892D0 (en) 2009-07-01
GB2470418A true GB2470418A (en) 2010-11-24

Family

ID=40862874

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0908892A Withdrawn GB2470418A (en) 2009-05-22 2009-05-22 Haptic information delivery

Country Status (1)

Country Link
GB (1) GB2470418A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060049920A1 (en) * 2004-09-09 2006-03-09 Sadler Daniel J Handheld device having multiple localized force feedback
US20060103634A1 (en) * 2004-11-17 2006-05-18 Samsung Electronics Co., Ltd. Apparatus and method of providing fingertip haptics of visual information using electro-active polymer for image display device
US20070146316A1 (en) * 2002-01-28 2007-06-28 Sony Corporation Mobile apparatus having tactile feedback function
WO2008150600A1 (en) * 2007-06-05 2008-12-11 Immersion Corporation Method and apparatus for haptic enabled flexible touch sensitive surface
US20090002140A1 (en) * 2007-06-29 2009-01-01 Verizon Data Services, Inc. Haptic Computer Interface
US20090085878A1 (en) * 2007-09-28 2009-04-02 Immersion Corporation Multi-Touch Device Having Dynamic Haptic Effects


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2492782A1 (en) * 2011-02-28 2012-08-29 Research In Motion Limited Patterned activation of piezoelectric actuators
WO2013068793A1 (en) * 2011-11-11 2013-05-16 Nokia Corporation A method, apparatus, computer program and user interface
WO2013169279A1 (en) * 2012-05-11 2013-11-14 Touchsensor Technologies, Llc Sensory output system, apparatus and method
CN105992993A (en) * 2013-12-19 2016-10-05 詹尼弗·艾莉森·怀尔德 A user interface
CN105992993B (en) * 2013-12-19 2019-06-25 詹尼弗·艾莉森·怀尔德 User interface

Also Published As

Publication number Publication date
GB0908892D0 (en) 2009-07-01

Similar Documents

Publication Publication Date Title
KR101070111B1 (en) Hand held electronic device with multiple touch sensing devices
US9235267B2 (en) Multi touch with multi haptics
US8963882B2 (en) Multi-touch device having dynamic haptic effects
EP3101519B1 (en) Systems and methods for providing a user interface
KR101701492B1 (en) Terminal and method for displaying data thereof
KR101510738B1 (en) Apparatus and method for composing idle screen in a portable terminal
CN108279741B (en) Handheld computing device
MX2008014057A (en) Multi-function key with scrolling.
EP2383631A1 (en) Hand-held mobile device and method for operating the hand-held mobile device
US20030095105A1 (en) Extended keyboard
JP5718475B2 (en) Tactile presentation device
WO2008120049A2 (en) Method for providing tactile feedback for touch-based input device
CN114008569A (en) Method and apparatus for configuring a plurality of virtual buttons on a device
US20230359351A1 (en) Virtual keyboard processing method and related device
CN114764304A (en) Screen display method
GB2470418A (en) Haptic information delivery
KR20130063190A (en) Method and apparatus for displaying task management screen of mobile terminal comprising touch screen
US20230359279A1 (en) Feedback method and related device
EP2930604B1 (en) Causing feedback to a user input

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)