WO2013068793A1 - Method, apparatus, computer program and user interface - Google Patents

Method, apparatus, computer program and user interface

Info

Publication number
WO2013068793A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch sensitive
sensitive display
user interface
user
mode
Prior art date
Application number
PCT/IB2011/055048
Other languages
English (en)
Inventor
Seppo Kalervo PUOLITAIVAL
Original Assignee
Nokia Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Corporation filed Critical Nokia Corporation
Priority to PCT/IB2011/055048 (critical): WO2013068793A1
Publication of WO2013068793A1 (critical)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 2250/00: Details of telephonic subscriber devices
    • H04M 2250/22: Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 2250/00: Details of telephonic subscriber devices
    • H04M 2250/56: Details of telephonic subscriber devices including a user help function

Definitions

  • Embodiments of the present disclosure relate to a method, apparatus, computer program and user interface.
  • In particular, they relate to a method, apparatus, computer program and user interface which are convenient for a user to use without viewing the apparatus.
  • Apparatus comprising touch sensitive displays which enable the user to control the apparatus are well known.
  • Apparatus such as mobile telephones, tablet computers or satellite navigation apparatus may have a user interface which comprises a touch sensitive display.
  • Problems may arise if the user needs to actuate the touch sensitive display but is unable to view the display. For instance, if the user is carrying out more than one task simultaneously they might not be able to view the apparatus properly. As an example, if the user is driving or walking while using the apparatus they would need to be looking at where they are driving or walking rather than at the apparatus. Alternatively, a user may be visually impaired and might not be capable of easily distinguishing visually between the items on the touch sensitive display.
  • A method comprising: providing a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and configuring the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
  • The non-visual output may comprise an audio output. In some embodiments the non-visual output may comprise a tactile output.
  • The touch sensitive display may be configured in the second mode of operation in response to a user input.
  • The user input may comprise actuating a designated portion of the touch sensitive display.
  • The user input may comprise actuating the designated portion of the touch sensitive display simultaneously with actuating the portion of the touch sensitive display in which the user interface items are displayed.
  • In some embodiments the function associated with the simultaneously actuated user interface item may be enabled.
  • In other embodiments the function associated with the simultaneously actuated user interface item might not be enabled.
  • The designated portion of the touch sensitive display may comprise a corner of the touch sensitive display. In some embodiments a plurality of designated portions of the touch sensitive display may be provided.
  • The user interface items may comprise user selectable icons.
  • The plurality of user selectable icons may be provided in a different portion of the touch sensitive display to the designated area.
  • The method may further comprise configuring the touch sensitive display back into the first mode of operation in response to a further user input.
  • An apparatus comprising: at least one processor; and at least one memory including computer program code; wherein the at least one memory and the computer program code are configured to, with the at least one processor, enable the apparatus to: provide a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and configure the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
  • The non-visual output may comprise an audio output. In some embodiments the non-visual output may comprise a tactile output.
  • The at least one memory and the computer program code may be configured to, with the at least one processor, enable the apparatus to detect a user input and configure the touch sensitive display in the second mode of operation in response to the detection of the user input.
  • The user input may comprise actuating a designated portion of the touch sensitive display.
  • The user input may comprise actuating the designated portion of the touch sensitive display simultaneously with actuating the portion of the touch sensitive display in which the user interface items are displayed.
  • The at least one memory and the computer program code may be configured to, with the at least one processor, enable the apparatus to detect when the user ends the actuation of the designated portion of the touch sensitive display, and in response to detecting the end of the actuation of the designated portion of the touch sensitive display, enable the function associated with the simultaneously actuated user interface item to be performed.
  • The at least one memory and the computer program code may be configured to, with the at least one processor, enable the apparatus to detect when the user ends the actuation of the designated portion of the touch sensitive display, wherein in response to detecting the end of the actuation of the designated portion of the touch sensitive display, the function associated with the simultaneously actuated user interface item is not performed.
  • The designated portion of the touch sensitive display may comprise a corner of the touch sensitive display. In some embodiments a plurality of designated portions of the touch sensitive display may be provided.
  • The user interface items may comprise user selectable icons.
  • The plurality of user selectable icons may be provided in a different portion of the touch sensitive display to the designated area.
  • The at least one memory and the computer program code may be configured to, with the at least one processor, enable the apparatus to detect a further user input and configure the touch sensitive display back into the first mode of operation in response to the further user input.
  • A computer program comprising computer program instructions that, when executed by at least one processor, enable an apparatus at least to perform: providing a plurality of user interface items on a portion of a touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and configuring the touch sensitive display into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
  • There may also be provided a computer program comprising program instructions for causing a computer to perform the method as described above.
  • There may also be provided an electromagnetic carrier signal carrying the computer program as described above.
  • A user interface comprising: a touch sensitive display wherein the touch sensitive display is configured to: provide a plurality of user interface items on a portion of the touch sensitive display wherein in a first mode of operation actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and when configured in a second mode of operation cause a non-visual output indicative of the function associated with a user interface item in response to actuation of an area of the touch sensitive display in which a user interface item is displayed without enabling the function.
  • The non-visual output may comprise an audio output.
  • The non-visual output may comprise a tactile output.
  • The touch sensitive display may be configured in the second mode of operation in response to a user input.
  • The user input may comprise actuating a designated portion of the touch sensitive display.
  • The apparatus may be for wireless communication.

BRIEF DESCRIPTION
  • Fig. 1 schematically illustrates an apparatus according to an exemplary embodiment of the disclosure.
  • Fig. 2 schematically illustrates an apparatus according to another exemplary embodiment of the disclosure.
  • Figs. 3A and 3B are block diagrams which schematically illustrate methods according to an exemplary embodiment of the disclosure.
  • Figs. 4A to 4D illustrate graphical user interfaces according to an exemplary embodiment of the disclosure.
  • The Figures illustrate a method comprising: providing a plurality of user interface items 53 on a portion of a touch sensitive display 15 wherein in a first mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a function associated with the user interface item 53 to be performed; and configuring the touch sensitive display 15 into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
  • Fig. 1 schematically illustrates an apparatus 1 according to an embodiment of the disclosure.
  • The apparatus 1 may be an electronic apparatus.
  • The apparatus 1 may be, for example, a mobile cellular telephone, a tablet computer, a personal computer, a camera, a gaming device, a personal digital assistant, a personal music player, an electronic reader or any other apparatus which may enable a user to make inputs via a touch sensitive display.
  • The apparatus 1 may be a handheld apparatus 1 which can be carried in a user's hand, handbag or jacket pocket, for example.
  • The apparatus 1 may comprise additional features that are not illustrated.
  • The apparatus 1 may comprise one or more transmitters and receivers.
  • The apparatus 1 illustrated in Fig. 1 comprises a user interface 13 and a controller 4.
  • The controller 4 comprises at least one processor 3 and at least one memory 5, and the user interface 13 comprises a touch sensitive display 15 and a user input device 17.
  • The controller 4 provides means for controlling the apparatus 1.
  • The controller 4 may be implemented using instructions that enable hardware functionality, for example, by using executable computer program instructions 11 in one or more general-purpose or special-purpose processors 3 that may be stored on a computer readable storage medium 23 (e.g. disk, memory etc.) to be executed by such processors 3.
  • The controller 4 may be configured to control the apparatus 1 to perform functions.
  • The functions may comprise, for example, communications functions such as telephone calls, email services or messages such as SMS (short message service) messages, MMS (multimedia message service) messages or instant messages, or access to social networking applications or any other communications functions, and media functions such as access to image capturing devices, stored images or videos or data files, access to music or other audio files or any other media functions.
  • Other functions may include games or calendar applications or access to services such as satellite navigation systems.
  • The controller 4 may also be configured to enable the apparatus 1 to perform a method comprising: providing a plurality of user interface items on a portion of a touch sensitive display 15 wherein in a first mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a function associated with the user interface item 53 to be performed; and configuring the touch sensitive display 15 into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a non-visual output indicative of the function associated with a user interface item 53 to be provided without enabling the function.
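Purely as an illustration of the dispatch decision described above, a minimal Kotlin sketch of the two modes of operation follows. The names Mode, UiItem, NonVisualOutput and TouchDispatcher are introduced for this sketch only; they are not taken from the disclosure, and the disclosure is not limited to this structure.

```kotlin
// Hypothetical types; the disclosure does not name these.
enum class Mode { FIRST, SECOND }

data class UiItem(
    val label: String,          // e.g. "Music player"
    val action: () -> Unit      // function performed in the first mode
)

interface NonVisualOutput {
    fun announce(description: String)   // audio and/or tactile feedback
}

class TouchDispatcher(private val feedback: NonVisualOutput) {
    var mode: Mode = Mode.FIRST          // the first mode is the default

    // Called when the area of the display showing [item] is actuated.
    fun onItemActuated(item: UiItem) {
        when (mode) {
            // First mode: perform the function associated with the item.
            Mode.FIRST -> item.action()
            // Second mode: describe the function without performing it.
            Mode.SECOND -> feedback.announce(item.label)
        }
    }
}
```

In the first mode the actuated item's associated function runs; in the second mode only a description of that function is handed to the non-visual output.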
  • The at least one processor 3 is configured to receive input commands from the user interface 13 and also to provide output commands to the user interface 13.
  • The at least one processor 3 is also configured to write to and read from the at least one memory 5. Outputs of the user interface 13 may be provided as inputs to the controller 4.
  • The user input device 17 provides means for enabling a user of the apparatus 1 to input information which may be used to control the apparatus 1.
  • The user input device 17 may comprise any means which enables a user to input information into the apparatus 1.
  • The user input device 17 may comprise a touch sensitive display 15 or a portion of a touch sensitive display 15, a key pad, an accelerometer or other means configured to detect orientation and/or movement of the apparatus 1, audio input means which enable an audio input signal to be detected and converted into a control signal for the controller 4, or a combination of different types of user input devices.
  • The touch sensitive display 15 may be actuated by a user contacting the surface of the touch sensitive display 15 with an object such as their finger or a stylus. A user may contact the surface of the touch sensitive display 15 by physically touching the surface of the touch sensitive display 15 with an object or by bringing the object close enough to the surface to activate the sensors of the touch sensitive display 15.
  • The touch sensitive display 15 may comprise a capacitive touch sensitive display, a resistive touch sensitive display or any other suitable means for detecting a touch input.
  • The touch sensitive display 15 may comprise any means which enables information to be displayed to a user of the apparatus 1.
  • The information which is displayed may comprise a plurality of user interface items 53.
  • The user interface items may enable the user of the apparatus 1 to control the apparatus 1.
  • The user interface items 53 may comprise user selectable icons 61.
  • The user selectable icons 61 may be associated with functions of the apparatus 1 so that user selection of the icon causes the associated function to be performed.
  • Where the user interface 13 comprises a touch sensitive display 15, the user selectable icons 61 may be selected by actuating the area of the touch sensitive display 15 in which the user selectable icon 61 is displayed.
  • The touch sensitive display 15 may be configured to display graphical user interfaces 51 as illustrated in Figs. 4A to 4D.
  • The at least one memory 5 is configured to store a computer program 9 comprising computer program instructions 11 that control the operation of the apparatus 1 when loaded into the at least one processor 3.
  • The computer program instructions 11 provide the logic and routines that enable the apparatus 1 to perform the exemplary methods illustrated in Figs. 3A and 3B.
  • The at least one processor 3, by reading the at least one memory 5, is able to load and execute the computer program 9.
  • The computer program instructions 11 may provide computer readable program means configured to control the apparatus 1.
  • The program instructions 11 may provide, when loaded into the controller 4: means for providing a plurality of user interface items 53 on a portion of a touch sensitive display 15 wherein in a first mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a function associated with the user interface item 53 to be performed; and means for configuring the touch sensitive display 15 into a second mode of operation wherein in the second mode of operation actuation of an area of the touch sensitive display 15 in which a user interface item 53 is displayed causes a non-visual output indicative of the function associated with a user interface item 53 to be provided without enabling the function.
  • The computer program 9 may arrive at the apparatus 1 via any suitable delivery mechanism 21.
  • The delivery mechanism 21 may comprise, for example, a computer-readable storage medium, a computer program product 23, a memory device, a record medium such as a CD-ROM or DVD, or an article of manufacture that tangibly embodies the computer program 9.
  • The delivery mechanism may be a signal configured to reliably transfer the computer program 9.
  • The apparatus 1 may propagate or transmit the computer program 9 as a computer data signal.
  • The memory 5 may comprise a single component or it may be implemented as one or more separate components, some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage.
  • References to 'computer-readable storage medium', 'computer program product', 'tangibly embodied computer program' etc. or a 'controller', 'computer', 'processor' etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (e.g. Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application specific integrated circuits (ASIC), signal processing devices and other devices. References to computer program, instructions, code etc.
  • Fig. 2 illustrates an apparatus 1' according to another embodiment of the disclosure.
  • The apparatus 1' illustrated in Fig. 2 may be a chip or a chip-set.
  • The apparatus 1' comprises at least one processor 3 and at least one memory 5 as described above in relation to Fig. 1.
  • Figs. 3A and 3B schematically illustrate methods according to exemplary embodiments of the disclosure which may be performed by the apparatus 1 as illustrated in Figs. 1 and 2.
  • Fig. 3A illustrates an exemplary method of configuring an apparatus 1 into a mode of operation which enables a user to control the apparatus 1 without having to view the apparatus 1.
  • The touch sensitive display 15 is configured in a first mode of operation.
  • The first mode of operation may be a default mode of operation.
  • A plurality of user interface items 53 are displayed on the touch sensitive display 15.
  • The user interface items 53 are associated with functions of the apparatus 1 so that actuation of the area of the touch sensitive display 15 in which a user interface item 53 is displayed causes the function associated with that user interface item 53 to be performed.
  • The actuation required to cause a function to be performed may be a specific type of user input; for example, it may comprise actuating the area in which the user interface item 53 is displayed for at least a minimum period of time or making multiple inputs in the same area, for example a tap input, a double tap input, a trace or swipe gesture input, a flick gesture input or any other suitable input.
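As a sketch only, one way such actuation types could be distinguished from touch-down and touch-up timestamps is shown below in Kotlin. The class and threshold values are assumptions for illustration; the disclosure does not specify how actuations are classified.

```kotlin
// Illustrative classification of an actuation; thresholds are assumed values.
sealed class Actuation {
    object Tap : Actuation()
    object DoubleTap : Actuation()
    object LongPress : Actuation()
}

class ActuationClassifier(
    private val longPressMs: Long = 500,     // assumed minimum hold time for a long press
    private val doubleTapGapMs: Long = 300   // assumed maximum gap between two taps
) {
    private var lastTapUpAt: Long = -1L

    fun classify(downAt: Long, upAt: Long): Actuation {
        val result = when {
            upAt - downAt >= longPressMs -> Actuation.LongPress
            lastTapUpAt >= 0 && upAt - lastTapUpAt <= doubleTapGapMs -> Actuation.DoubleTap
            else -> Actuation.Tap
        }
        // Remember the release time of plain taps so a quick second tap
        // can be recognised as a double tap.
        lastTapUpAt = if (result == Actuation.Tap) upAt else -1L
        return result
    }
}
```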
  • The user input may comprise any input which is made by the user and which is detected by the user input means 17.
  • The user input may be an input which a user can make without viewing the apparatus 1.
  • For example, it may comprise physical actuation of a portion of the apparatus 1 which is easy for the user to find without viewing the apparatus 1.
  • The user input may comprise actuation of a designated portion 55, 59 of the touch sensitive display 15.
  • The designated portion 55, 59 of the touch sensitive display 15 may comprise an edge portion or a corner portion of the touch sensitive display 15 which may be easy for a user to find without viewing the apparatus 1.
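A minimal sketch of hit-testing a touch against such edge or corner regions is given below, assuming the layout shown later in Figs. 4A to 4D (portions in the top left, top right and top centre). The region size and all names are illustrative assumptions; the disclosure does not specify dimensions.

```kotlin
data class Point(val x: Float, val y: Float)

enum class DesignatedPortion { TOP_LEFT, TOP_RIGHT, TOP_CENTRE }

// Returns the designated portion hit by [touch], or null if none was hit.
// regionSize is an assumed size in pixels.
fun hitDesignatedPortion(
    touch: Point,
    displayWidth: Float,
    regionSize: Float = 80f
): DesignatedPortion? = when {
    touch.y > regionSize -> null                                          // not along the top edge
    touch.x <= regionSize -> DesignatedPortion.TOP_LEFT                   // first designated portion 55
    touch.x >= displayWidth - regionSize -> DesignatedPortion.TOP_RIGHT   // second designated portion 59
    kotlin.math.abs(touch.x - displayWidth / 2f) <= regionSize / 2f ->
        DesignatedPortion.TOP_CENTRE                                      // third designated portion 57
    else -> null
}
```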
  • Alternatively, the user input may comprise actuation of a key.
  • The key may be provided in a location so that it is easy for a user to find without looking at the apparatus 1; for example, it may be located on a side of the apparatus 1 and not positioned adjacent to or in proximity to any other keys.
  • In other embodiments the user input may comprise a different type of input; for example, it may comprise an audio input which may enable voice control of the apparatus 1.
  • The user input may comprise an input which may be made to any part of the apparatus 1, for example a shaking or movement of the apparatus 1 which may be detected by an accelerometer.
  • The user input means 17 provides an output signal to the controller 4 indicative of the detected input, and at block 35 the apparatus 1 is configured into a second mode of operation.
  • In the second mode of operation the touch sensitive display 15 is configured so that actuation of the area of the touch sensitive display 15 in which a user interface item 53 is displayed does not cause the function associated with that user interface item to be performed. Instead a non-visual output is provided indicative of the function associated with the user interface item 53.
  • The non-visual output may comprise any output which can be detected by the user without viewing the apparatus 1.
  • The non-visual output may comprise an audio output.
  • The audio output may comprise an acoustic signal which may be provided by an audio output device such as a loudspeaker.
  • The acoustic signal may comprise a pressure wave which may be detected by the user, for example by the user's ear.
  • The audio output device may be configured to provide the audio output in response to an electrical input signal provided by the controller 4.
  • The electrical input signal may contain information indicative of the audio output which is to be provided.
  • The non-visual output may comprise a tactile output.
  • The tactile output may be provided instead of or in addition to an audio output.
  • The tactile output may comprise any output which the user of the apparatus 1 may sense through touch.
  • The tactile output may comprise a raised portion or an indented portion of the touch sensitive display 15, a change in the texture of the surface or part of the touch sensitive display 15, or a vibration of the apparatus 1 or part of the apparatus 1.
  • The tactile output may be provided as localized projections or indentations in the surface of the touch sensitive display 15.
  • The projections and indentations may be provided by any suitable means such as a layer of electroactive polymer, a mechanical or fluid pumping system or a piezoelectric transducer.
  • In other embodiments the tactile output may be provided by providing a stimulus such as an electrical impulse to the user rather than making a physical modification to the surface of the touch sensitive display 15 or any other part of the apparatus 1.
  • For example, a current may be provided to the touch sensitive display 15 to change the electrical charge of the touch sensitive display 15. The user may be able to sense this through their skin and so be provided with a tactile output.
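As an illustrative sketch, two possible implementations of the NonVisualOutput interface from the earlier dispatch example are shown below: one audible and one tactile. TextToSpeechEngine and Vibrator are hypothetical stand-ins for platform services, and the mapping from items to vibration pulses is an assumption, not something specified by the disclosure.

```kotlin
interface TextToSpeechEngine { fun speak(text: String) }
interface Vibrator { fun pulse(durationMs: Long) }

class AudioOutput(private val tts: TextToSpeechEngine) : NonVisualOutput {
    override fun announce(description: String) {
        // Speak the name of the function associated with the actuated item.
        tts.speak(description)
    }
}

class TactileOutput(private val vibrator: Vibrator) : NonVisualOutput {
    // Assumed mapping: a distinct pulse count per item description, so items
    // can be told apart by touch alone.
    private val pulseCounts = mutableMapOf<String, Int>()

    override fun announce(description: String) {
        val pulses = pulseCounts.getOrPut(description) { pulseCounts.size + 1 }
        repeat(pulses) { vibrator.pulse(durationMs = 40) }
    }
}
```

The two implementations could also be combined, since the tactile output may be provided in addition to the audio output.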
  • Fig. 3B illustrates an exemplary method in which the user actuates the touch sensitive display 15 when the apparatus 1 is configured in the second mode of operation as described above.
  • The user actuates the area of the touch sensitive display 15 in which a user interface item 53 is displayed.
  • The user interface item 53 may be normally associated with a function such that in a normal or default mode of operation actuation of the area in which the user interface item 53 is displayed causes the function to be performed.
  • An output signal is provided to the controller 4 indicative of the user interface item 53 which has been actuated.
  • The controller 4 controls the apparatus 1 to provide a non-visual output indicative of the function which is normally associated with the user interface item.
  • The non-visual output may comprise an audio output and/or a tactile output.
  • The non-visual output may provide sufficient information to the user of the apparatus 1 to enable them to determine the function normally associated with the user interface item 53 without having to view the apparatus 1.
  • The non-visual output indicative of the function is provided without causing the function itself to be performed. If the user wishes to enable the function to be performed they may need to make a further user input.
  • For example, the touch sensitive display may be configured back into the first mode of operation in response to a further user input.
  • Alternatively, a user may be able to make a user input to indicate that they wish the function indicated by the non-visual output to be performed.
  • Figs. 4A to 4D illustrate graphical user interfaces 51 according to an exemplary embodiment of the disclosure.
  • The graphical user interfaces 51 may be displayed on a touch sensitive display 15 of an apparatus 1 as described above.
  • The graphical user interface 51 comprises a plurality of user interface items 53.
  • The plurality of graphical user interface items 53 may comprise any item displayed on the touch sensitive display 15.
  • The plurality of user interface items 53 comprises a plurality of user selectable icons 61.
  • The plurality of user selectable icons 61 comprise graphical items which are associated with functions of the apparatus 1 such that, in a normal mode of operation, actuation of the area in which the respective user selectable icon is displayed causes the associated function to be performed.
  • The plurality of user selectable icons 61 may be arranged in a menu structure or on a home screen. The user may be able to navigate through the plurality of user selectable icons 61 by scrolling through the menu structure or home screen.
  • The plurality of user selectable icons 61 which are provided may be dynamic, that is, they are not necessarily provided in a fixed position on the touch sensitive display 15.
  • The user selectable icons 61 which are provided, and their respective positions, may depend on a plurality of factors such as the mode of operation of the apparatus 1, the position of the menu or home screen to which the user has scrolled or navigated, or the functions which are currently available on the apparatus. This may make it difficult for a user to locate a specific one of the plurality of user selectable icons 61 without viewing the touch sensitive display 15.
  • The plurality of user interface items 53 also comprises a plurality of designated portions 55, 57, 59 of the touch sensitive display 15.
  • In the illustrated embodiment three designated portions 55, 57, 59 are provided.
  • A first designated portion 55 is provided in the upper left hand corner of the touch sensitive display 15, a second designated portion 59 is provided in the upper right hand corner of the touch sensitive display 15, and a third designated portion 57 is provided in the centre of the upper side of the touch sensitive display 15.
  • The designated portions 55, 57, 59 may be fixed with respect to the edge of the touch sensitive display 15 so that they do not move even when the user scrolls through the plurality of user selectable icons 61.
  • In other embodiments the designated portions 55, 57, 59 may be located in any portion of the display 15 or even on the housing of the apparatus 1.
  • The first and second designated portions 55, 59 enable the touch sensitive display 15 to be configured into a second mode of operation, as described in more detail below.
  • The third designated portion 57 is provided in the centre of the upper side of the touch sensitive display 15 and is associated with a reader function such that actuation of the third designated portion 57 causes the apparatus 1 to provide an audio output indicating all the user interface items currently displayed on the touch sensitive display 15. This may be an advantageous feature if the user of the apparatus 1 is visually impaired.
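A short sketch of such a reader behaviour, reusing the hypothetical UiItem and TextToSpeechEngine types from the earlier examples, might look as follows. It is an illustration of the idea only, not the disclosed implementation.

```kotlin
// Announce every user interface item currently shown on the display.
fun readOutDisplayedItems(items: List<UiItem>, tts: TextToSpeechEngine) {
    if (items.isEmpty()) {
        tts.speak("No items are currently displayed")
        return
    }
    tts.speak("${items.size} items are currently displayed")
    for (item in items) {
        tts.speak(item.label)
    }
}
```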
  • Other user interface items 53, which are not illustrated in Figs. 4A to 4D, may also be provided.
  • For example, user interface items 53 may be provided which give information to a user but do not cause a function to be performed; a graphical item may be provided indicating the remaining power available in a power source, the available signal strength or the status of a communication link.
  • The plurality of user selectable icons 61 are provided in a different area of the touch sensitive display 15 to the designated portions 55, 57, 59. This may make it easier for a user to locate the respective user interface items 53.
  • In Fig. 4A the touch sensitive display 15 is configured in the first mode of operation, so that if a user were to actuate the area in which any of the plurality of user selectable icons 61 is displayed the controller 4 would control the apparatus 1 to cause the associated function to be performed.
  • In Fig. 4B the user actuates the first designated portion 55 of the touch sensitive display 15 by touching this portion with their finger 63. This causes the controller 4 to configure the touch sensitive display 15 into the second mode of operation as described above.
  • In Fig. 4C the user maintains the actuation of the first designated portion 55 of the touch sensitive display 15 with the first finger 63 and simultaneously actuates the area of the touch sensitive display 15 in which a user selectable icon 61 is displayed with a second finger 67.
  • A non-visual output is then provided; this may be an audio output and/or a tactile output.
  • The content of the non-visual output may be selected by a user to provide them with information which enables them to easily determine the function associated with the user selectable icon 61.
  • In some embodiments a visual output may be provided in addition to the non-visual output.
  • The user may actuate a plurality of different areas of the touch sensitive display 15; for example, they may hold their first finger 63 fixed in the upper left hand corner of the touch sensitive display 15 and move their second finger 67 across the touch sensitive display 15. This may cause a plurality of different non-visual outputs to be provided as a plurality of different areas of the touch sensitive display 15 are actuated.
  • In Fig. 4D the user ends the actuation of the first designated area 55 by lifting their finger off the surface of the touch sensitive display 15. In the embodiment of the disclosure illustrated in Figs. 4A to 4D this causes the touch sensitive display 15 to be switched back to the first, normal mode of operation.
  • The user may also configure the touch sensitive display 15 into the second mode of operation by actuating the second designated portion 59.
  • The second designated portion 59 may operate in an identical manner to the first designated portion 55, so that in order to maintain the touch sensitive display 15 in the second mode of operation the user has to keep their finger in contact with the designated portion 55, 59. This may provide the advantage that the user could actuate either designated portion 55, 59; it may make it easier for the user to find a designated portion 55, 59, and they could use whichever designated portion 55, 59 is most convenient for them. The user could then control the apparatus to cause a function to be performed by ending the actuation of the designated portion 55, 59 whilst maintaining the actuation of the area in which the user selectable icon 61 is displayed.
  • In other embodiments the two designated portions 55, 59 may have different modes of actuation.
  • For example, the first designated portion 55 may be actuated by touching the first designated portion 55 and maintaining contact with the surface of the touch sensitive display 15 while simultaneously actuating the area of the touch sensitive display 15 in which the plurality of user selectable icons 61 is displayed.
  • The second designated portion 59 may be actuated by touching the second designated portion 59 and then subsequently actuating the area of the touch sensitive display 15 in which the plurality of user selectable icons 61 is displayed, without having to maintain the simultaneous actuation of the second designated portion 59. This may enable a user to end the actuation of the designated portion 59 without causing the function to be performed. This may make the apparatus 1 easier for a user to use with one hand.
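One possible way of combining the two actuation behaviours just described is sketched below in Kotlin, reusing the hypothetical TouchDispatcher, Mode, UiItem and DesignatedPortion types from the earlier examples. The controller class and its method names are assumptions introduced for this sketch.

```kotlin
class ExploreModeController(private val dispatcher: TouchDispatcher) {
    private var lastExploredItem: UiItem? = null
    private var latched = false   // true while portion 59 keeps the second mode active

    // Portion 55 is held for the duration of the exploration; portion 59
    // latches the second mode with a single touch.
    fun onPortionPressed(portion: DesignatedPortion) {
        dispatcher.mode = Mode.SECOND
        latched = (portion == DesignatedPortion.TOP_RIGHT)
    }

    // While in the second mode, touching an item only announces it.
    fun onItemExplored(item: UiItem) {
        lastExploredItem = item
        dispatcher.onItemActuated(item)
    }

    // Releasing the held portion 55 while still touching an item performs that
    // item's function and returns to the first mode; a latched portion 59 is
    // unaffected by the release and stays in the second mode.
    fun onPortionReleased(stillTouchingItem: Boolean) {
        if (latched) return
        dispatcher.mode = Mode.FIRST
        if (stillTouchingItem) {
            lastExploredItem?.action?.invoke()
        }
        lastExploredItem = null
    }

    // A further user input ends the latched second mode without performing anything.
    fun endSecondMode() {
        latched = false
        dispatcher.mode = Mode.FIRST
        lastExploredItem = null
    }
}
```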
  • The plurality of user selectable icons 61 may be dynamic, that is, their position on the touch sensitive display 15 is not fixed and they may be moved. In some embodiments the plurality of user selectable icons 61 may be moved in response to an input by a user; for example, a user may make an input which causes the plurality of user selectable icons 61 to be scrolled through. The scrolling may continue until another user input or interrupt is detected. In other embodiments the movement of the plurality of user selectable icons 61 on the touch sensitive display 15 may be automatic, without any direct input from the user of the apparatus 1.
  • For example, the plurality of user selectable icons 61 may comprise items within a list of received messages such as emails or notifications from a social networking site.
  • The list of received messages or notifications may be automatically refreshed whenever a new message or notification is received. This may cause the items within the list to be moved on the display 15. When a user selects one of the items in the list this may enable the function of opening the message or notification or accessing information relating to the message or notification.
  • Configuring the apparatus 1 into the second mode of operation may cause the dynamic items of the touch sensitive display 15 to be temporarily fixed so that they remain in the same position on the touch sensitive display 15. For example, if a user is scrolling through the plurality of user selectable icons 61 and then actuates one of the designated portions 55, 57, 59, this may cause the scrolling to be suspended while the apparatus 1 is in the second mode of operation. The scrolling may be resumed once the apparatus 1 is configured out of the second mode of operation.
  • Where the plurality of user selectable icons 61 comprises items within a list of received messages, such as emails or notifications from a social networking site, actuating one of the designated portions 55, 57, 59 and enabling the second mode of operation may cause the automatic refreshing of the list to be temporarily disabled. This may enable a user to obtain non-visual indications of new notifications and messages which have been received.
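A minimal sketch of suspending automatic list refreshes while the second mode is active, so that items stay where the user last felt them, is shown below. The MessageListModel class and the frozen flag are illustrative assumptions; entering the second mode would set frozen to true and leaving it would set frozen back to false.

```kotlin
class MessageListModel {
    private val visible = mutableListOf<String>()
    private val pending = mutableListOf<String>()   // arrivals deferred while frozen

    // Set to true on entering the second mode, false on leaving it.
    var frozen: Boolean = false
        set(value) {
            field = value
            if (!value) flushPending()
        }

    fun onMessageReceived(summary: String) {
        if (frozen) pending += summary else visible.add(0, summary)
    }

    fun currentItems(): List<String> = visible.toList()

    private fun flushPending() {
        // Apply the deferred messages in arrival order once the freeze ends.
        for (summary in pending) visible.add(0, summary)
        pending.clear()
    }
}
```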
  • The non-visual indications of new notifications and messages which have been received may comprise an indication of the type of message or notification.
  • For example, different non-visual outputs may be provided for different types of messages, such as emails, SMS or MMS messages, instant messages or notifications from social networking sites.
  • The non-visual indication may include further details such as the sender of the message or notification and the time and date at which the message or notification was received.
  • In some embodiments the non-visual indication may comprise a non-visual output of the whole received message or notification; for example, the text of the message may be converted into an audio output which may be provided in response to the user input.
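As an illustration of composing such an indication, the sketch below builds the text that could be handed to the audio output: message type, sender and time, optionally followed by the full text. The Message type and field names are hypothetical, introduced only for this example.

```kotlin
data class Message(
    val kind: String,        // e.g. "email", "SMS message", "social networking notification"
    val sender: String,
    val receivedAt: String,  // pre-formatted, e.g. "14:05 on 11 November"
    val body: String
)

// Builds the spoken summary: type, sender and time, and optionally the message text.
fun spokenSummary(message: Message, includeBody: Boolean): String {
    val header = "${message.kind} from ${message.sender}, received ${message.receivedAt}"
    return if (includeBody) "$header. ${message.body}" else header
}
```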
  • In some embodiments the apparatus 1 may be configured in a locked mode of operation.
  • In the locked mode of operation the touch sensitive display 15 may be configured so that it is not responsive to user inputs, so that actuating an area of the touch sensitive display 15 in which a user selectable icon 61 is displayed would not enable the function associated with the user selectable icon to be performed.
  • When the apparatus 1 is in the locked mode of operation the user may still be able to make a user input to cause the apparatus 1 to be configured in the second mode of operation and enable the non-visual outputs to be provided.
  • For example, the designated portions 55, 57, 59 of the touch sensitive display 15 may still be responsive to a user input to enable the apparatus 1 to be configured out of the locked mode of operation and into the second mode of operation. In some embodiments this may enable the user to control the apparatus 1 to provide a non-visual output indicative of a received message or notification without having to configure the apparatus 1 into the unlocked mode of operation.
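A sketch of routing touches in such a locked mode, reusing the hypothetical types from the earlier examples, is given below. LockState, TouchRouter and the routing logic are assumptions made for illustration rather than claim language.

```kotlin
enum class LockState { UNLOCKED, LOCKED }

class TouchRouter(
    private val exploreController: ExploreModeController,
    private val dispatcher: TouchDispatcher
) {
    var lockState: LockState = LockState.UNLOCKED

    // portion is non-null when the touch landed on a designated portion,
    // item is non-null when it landed on a user selectable icon.
    fun onTouch(portion: DesignatedPortion?, item: UiItem?) {
        when {
            // The designated portions remain responsive even when locked,
            // so non-visual indications can be obtained without unlocking.
            portion != null -> exploreController.onPortionPressed(portion)
            lockState == LockState.LOCKED -> Unit   // ignore ordinary icon touches
            item != null -> dispatcher.onItemActuated(item)
        }
    }
}
```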
  • Embodiments of the disclosure provide an apparatus 1 with a touch screen display 15 which can be configured to enable the apparatus 1 to be used without looking at the touch screen display 15.
  • Configuring the apparatus 1 into a mode of operation in which a non-visual output is provided but a function is not performed enables a user to easily find the user interface items 53 they need without having to look at the apparatus 1. This could be useful for users who are performing other tasks, such as driving or walking, whilst using the apparatus 1. It may also be useful for visually impaired users or users who may have difficulty viewing certain user interface items 53. Embodiments of the disclosure also enable an apparatus to be easily switched between the respective modes of operation by user inputs such as actuating designated areas of the touch sensitive display or voice inputs. These inputs may be inputs which a user can make easily and accurately without looking at the apparatus 1 or the touch sensitive display 15.
  • In some embodiments every user selectable icon 61 may be associated with a non-visual output, so that actuation of any of the respective areas in which the user selectable icons 61 are displayed causes a non-visual output to be provided. This may be particularly advantageous where the user is visually impaired and may need assistance in finding and distinguishing between the respective user interface items 53.
  • In other embodiments only a subset of the plurality of user selectable icons 61 may be associated with a non-visual output. The user may be able to select which user selectable icons are to be associated with a non-visual output.
  • This may be particularly advantageous where the user intends to use the apparatus while performing other tasks. For example, if the user is using the apparatus as a navigation system or music player whilst driving, they might find it useful to be able to find the user interface icons associated with these functions but might not be interested in the user interface icons associated with, for example, viewing captured or stored images, as these functions would be unlikely to be used while they are not able to look at the apparatus. This may make it easier for the user to find the user selectable items 61 they need.
  • The blocks illustrated in Figs. 3A and 3B may represent steps in a method and/or sections of code in the computer program 9.
  • The illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks, and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some blocks to be omitted.
  • In the embodiments illustrated in Figs. 4A to 4D the designated portions 55, 59 are provided in the upper left hand and upper right hand corners respectively.
  • In other embodiments the designated portion may comprise any corner of the touch sensitive display. This may provide the advantage that the user of the apparatus does not need to know the orientation of the apparatus in order to find a designated portion of the touch sensitive display. This may be particularly beneficial if the user is visually impaired.
  • The apparatus 1 may also be configured to provide a non-visual output for items which do not have a specific function associated with them.
  • For example, the display 15 may comprise icons indicative of the status of the apparatus 1, for example the battery power level of the apparatus 1 or the signal strength available to the apparatus 1.
  • The apparatus may be configured to provide a non-visual output relating to these status indicators as well as the user selectable items.

Abstract

A method, apparatus, computer program and user interface are disclosed, the method comprising: providing a plurality of user interface items on a portion of a touch sensitive display wherein, in a first mode of operation, actuation of an area of the touch sensitive display in which a user interface item is displayed causes a function associated with the user interface item to be performed; and configuring the touch sensitive display into a second mode of operation wherein, in the second mode of operation, actuation of an area of the touch sensitive display in which a user interface item is displayed causes a non-visual output indicative of the function associated with a user interface item to be provided without enabling the function.
PCT/IB2011/055048 2011-11-11 2011-11-11 Method, apparatus, computer program and user interface WO2013068793A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2011/055048 WO2013068793A1 (fr) 2011-11-11 2011-11-11 Method, apparatus, computer program and user interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2011/055048 WO2013068793A1 (fr) 2011-11-11 2011-11-11 Method, apparatus, computer program and user interface

Publications (1)

Publication Number Publication Date
WO2013068793A1 (fr) 2013-05-16

Family

ID=48288602

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2011/055048 WO2013068793A1 (fr) 2011-11-11 2011-11-11 Method, apparatus, computer program and user interface

Country Status (1)

Country Link
WO (1) WO2013068793A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6532005B1 (en) * 1999-06-17 2003-03-11 Denso Corporation Audio positioning mechanism for a display
US20030234824A1 (en) * 2002-06-24 2003-12-25 Xerox Corporation System for audible feedback for touch screen displays
US20060046031A1 (en) * 2002-12-04 2006-03-02 Koninklijke Philips Electronics N.V. Graphic user interface having touch detectability
US20090167701A1 (en) * 2007-12-28 2009-07-02 Nokia Corporation Audio and tactile feedback based on visual environment
US20100079410A1 (en) * 2008-09-30 2010-04-01 Sony Ericsson Mobile Communications Ab Three-dimensional touch interface
WO2010116028A2 (fr) * 2009-04-06 2010-10-14 Aalto-Korkeakoulusäätiö Procédé servant à commander un appareil
GB2470418A (en) * 2009-05-22 2010-11-24 Nec Corp Haptic information delivery
US20110113328A1 (en) * 2009-11-12 2011-05-12 International Business Machines Corporation System and method to provide access for blind users on kiosks
US20110138284A1 (en) * 2009-12-03 2011-06-09 Microsoft Corporation Three-state touch input system


Similar Documents

Publication Publication Date Title
US11907013B2 (en) Continuity of applications across devices
US11675476B2 (en) User interfaces for widgets
US11477609B2 (en) User interfaces for location-related communications
US11481094B2 (en) User interfaces for location-related communications
EP2849047B1 (fr) Méthode et logiciel pour la facilitation de la sélection des objets dans un appareil électronique
US9257098B2 (en) Apparatus and methods for displaying second content in response to user inputs
EP2590049B1 (fr) Interface d'utilisateur
US20100088654A1 (en) Electronic device having a state aware touchscreen
KR20150031010A (ko) 잠금 화면 제공 장치 및 방법
EP3241100B1 (fr) Appareil et procédé de traitement des notifications sur un dispositif informatique mobile
US20230195237A1 (en) Navigating user interfaces using hand gestures
EP2564288B1 (fr) Appareil, procédé, programme informatique et interface utilisateur
US11438452B1 (en) Propagating context information in a privacy preserving manner
EP2564290B1 (fr) Appareil, procédé, programme informatique et interface utilisateur
KR102142699B1 (ko) 어플리케이션 운용 방법 및 그 전자 장치
JP5943856B2 (ja) 多面的なグラフィック・オブジェクトを有する携帯端末及び表示切替方法
WO2013068793A1 (fr) Method, apparatus, computer program and user interface
RU2607611C2 (ru) Пользовательский интерфейс
KR20120094846A (ko) 전자 디바이스 및 이의 제어 방법
EP2629170A1 (fr) Dispositif électronique et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11875380

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11875380

Country of ref document: EP

Kind code of ref document: A1