WO2010116028A2 - Method for controlling an apparatus - Google Patents

Method for controlling an apparatus

Info

Publication number
WO2010116028A2
WO2010116028A2 (PCT/FI2010/050252)
Authority
WO
WIPO (PCT)
Prior art keywords
menu
item
relating
sector
information relating
Prior art date
Application number
PCT/FI2010/050252
Other languages
English (en)
French (fr)
Other versions
WO2010116028A3 (en)
Inventor
Raine Kajastila
Tapio Lokki
Original Assignee
Aalto-Korkeakoulusäätiö
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aalto-Korkeakoulusäätiö filed Critical Aalto-Korkeakoulusäätiö
Publication of WO2010116028A2 publication Critical patent/WO2010116028A2/en
Publication of WO2010116028A3 publication Critical patent/WO2010116028A3/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 - Sound input; Sound output
    • G06F3/167 - Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 - Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023 - Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233 - Character input methods
    • G06F3/0236 - Character input methods using selection techniques to select from displayed items
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 - Interaction with lists of selectable items, e.g. menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the invention relates to a method for controlling an apparatus by a surface of the apparatus.
  • the invention also relates to an apparatus, which is controlled by a surface of the apparatus.
  • the invention relates to a computer program product for controlling the apparatus by the surface of the apparatus.
  • the invention relates to a carrier medium comprising the computer program product for controlling the apparatus by the surface of the apparatus.
  • In one prior art arrangement, a user controls his/her mobile device through a keypad, and the device utilizes a three dimensional audio system in order to produce a response to the user.
  • the three dimensional audio system audibly indicates keypad strokes as they are displayed within the display of the device by varying pitch, tone, and/or volume. For example, a sound indicating one key has a first pitch, another pressed key has a second pitch differing from the first pitch, a third pressed key has a third pitch differing from the first and second pitches, and so on. Consequently, when a certain pitch is associated with a certain key, the user can audibly detect a pressed key.
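As a toy illustration of that prior-art idea (the semitone mapping and the base frequency below are arbitrary choices for illustration, not taken from the cited arrangement):

```python
# Toy sketch of the prior-art idea above: give every key its own pitch so
# a listener can tell pressed keys apart by ear. The semitone spacing and
# the A4 base frequency are arbitrary illustrative choices.
A4_HZ = 440.0
SEMITONE = 2 ** (1 / 12)

def key_pitch(key_index: int) -> float:
    """Frequency in Hz for the key at key_index, one semitone per key."""
    return A4_HZ * SEMITONE ** key_index

for key in range(3):
    print(f"key {key}: {key_pitch(key):.1f} Hz")  # 440.0, 466.2, 493.9
```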
  • In another prior art arrangement, the user controls the device through a rotating ball displayed on a touch panel. The ball is rotated by touching the touch panel, and the rotation of the ball (its direction and rotational speed) is represented by means of sounds.
  • One object of the invention is to provide a method for controlling an apparatus without visual feedback.
  • the object of the invention is fulfilled by providing a method, wherein information relating to an interacted location on a surface of an apparatus is obtained, an item corresponding to the information relating to the interacted location on the surface of the apparatus is obtained, and an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus is provided.
  • the object of the invention is also fulfilled by providing an apparatus, which is configured to obtain information relating to an interacted location on a surface of the apparatus, obtain an item corresponding to the information relating to the interacted location on the surface of the apparatus, and provide an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
  • the object of the invention is also fulfilled by providing a computer program product, which, when the computer program product is run in a computer, obtains information relating to an interacted location on a surface of an apparatus, obtains an item corresponding to the information relating to the interacted location on the surface of the apparatus, and provides an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
  • a carrier medium which comprises a computer program product, which, when the computer program product is run in a computer, obtains information relating to an interacted location on a surface of an apparatus, obtains an item corresponding to the information relating to the interacted location on the surface of the apparatus, and provides an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
  • A mobile device is controlled by a user who moves his/her finger(s) on the touch surface of the mobile device in different directions for browsing a circular ego-centric auditory menu comprising one or more auditory menu items.
  • Targeted auditory menu items are indicated with speech or other sounds and reproduced from the corresponding directions with three dimensional audio.
  • The synthesised targeted auditory menu items are transmitted from the mobile device, which possibly does not comprise a visual display at all, to the user's headphones so that a three dimensional auditory space is established around the user.
  • An embodiment of the present invention relates to a method according to independent claim 1.
  • an embodiment of the present invention relates to an apparatus according to independent claim 13.
  • an embodiment of the present invention relates to a computer program product according to independent claim 14.
  • an embodiment of the present invention relates to a carrier medium according to independent claim 15.
  • A method comprises obtaining information relating to an interacted location on a surface of an apparatus, obtaining an item corresponding to the information relating to the interacted location on the surface of the apparatus, and providing an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
  • Interacted location refers to a certain location on the surface of the apparatus with which the user of the apparatus interacts somehow, e.g. by touching it with a finger, hand, stylus, ordinary pen, or stick.
  • Surface of an apparatus refers to a surface (component) capable of detecting the user's interaction directed at the surface of the apparatus, e.g. resistively, by pressure, capacitively, or optically.
  • Such a surface can be e.g. a visual display, auditory display, or tactile display.
  • Apparatus refers to e.g. a mobile station, laptop, computer, digital video disc (DVD) device, set-top box, video recorder, sound reproduction equipment, household apparatus such as a microwave oven, car stereo, or navigator.
  • Item refers to e.g. an alphabetic character, a number, a function icon (which enables the user to e.g. dial, send a short message service (SMS) message, browse a phonebook or playlist, control playback, or surf radio stations), a menu, or a link.
  • the item is illustrated by means of the surface of the apparatus e.g. visually through the visual display, audibly through the auditory display, or by vibrations through the tactile display.
  • The method further comprises providing, by the surface of the apparatus, a circular menu comprising sectors, where each sector indicates one item.
  • The form of the menu, which the apparatus illustrates visually and/or audibly to a user, can also be e.g. an ellipse or a square.
  • the method which is disclosed in any of the previous embodiments, wherein a centre of the circular menu and the points of the sectors are located at the centre of the surface of the apparatus.
  • The centre of the menu, regardless of whether the menu is circular, elliptical, or square, can be placed freely as long as the menu completely fits on the surface of the apparatus.
  • The method further comprises receiving the information relating to the interacted location on the surface of the apparatus by a touch surface illustrating the circular menu comprising the sectors.
  • The touch surface can be e.g. a touch screen or a mere interaction surface, which is capable of receiving the user's contact by e.g. a finger, hand, or other suitable instrument.
  • the method comprises determining a sector of the menu, which sector determines the item, by the information relating to the interacted location on the surface of the apparatus.
  • the interaction location information defines the menu sector indicating the targeted item.
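Expressed as code, this lookup is a small amount of trigonometry. A minimal sketch (all names are hypothetical; it assumes screen-style coordinates with y growing downwards, the menu centred on the surface, equally sized sectors, and sector 0 starting at twelve o'clock):

```python
import math

def sector_for_touch(x, y, cx, cy, n_sectors):
    """Map a touch point (x, y) to the index of an equally divided sector
    of a circular menu centred at (cx, cy). Sector 0 starts at twelve
    o'clock and indices grow clockwise; y grows downwards as on screens."""
    angle = math.atan2(x - cx, cy - y) % (2 * math.pi)  # clockwise from top
    return int(angle / (2 * math.pi / n_sectors))

# Usage: a 26-letter menu on a 100x100 touch surface; a touch in the
# upper-right quadrant resolves to an early letter of the alphabet.
letters = [chr(ord("A") + i) for i in range(26)]
print(letters[sector_for_touch(80, 20, 50, 50, 26)])  # -> "D"
```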
  • The method further comprises expanding the sector determining the item and neighbouring sectors of the menu, and tapering other sectors of the menu on the surface of the apparatus. After the user has "activated" the target item (the interacted sector) by touching, the interacted sector and possibly its neighbouring sectors expand and the other sectors shrink in the menu.
  • The method which is disclosed in any of the previous embodiments, further comprises receiving selection information relating to the item determined by the expanded sector. The user selects the "activated" item by touching again with e.g. a finger, hand, stylus, ordinary pen, or stick, or by raising a finger from the touch surface.
  • the method which is disclosed in any of the previous embodiments, further comprises performing a function relating to the selected item determined by the expanded sector. So, when the user has selected the item, the function, which the selected item indicates, is executed in the apparatus.
  • the method further comprises providing an audio feedback, tactile feedback, and/or visual feedback relating to the function, which relates to the selected item determined by the expanded sector.
  • The audio feedback to the user is provided e.g. by the loudspeaker of the apparatus, or by headphones or a loudspeaker system connected to the apparatus.
  • The visual feedback, for one, is provided e.g. through the visual display of the apparatus or a display connected to the apparatus.
  • The tactile feedback indicating the made selection is established e.g. through the apparatus or another device capable of providing a tactile interaction, which is connected to the apparatus.
  • The connection between the apparatus and another device establishing the feedback can be e.g. a cable connection, a wireless connection, or a combination of the two.
  • the method which is disclosed in any of the previous embodiments, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by using a three dimensional sound, which indicates the location of the item in the menu of the surface of the apparatus.
  • The three dimensional sound feedback through the headphones or loudspeaker system is established so that the targeted or selected items are reproduced from their correct directions in a three dimensional audio space. It is, of course, possible to use mono or stereo sound instead of the three dimensional sound in order to provide the feedback to the user.
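The spatialisation technique is left open above (three dimensional, stereo, or mono). As a hedged illustration, plain stereo can approximate an item's direction with constant-power panning; a real three dimensional implementation would use e.g. HRTF filtering instead, and the function below is an assumption for illustration only:

```python
import math

def stereo_gains(azimuth_deg: float) -> tuple[float, float]:
    """Constant-power stereo pan approximating a source direction.
    azimuth_deg: 0 is straight ahead, +90 fully right, -90 fully left.
    Returns (left_gain, right_gain); the gains satisfy L**2 + R**2 == 1."""
    az = max(-90.0, min(90.0, azimuth_deg))   # clamp to the frontal arc
    pan = math.radians(az + 90.0) / 2.0       # 0 .. pi/2
    return math.cos(pan), math.sin(pan)

left, right = stereo_gains(45.0)              # item to the front right
print(f"L={left:.2f} R={right:.2f}")          # L=0.38 R=0.92
```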
  • the method which is disclosed in any of the previous embodiments, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by speech.
  • When the activation or selection of the item has been established, information describing the activated item and/or selected item, or mere selection information, is synthesised to speech or other sounds, such as an earcon, a spearcon, an auditory icon, or mixed speech, and reproduced to the user.
  • the method which is disclosed in any of the previous embodiments, wherein items are located around the circular menu in alphabetical or numerical order.
  • Item locations in the circular menu can be defined by the alphabetical or numerical order, where menu items are located around the circular menu depending on their first (second, third, etc.) letter.
  • Thus the menu items are always found at previously known locations.
  • The items are not necessarily in strict alphabetical or numerical order, where the first or any other letter or a number defines the location of the item on the surface; instead, they can be placed at certain locations on the surface depending on the first or any other letter or number.
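A sketch of such letter-based placement, under the assumption that the 26 letters are spread evenly around the full circle with 'A' straight ahead (one possible convention, not mandated by the text; the function name is hypothetical):

```python
import math
import string

def azimuth_for(label: str, letter_index: int = 0) -> float:
    """Direction (degrees clockwise from straight ahead) for a menu item,
    fixed by its first letter by default, or by any other letter via
    letter_index, with the alphabet spread evenly around the circle."""
    alphabet = string.ascii_uppercase
    pos = alphabet.index(label[letter_index].upper())
    return 360.0 * pos / len(alphabet)

print(azimuth_for("Aaron"))                  # 0.0   -> straight ahead
print(azimuth_for("Nina"))                   # 180.0 -> behind the user
print(azimuth_for("Amber", letter_index=1))  # placed by its second letter
```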
  • an apparatus is configured to obtain information relating to an interacted location on a surface of the apparatus, obtain an item corresponding to the information relating to the interacted location on the surface of the apparatus, and provide an audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus.
  • the apparatus which is disclosed in the previous embodiment, is configured to provide, by the surface of the apparatus, a circular menu comprising sectors, where each sector indicates one item.
  • the apparatus which is disclosed in any of the previous embodiments, wherein a centre of the circular menu and the points of the sectors are located at the centre of the surface of the apparatus.
  • the apparatus which is disclosed in any of the previous embodiments, is further configured to receive the information relating to the interacted location on the surface of the apparatus by a touch surface illustrating the circular menu comprising the sectors.
  • the apparatus which is disclosed in any of the previous embodiments, is configured to determine a sector of the menu, which sector determines the item, by the information relating to the interacted location on the surface of the apparatus.
  • the apparatus which is disclosed in any of the previous embodiments, is further configured to expand the sector determining the item and neighbouring sectors of the menu, and taper other sectors of the menu on the surface of the apparatus.
  • the apparatus which is disclosed in any of the previous embodiments, is further configured to receive selection information relating to the item determined by the expanded sector.
  • the apparatus which is disclosed in any of the previous embodiments, is further configured to perform a function relating to the selected item determined by the expanded sector.
  • the apparatus which is disclosed in any of the previous embodiments, is further configured to provide an audio feedback, tactile feedback, and/or visual feedback relating to the function, which relates to the selected item determined by the expanded sector.
  • the apparatus which is disclosed in any of the previous embodiments, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by using a three dimensional sound, which indicates the location of the item in the menu of the surface of the apparatus.
  • the apparatus which is disclosed in any of the previous embodiments, wherein the audio feedback relating to the item corresponding to the information relating to the interacted location on the surface of the apparatus and/or the audio feedback relating to the function, which relates to the selected item determined by the expanded sector, are provided by speech.
  • the apparatus which is disclosed in any of the previous embodiments, wherein items are located around the circular menu in alphabetical or numerical order.
  • The method according to embodiments of the invention, which utilises touch surface input with auditory menus, allows a reasonably fast and accurate way to navigate circular auditory menus with a large number of menu items.
  • When a user is browsing the menu items, they are read out loud and the sound is reproduced from the correct direction.
  • The fast browsing is enabled with a reactive interruptible audio design, and the accuracy in selection is enhanced with the dynamic movement of the menu items and the expansion of the selection area.
  • The method according to embodiments of the invention can be used to handle all the basic controls of a modern mobile phone or music player that contains a touch surface. It is also possible to construct a small multi-functional device consisting only of a touch surface without a screen. Such robust devices without a visual display can be inexpensive and have low energy consumption, but still offer the same functionalities as similar devices with a visual screen.
  • Figure 1 illustrates an exemplary view of an arrangement comprising a mobile device and an auditory user interface according to an advantageous embodiment of the invention
  • Figure 2 illustrates an exemplary view of an arrangement comprising a mobile device and a remote controlled device according to an advantageous embodiment of the invention
  • Figure 3 illustrates an exemplary flowchart of a method according to an advantageous embodiment of the invention
  • Figure 4 illustrates exemplary views of auditory menus according to an advantageous embodiment of the invention
  • Figure 5 illustrates further exemplary views of auditory menus according to an advantageous embodiment of the invention
  • Figure 6 illustrates an exemplary view of a method according to an advantageous embodiment of the invention
  • Figure 7 illustrates an exemplary view of an apparatus according to an advantageous embodiment of the invention.
  • Figure 8 illustrates an exemplary view of a remote device to be controlled by a method according to an advantageous embodiment of the invention.
  • Figure 1 illustrates an arrangement 100, wherein a mobile device 110, e.g. a mobile station or PDA, comprises a user interface 120 such as a touch surface or touch screen implemented in the mobile device 110 for receiving control commands from a user (not shown in the figure).
  • the processor of the mobile device runs a program application, e.g. an auditory user interface application, in the mobile device 110 in order to provide the auditory user interface, which comprises one or more auditory items 130a-130e to the user.
  • The auditory user interface enables the mobile device 110 to comprise only a touch surface, whereupon it does not have a display, or possibly a touch screen including the display, but no keyboard.
  • The mobile device 110 provides an auditory space around the user by means of a mobile device loudspeaker (not shown), headphones 140 (or one earphone), which have a wired or wireless connection 150 to the mobile device 110, or a loudspeaker system (not shown), wherein at least one loudspeaker is established around the user.
  • the loudspeaker system is also connected to the mobile device 110 through a wireless or wired connection.
  • An auditory object 130a illustrates an icon which provides access to e.g. a phonebook.
  • When the mobile device 110 comprises the display, the user has better knowledge of the location of the icon.
  • The user interface application provides auditory feedback through the mobile device loudspeaker, headphones 140, one earphone, or loudspeaker system when the user targets the desired auditory icon 130a.
  • The feedback can be provided by using a three dimensional sound, whereupon the feedback comes from the direction in which the targeted (activated) icon is located.
  • It is also possible to use mono or stereo sound instead of the three dimensional sound for providing the feedback to the user.
  • The feedback notifies the user that he/she can access the phonebook, which the targeted icon 130a indicates, by selecting the icon 130a.
  • The user raises his/her finger from the touch surface 120, where the finger rests after the icon activation, presses a button 160 or other suitable key of the mobile device, or provides the selection in some other way and by other means in order to access the phonebook.
  • the application provides another audio feedback, which describes that the selection is made, by means of e.g. a fast replay.
  • The user can browse the phonebook to select the name of the person whom he/she wants to call, and make the call by pointing at auditory objects 130a-130e, which enable calling, in the auditory menu.
  • The user can also perform e.g. SMS message sending, playlist browsing, playback controlling, and radio station surfing.
  • Figure 2 represents an arrangement 200, wherein a mobile device 210, which comprises a touch surface or touch screen 220 in order to receive control commands from a user 230, communicates with a device 240, such as a desktop computer having a display 250, keyboard 260, and loudspeakers 270a, 270b, by using a wireless Bluetooth or infrared connection 280.
  • the computer 240 establishes an auditory user interface, which has auditory items such as auditory menus and auditory icons, around the user 230 by the loudspeakers 270a, 270b.
  • the computer 240 can run a graphical user interface comprising e.g. graphical menus and graphical icons displayed on the display 250.
  • When the user 230 wants to control a jukebox application run by the computer 240 in order to select the next song to be played from a song list, which includes hundreds of songs, he/she points at the touch surface 220 in order to target a song icon indicating a desired song in the auditory user interface of the jukebox application: if the desired song icon is located in the top left corner of the display 250 and/or the front left corner of the auditory user interface, the user 230 touches, e.g. with his/her finger(s), the top left corner of the touch surface 220. By pointing at the song icon the user 230 activates it, and the activation is indicated to the user 230 audibly by reproducing a sound through the loudspeakers 270a, 270b.
  • The user 230 raises his/her finger from the touch surface 220, where the finger rests after the activation, or presses a button 290 belonging to the mobile device 210 in order to play (select) the desired song. Then the loudspeakers 270a, 270b reproduce a sound indicating the selection, and the desired song if it is next on the playlist.
  • the mobile device 210 comprising the touch surface 220 for enabling a song selection can also be utilised as the remote controller of the television (not shown), which visually illustrates a circular menu on the display of the television.
  • the mobile device 210 can be used for e.g. channel selecting or volume level adjusting.
  • A computer program, e.g. the above-mentioned jukebox application or a game application, which the computer 240 runs, is displayed on the touch screen 220 of the mobile device 210, and sounds relating to the application are reproduced through the loudspeaker of the mobile device (not shown) or headphones (not shown) connected to the mobile device 210.
  • The connection between the mobile device 210 and the headphones is either a wired or a wireless connection.
  • the jukebox or game application is displayed on the computer display 250 and sounds are reproduced through the mobile device loudspeaker or headphones connected to the mobile device 210.
  • Figure 3 discloses, by means of an example only, a flow chart describing a method 300 according to one embodiment of the invention.
  • First, a mobile device and/or an application, such as a user interface executing the method, is turned on, and necessary stages, such as connection set-up for e.g. external headphones and initialisation of different parameters relating to an auditory user interface, are performed.
  • A circular menu is presented to a user in the mobile device by means of the touch surface of the mobile device in step 320.
  • the circular menu comprising items can be displayed to the user visually through a mobile device display, e.g. a touch screen, and/or audibly through a circular auditory menu comprising auditory items provided by the mobile device and the headphones.
  • A mobile device user touches a certain location on the touch surface (touch screen) of the mobile device with e.g. his/her finger in order to activate a menu item from the circular menu.
  • The interacted location information, which comprises e.g. the x and y coordinates of the finger touch or information relating to the interacted sector of the circular menu, is obtained resistively, capacitively, or by any other means in step 330.
  • In step 340 the menu item is obtained on the basis of the obtained interacted location information: the menu item is defined directly by means of the interacted sector information, or by the coordinate information specifying the sector, which then defines the menu item.
  • the menu item activation is indicated to the user by an audio feedback relating to the activated menu item through the headphones.
  • the feedback can be established by using a three dimensional sound so that the feedback relating to the activated menu item is reproduced from the correct direction of the activated item.
  • Mono or stereo sound can also be used instead of the three dimensional sound in order to provide the feedback to the user.
  • The information describing the activated item is synthesised to speech or other sounds and reproduced to the user.
  • recorded samples can be used instead of the synthesised speech.
  • the sector determining the activated item and its neighbouring sectors expand and the other sectors of the circular menu taper on the surface of the apparatus and in the auditory space around the user for helping the user to target his/her selection.
  • In step 360 it is determined whether the mobile device receives, through the touch surface, selection information relating to the activated item.
  • the selection information is indicated e.g. by raising the finger from the touch surface of the mobile device.
  • If not, the method returns to step 320.
  • Once the selection is established, the mobile device performs the function, which the activated menu item indicates, in step 370.
  • Such a function can be e.g. making a call, placing the call on hold, or disconnecting the call.
  • the performed function or function to be performed is indicated to the user by an audio, visual, and/or tactile feedback, which relates to the selected menu item. This step is usually provided together with step 370.
  • The control method ends in step 390.
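Read as code, steps 320-390 collapse into a single touch-event loop. Everything below is hypothetical glue for illustration: the event records, the toy four-item menu, and the speak/perform callbacks stand in for the mobile device's real facilities:

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str            # "touch" or "lift"
    x: float = 0.0
    y: float = 0.0

def item_at(x: float, y: float) -> str:
    """Toy four-sector menu on a 100x100 surface (quadrant lookup)."""
    return ["calls", "messages", "radio", "music"][(x >= 50) + 2 * (y >= 50)]

def run_menu(events, speak=print, perform=lambda item: print("run:", item)):
    active = None
    for ev in events:
        if ev.kind == "touch":                 # step 330: location obtained
            item = item_at(ev.x, ev.y)         # step 340: item resolved
            if item != active:                 # step 350: audible feedback,
                active = item                  #   only when the target changes
                speak(active)
        elif ev.kind == "lift" and active:     # step 360: lifting selects
            perform(active)                    # step 370: run the function
            speak(active + " selected")        # step 380: confirm selection
            return                             # step 390: end

run_menu([Event("touch", 80, 20), Event("touch", 82, 22), Event("lift")])
```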
  • It is also possible to execute the method so that the mobile device obtains the interacted location information and transfers it to a remote device, e.g. a stereo set enabling the production of an auditory user interface around the user, which obtains the item on the basis of the obtained interacted location information and provides the audio feedback relating to the activated item to the user of the mobile device.
  • Such a remote device can be e.g. a stereo set or a computer having loudspeakers.
  • the mobile device obtains the item according to step 340 of the previous embodiment and transfers the item information to the stereo set so that it can reproduce the audio feedback indicating the activated item.
  • The mobile device determines whether the selection of the activated item is executed and, when it determines that the user selects the activated item, it transfers the selection information to the stereo set performing the function indicated by the activated and selected item.
  • The stereo set also provides the audio feedback relating to the selected item to the user.
  • the above described control methods utilise an auditory user interface.
  • The auditory menu contains e.g. a dial menu, which enables dialling, listening to, and removing selected numbers,
  • an SMS menu, which contains all letters and a "space" for writing and an icon for sending a written message, and
  • a phonebook menu, which comprises an alphabetically organised phonebook.
  • the phonebook can include multiple layers, e.g. alphabets and for each of them a submenu with names.
  • Menu browsing happens by e.g. circular finger sweeps on a touch surface, as mentioned earlier, amongst the menu items, which are spread evenly in the surrounding circular space. Depending on the number of items, the space between them is dynamically changed.
  • Figure 4 discloses a circular menu 410 and a square menu 420 according to an embodiment of the invention, wherein the menu centres and sector points are located at the centre of the surface.
  • The circular menu 410 and square menu 420 similarly comprise alphabetic character icons and a "space" icon.
  • The lower menus 410, 420 represent how the touch surfaces are divided into sectors having an equal point angle and how each sector comprises one icon. This figure shows one example of how the positions of the alphabetic characters can be implemented around the user in the static auditory menu.
  • the user can access any item directly by placing the finger on the touch surface and he/she can continue browsing with a circular finger sweep. Selection is made by removing the finger from the surface.
  • the centre of the touch surface is a safe area from where the finger can be lifted without making a selection.
  • other special actions can be assigned to the centre, corners, or any other specific location of the touch surface.
  • The menus 410, 420 can utilise the static placement of the menu items, which suits well for three dimensional menus where the items are always at known locations. Thus, each item can be accessed directly by touching a known location, and the auditory feedback of the targeted items reveals if the right item was activated; if not, the static order of the items reveals where the desired item will be found, since item locations are easy to remember with longer use of the menu.
  • The menu items are separate items that monitor if an "interaction" is in their defined sector. Targeted items send info to other items in order to manage the positions of the other items, fade their sound, and adjust their sector sizes.
  • Figure 5 illustrates a circular menu 510 and square menu 520, wherein sectors are dynamically adjusted for enhancing user's pointing accuracy.
  • The dynamic zoom of the target sector reduces undesired jumping between items and facilitates item selection with a bigger number of items.
  • The user touches the touch surface with his/her finger in order to activate an icon, which indicates the letter R, and when the apparatus (application) determines an interaction it expands the activated target sector R in both directions for enabling stable browsing and selecting.
  • The neighbouring sectors S, T, U, Q, P, and O also expand, and the other sectors, for one, regroup so that they taper.
  • The lower menus 510, 520 represent how part of the sectors expand and the other sectors regroup.
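One way to express this dynamic zoom is to give the active sector and its neighbours larger angular weights and renormalise so that the circle still closes. The zoom factors and neighbourhood size below are illustrative guesses, not values from the text:

```python
import math

def sector_widths(n, active, zoom=2.5, neighbour_zoom=1.5, reach=3):
    """Angular width (radians) of every sector after expanding the active
    sector and its nearest neighbours and tapering the rest; the widths
    always sum to a full circle."""
    def weight(i):
        d = min(abs(i - active), n - abs(i - active))  # circular distance
        return zoom if d == 0 else neighbour_zoom if d <= reach else 1.0
    weights = [weight(i) for i in range(n)]
    total = sum(weights)
    return [2 * math.pi * w / total for w in weights]

# 26 letters with 'R' (index 17) active: R's sector grows relative to the
# distant letters while the menu still spans exactly 2*pi radians.
widths = sector_widths(26, active=17)
print(round(sum(widths), 6) == round(2 * math.pi, 6))  # True
```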
  • The touch surface can also comprise a touch area in the middle of the touch surface and/or touchable areas around the sectors.
  • Figure 6 discloses a further enhancement for auditory menu layouts in order to achieve faster and better usability.
  • A start place is defined to always be the first name in alphabetical order.
  • The advanced spreading method can also be adapted to menus with multiple layers and to small menus with only a few menu items.
  • An example relating to browsing a small contact list is depicted in Figure 6.
  • a first menu level containing the alphabets can be enhanced with the advanced spreading to gain easier access to the menu items.
  • Names can be positioned so that they are always found according to a second letter. For example, Aaron is always positioned in front and Amber behind in an auditory menu. When the user touches the screen, the closest menu item becomes active and the rest are spread evenly around it. If the menu happens to have only four items, the series of four can be repeated when the user continues browsing the menu, as presented in the centre menu 620. The user can always return to the absolute positioning of the menu items by moving the finger to the centre of the surface (screen), as the menu 630 on the right shows.
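A sketch of this advanced-spreading behaviour under stated assumptions: the item whose static direction lies nearest to the touch becomes active, and the remaining items are redistributed evenly starting from the touch direction (angles in degrees, clockwise; the helper names are hypothetical):

```python
def respread(items, static_deg, touch_deg):
    """Activate the item whose static direction is closest to the touch
    and spread all items evenly starting from the touch direction.
    Returns (active_item, {item: new_direction_in_degrees})."""
    def gap(a, b):
        return abs((a - b + 180.0) % 360.0 - 180.0)   # circular distance
    active = min(range(len(items)),
                 key=lambda i: gap(static_deg[i], touch_deg))
    step = 360.0 / len(items)
    layout = {items[(active + k) % len(items)]: (touch_deg + k * step) % 360.0
              for k in range(len(items))}
    return items[active], layout

# Four contacts at the compass points; a touch at 10 degrees activates
# "Aaron" and re-spreads the other three evenly behind him.
print(respread(["Aaron", "Amber", "Ben", "Cara"], [0, 90, 180, 270], 10.0))
```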
  • Figure 7 discloses one example of a mobile device 700 according to an embodiment of the invention.
  • The mobile device 700 comprises a processor 710 for performing instructions and handling data, a memory unit 720 for storing data such as instructions and application data, and a user interface 730, which can comprise at least a touch surface or touch screen.
  • The user interface 730 can also comprise e.g. a single button, a keyboard, or other selection means.
  • the mobile device 700 comprises data transfer means 740 for transmitting and receiving data and a loudspeaker 750.
  • The mobile device can also comprise a display 760 for providing visual or tactile feedback, but this is not necessary.
  • The memory 720 stores at least an auditory user interface application 722, an application 724 for determining interacted location data, and a synthesizer application 726.
  • The implemented touch surface 730 obtains the interacted location data, which the processor 710 manipulates according to the instructions of the corresponding application 724, and the synthesizer 726 converts the obtained data from text format to speech, which is reproduced through the loudspeaker 750 e.g. by using three dimensional sound.
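As a stand-in for the synthesizer application 726, an off-the-shelf text-to-speech library can read item labels out loud. This is an assumption for illustration: the text names no speech engine, and the three dimensional reproduction would need a separate audio pipeline on top of this:

```python
# Minimal stand-in for synthesizer application 726 using the pyttsx3
# text-to-speech library (pip install pyttsx3). Spatialisation is omitted.
import pyttsx3

engine = pyttsx3.init()

def speak_item(label: str) -> None:
    """Read a targeted or selected menu item out loud."""
    engine.say(label)
    engine.runAndWait()

speak_item("phonebook")
```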
  • Figure 8 discloses a device 800, which is controlled through an air interface by a mobile device, which is capable of receiving control inputs by a touch surface or a touch screen.
  • Such device 800 can be e.g. a mobile station, computer, laptop, DVD recorder, personal computer, stereo set etc.
  • The device 800 comprises a processor 810 for performing instructions and handling data, a memory 820 for storing data such as instructions and application data, a user interface 830 comprising e.g. a touchpad, keyboard, or one or more buttons, at least data receiving means 840 for receiving data, a loudspeaker 850, and optionally a display 860.
  • the device 800 can comprise data transmitting means 842 for sending data to an external loudspeaker (not shown).
  • The memory of the device 800 includes e.g. at least an auditory user interface application 822, an application 824 for manipulating interacted location data or item data, and a synthesizer application 826.
  • The processor 810 manipulates the received interacted location data or item data according to the instructions of the corresponding application 824, and the synthesizer 826 converts the obtained data from text format to speech, which is provided through the loudspeaker 850 e.g. by using three dimensional sound.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Investigating Or Analyzing Materials By The Use Of Fluid Adsorption Or Reactions (AREA)
PCT/FI2010/050252 2009-04-06 2010-03-30 Method for controlling an apparatus WO2010116028A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20095376A (fi) 2009-04-06 2009-04-06 Menetelmä laitteen ohjaamiseksi (Method for controlling an apparatus)
FI20095376 2009-04-06

Publications (2)

Publication Number Publication Date
WO2010116028A2 (en) 2010-10-14
WO2010116028A3 WO2010116028A3 (en) 2010-12-16

Family

ID=40590258

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2010/050252 WO2010116028A2 (en) 2009-04-06 2010-03-30 Method for controlling an apparatus

Country Status (2)

Country Link
FI (1) FI20095376A (fi)
WO (1) WO2010116028A2 (fi)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013068793A1 (en) * 2011-11-11 2013-05-16 Nokia Corporation A method, apparatus, computer program and user interface
WO2013093566A1 (en) * 2011-12-22 2013-06-27 Nokia Corporation An audio-visual interface for apparatus
EP2613522A1 (en) * 2012-01-06 2013-07-10 Samsung Electronics Co., Ltd. Method and apparatus for on-screen channel selection
WO2014100839A1 (en) * 2012-12-19 2014-06-26 Willem Morkel Van Der Westhuizen User control of the trade-off between rate of navigation and ease of acquisition in a graphical user interface
EP2761419A1 (en) * 2011-09-30 2014-08-06 Van Der Westhuizen, Willem Morkel Method for human-computer interaction on a graphical user interface (gui)
CN106814966A (zh) * 2017-01-24 2017-06-09 Tencent Technology (Shenzhen) Co., Ltd. Method and apparatus for controlling an object
CN108139811A (zh) * 2015-10-15 2018-06-08 Samsung Electronics Co., Ltd. Method of recording execution screen and electronic device processing the same

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6983251B1 (en) * 1999-02-15 2006-01-03 Sharp Kabushiki Kaisha Information selection apparatus selecting desired information from plurality of audio information by mainly using audio
FI118100B (fi) * 2005-02-07 2007-06-29 Ilpo Kojo Valitsin (Selector)
US8098856B2 (en) * 2006-06-22 2012-01-17 Sony Ericsson Mobile Communications Ab Wireless communications devices with three dimensional audio systems

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2761419A1 (en) * 2011-09-30 2014-08-06 Van Der Westhuizen, Willem Morkel Method for human-computer interaction on a graphical user interface (gui)
WO2013068793A1 (en) * 2011-11-11 2013-05-16 Nokia Corporation A method, apparatus, computer program and user interface
WO2013093566A1 (en) * 2011-12-22 2013-06-27 Nokia Corporation An audio-visual interface for apparatus
US9632744B2 (en) 2011-12-22 2017-04-25 Nokia Technologies Oy Audio-visual interface for apparatus
EP2613522A1 (en) * 2012-01-06 2013-07-10 Samsung Electronics Co., Ltd. Method and apparatus for on-screen channel selection
WO2014100839A1 (en) * 2012-12-19 2014-06-26 Willem Morkel Van Der Westhuizen User control of the trade-off between rate of navigation and ease of acquisition in a graphical user interface
CN105190508A (zh) * 2012-12-19 2015-12-23 RealityGate (Pty) Ltd User control of the trade-off between rate of navigation and ease of acquisition in a graphical user interface
US10732813B2 (en) 2012-12-19 2020-08-04 Flow Labs, Inc. User control of the trade-off between rate of navigation and ease of acquisition in a graphical user interface
CN108139811A (zh) * 2015-10-15 2018-06-08 Samsung Electronics Co., Ltd. Method of recording execution screen and electronic device processing the same
CN106814966A (zh) * 2017-01-24 2017-06-09 Tencent Technology (Shenzhen) Co., Ltd. Method and apparatus for controlling an object

Also Published As

Publication number Publication date
FI20095376A0 (fi) 2009-04-06
FI20095376A (fi) 2010-10-07
WO2010116028A3 (en) 2010-12-16

Similar Documents

Publication Publication Date Title
KR101545875B1 (ko) 멀티미디어 아이템 조작 장치 및 방법
CN101309311B (zh) 双向滑动式移动通信终端及其提供图形化用户界面的方法
JP5705131B2 (ja) 異種のタッチ領域を利用した電子機器の動作制御方法及び装置
EP1752865B1 (en) Mobile terminal having jog dial and controlling method thereof
KR100993064B1 (ko) 터치 스크린 적용 음원 재생 장치에서의 음원 선택 재생 방법
US20100306703A1 (en) Method, device, module, apparatus, and computer program for an input interface
WO2010116028A2 (en) Method for controlling an apparatus
KR20090043753A (ko) 터치스크린을 구비한 단말장치의 멀티태스킹 제어 방법 및장치
EP2538696A1 (en) Method and apparatus for multimedia content playback
KR20100081577A (ko) 휴대단말에서 오브젝트의 내비게이션 방법 및 장치
JP2007183914A (ja) コンテンツナビゲーション方法及びそのコンテンツナビゲーション装置
CN103391469A (zh) 移动终端及其控制方法
KR20090085470A (ko) 아이템 또는 바탕화면에서 복수의 터치방식을 감지하는터치 ui 제공방법 및 이를 적용한 멀티미디어 기기
CN110908582A (zh) 一种控制方法、触控笔及电子组件
EP2214174A2 (en) Apparatus and method for playing of multimedia item
EP2071443A2 (en) Method for controlling value of parameter
US20080318618A1 (en) Mobile communication device and method of controlling the same
US8185163B2 (en) Mobile communication device and method of controlling the same
JP2009276833A (ja) 表示装置および表示方法
CN106843903B (zh) 一种智能移动终端的用户行为模式应用方法和装置
CN103699303A (zh) 一种信息处理方法及电子设备
WO2010046541A1 (en) Method and device for controlling an application
KR100655928B1 (ko) 네비게이션 gui시스템 및 그 gui 구현방법
JP2010533916A (ja) 表示制御方法、その方法を使用する端末機及びその方法を記録した記録媒体
CN105930069A (zh) 一种输入法的切换方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10715911

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10715911

Country of ref document: EP

Kind code of ref document: A2