WO2011098675A1 - Method for operating a device, computer program product, software package and user interface - Google Patents

Method for operating a device, computer program product, software package and user interface Download PDF

Info

Publication number
WO2011098675A1
Authority
WO
WIPO (PCT)
Prior art keywords
functions
touchscreen
run
function
areas
Prior art date
Application number
PCT/FI2011/050131
Other languages
English (en)
Inventor
Panu Kause
Mikael Laine
Original Assignee
Ixonos Oyj
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ixonos Oyj filed Critical Ixonos Oyj
Publication of WO2011098675A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements

Definitions

  • the present invention relates to a method for operating a device, in which two or more functions are run on the device.
  • the invention also relates to a device, a computer program product, a computer program, and a user interface.
  • Electronic devices which are equipped with a touch-sensitive display, are known from the prior art.
  • at least part of the interaction between the device and the user is handled by means of finger contacts and/or movements, i.e. gestures, directed to the touchscreen.
  • the contacts and movements can be created, for example, using one or more fingers, or also using a contact stylus.
  • Some examples of such devices with a touchscreen are mobile communication devices, PDAs, communicators, navigators, and media players.
  • applications can be run in so-called multi-tasking.
  • in multitasking, several applications can be run simultaneously. At least one of the applications being run is in an active operating state.
  • the other applications also being run are in operation, but in such a way that, even though they are not being shown at the time on the display of the device, it is nevertheless possible to move smoothly between them, by changing from one application to another.
  • Devices with a small, restricted display create a challenge for the multitasking of applications, in terms of the understandability and usability of the user interface. This applies particularly to the navigation between applications being actively run and their display to the user.
  • One way to implement this is to activate a special dashboard functionality from the device's user interface and, as a result, show simultaneously on the device's display all the applications that are currently active. This takes place with views that are substantially reduced in size, relative to the normal views of the applications, in order to show simultaneously a large group of applications being run. If many applications are being run, and if the display is small in size, or if its orientation is disadvantageous for a selected application, it can be difficult to distinguish applications from each other.
  • the activation of a special dashboard functionality can even be hidden behind several finger-touch gestures, which, for its part, will affect the experience of using the functionality in question and the efficiency of its operation.
  • One known way of implementing this is to place the items' control areas in folders, using relatively precise limits, and to display the folders in the basic view of the user interface.
  • the folders can be, for example, Messages, Gallery, and Media.
  • the Media folder can be divided into camera, video, tape-recorder, media player, and radio functions. When thus divided, the folders and the manner of displaying them in the user interface become confusing. In extreme cases, functions that are, as such, useful may remain completely unused, if the user has only a superficial familiarity with the device's instruction manual.
  • an effective user interface makes it simple and easy for a user to navigate between the functions being run in the device's user interface. For example, when moving from one function to another, the user must understand how to navigate between the functions, while the state of the device must be such that it is generally possible to switch from one function to another. The technical problem relating to this is how to permit effective navigation between functions. The ease and speed of navigation are directly linked to this effective navigation.
  • the present invention is intended to create a method for operating a device, which has an improved user experience and through which navigation in the user interface is simpler and faster than that of solutions according to the prior art.
  • the characteristic features of the method according to the invention are presented in Claim 1.
  • the invention also relates to a device, the characteristic features of which are presented in Claim 10.
  • One implementation of the present invention is based on the observation that the traditional approach in navigation between the functions being run is inefficient for some users, in terms of the finger touches required in navigation, and also in moving between the functions being run.
  • the approach according to the invention for navigating between functions is much more effective, in that switching from one function to another is very easy and the functions being run are presented on the device's touchscreen in a representative and easily accessible manner, not least because it reflects, and is consistent with, how people organize the most used items in their physical work environment within hand's reach - for example on their desk: a computer, telephone, pens, and so on.
  • the objectives of the invention are achieved in such a way that the functions being run on the device are arranged in virtual rows to be displayed on the touchscreen while navigation between the functions being run on the device takes place by scrolling the rows on the touchscreen.
  • switching from one function to another can then be performed using as few as two finger control movements, so that changing between the functions being run demands only minimal effort from the user.
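  • as a minimal sketch of this row-and-gesture logic (illustrative only: the Python below and its class and method names are our own assumptions, not taken from the patent), the running functions can be modelled as an ordered, endless row that a flick scrolls and a tap selects from:

    class FunctionRow:
        """Minimal model of the virtual row 26 of running functions."""

        def __init__(self, functions):
            self.functions = list(functions)  # running functions 15.1 - 15.n
            self.index = 0                    # function shown in the task area

        def flick(self, direction):
            """One flick scrolls the row one function left or right.
            The row is treated as endless, so it wraps past either end
            and can be navigated in opposite directions."""
            step = 1 if direction == "left" else -1
            self.index = (self.index + step) % len(self.functions)
            return self.functions[self.index]

        def tap(self):
            """A tap stops navigation and selects the function in view."""
            return self.functions[self.index]

    row = FunctionRow(["home", "music player", "dialer", "email"])
    row.flick("left")   # first control movement: the row scrolls
    print(row.tap())    # second control movement: select "music player"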
  • navigation can be performed directly from a function being run, i.e. from one currently displayed on the screen.
  • the navigation control operations are then directed to the screen area reserved for the function on the touchscreen.
  • Navigation can be initiated, for example, using a single simple finger control movement, which can be directed to any part whatever of the touchscreen. The more freely selectable touch location substantially improves the experience of using the device, as the navigation touch need no longer be directed to a specific precisely delimited area on the device's touchscreen.
  • the same finger contact that initiated navigation can be used to browse through several functions being run. Application browsing will not then need its own special finger contact in addition to the navigation-initiation contact.
  • when navigating between functions being run, the functions can be scrolled as a continuous flow in view on the touchscreen. Each function being run will then be clearly visible, essentially in its entirety, one at a time on the screen, so that a second finger touch can be used to select it as the application to be run.
  • a finger touch is used to stop navigation and to centre the selected function on the screen and to permit working in the function in question, or to concentrate operations on the function in question. According to one embodiment, it is also possible to navigate very simply within the function itself.
  • control areas shown on the touchscreen and relating to the items arranged in the device can be organized in two mutually dependent areas on the touchscreen.
  • the areas can be located, for example, at the upper and lower edges of the screen, from where they can be expanded over, and shrunk back from, a task area arranged on the touchscreen, on top of a function being performed there.
  • the facts that the areas can be shrunk and expanded and that they are shown dependently on each other ensure that the areas being shown do not, for example, overlap each other.
  • the control areas of the items can be divided between the areas, so that the significance and nature of the items corresponding to the control areas can be easily perceived in terms of the device's operation. This will further improve the experience of using the device.
  • the invention permits rapid navigation between functions being run, using traditional touchscreen navigation gestures.
  • aspects of the invention include a computer-program product, a computer program, and a user interface, which, when operating in a device, permit the device to operate according to the operating principles of the invention described above.
  • the program can be an operating system, which can be seen by the user in the form of a user interface.
  • Figure 1 shows a schematic example of the device, as an extremely simplified diagram
  • Figure 2 shows a schematic diagram of one embodiment, when navigating between functions in multitasking
  • Figures 3a-3f show more detailed situation images from the device's touchscreen, when navigating from one function to another,
  • Figures 4a-4c show situation images from the device's touchscreen, when navigating within a function
  • Figures 5a-5c show one way of arranging the function areas in the device's user interface, and their operating logic
  • Figures 6a-6c show the activation in stages of an area containing device functions and content
  • Figures 7a-7c show the activation in stages of an area containing communications functions and content
  • Figures 8a-8c show the activation in stages of content and the queue of functions to be run in corresponding stages
  • Figures 9a-9c show a first example of the termination of a function, in stages
  • Figures 10a and 10b show a second example of the termination of a function, in stages
  • Figures 11a and 11b show a third example of the termination of a function, in stages.
  • the term 'device' refers to any electronic device whatever, which can process and display information in one way or another for the user of the device.
  • Particular applications of the invention are, for example, in mobile devices.
  • Some examples of these are radiotelephones, mobile stations, smart phones, tablet devices, communicators, PDAs, personal calendars, navigators, and wireless information devices, such as media players.
  • Figure 1 shows a schematic and thus extremely simplified diagram of one device 10, in which the invention can be applied.
  • the device 10 includes processor means for performing two or more functions 15.1 - 15.n on the device 10, such as at least one processor unit 11.
  • the device 10 also includes display means 14.
  • the display means can include at least one display element, which in this case is a touch-sensitive display 14.
  • the device 10 includes detector means 19 arranged, for example, to detect control operations directed to the touchscreen 14, such as the location of the touched area on the touchscreen and the type of touch.
  • the device 10 also includes control means 22 for the touchscreen 14, arranged, for example, to change the view on the touchscreen 14 according to the control operations detected by the detector means 19.
  • the device's 10 user interface, together with its operating references (for example, icons or similar items in the areas 20 and 21) and functions 15.1 - 15.n that are being run on the device 10 at the time, are displayed on the touchscreen 14.
  • the operating references are control areas arranged on the screen 14, which when touched in a set manner create the function associated with each reference. In other words, touching the control area creates an electric signal in the screen 14, which controls the processor means 11 of the device and thus alters the operating state of the device 10.
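  • as a toy illustration of this control-area mechanism (the coordinates and function names below are invented for the example, not taken from the patent), a touch can be hit-tested against the rectangles of the control areas and dispatched to the associated function:

    # each control area pairs a screen rectangle with the function it triggers
    control_areas = [
        {"rect": (0, 0, 320, 40),    "action": lambda: print("open applications")},
        {"rect": (0, 440, 320, 480), "action": lambda: print("open contacts")},
    ]

    def on_touch(x, y):
        """Dispatch a touch to whichever control area contains it."""
        for area in control_areas:
            left, top, right, bottom = area["rect"]
            if left <= x <= right and top <= y <= bottom:
                area["action"]()  # stands in for the signal that alters the device state
                return True
        return False              # the touch fell on the task area instead

    on_touch(10, 20)              # -> "open applications"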
  • the screen 14 can be oriented, for example, vertically in the operating position of the device 10, so that its height will be greater than its width.
  • the device 10 further also includes memory 12, which can be divided into several purposes.
  • the memory 12 can be divided into an application memory 12.1, in which the program code creating the functions arranged in the device 10 can be stored, and a content memory 12.2, in which the content of the device 10 is stored.
  • the functions can be, for example, applications in the device 10 and in itself also the user-interface software of the device 10, through which the functions can be accessed and used.
  • Some examples of applications are communications applications, for example, a number selector, a text and/or multimedia-based message application (SMS, IM, email, chat) or PTT (Push-to-Talk).
  • the applications can also be, for example, a calendar application, an address-book application, a media player, a web browser, input applications, various office applications, games, and miniature programs (for example, widgets).
  • content can also be understood very widely.
  • It can be, for example, content concerning communications (contacts, SMS or instant messages, emails, stored on the device 10) and also various media content (for example, pieces of music, digital images, presentation graphics, and videos), or documents (office-application documents or web content).
  • the content can be, for example, various messages received from social-media arenas, or related input. Examples include various community services, blogs, discussion forums, and websites intended for distributing videos and images.
  • the memory 12 can include read-write memory 13 reserved for the processor means 11, into which functions 15.1 - 15.n, together with content, can be loaded, for example, for multitasking.
  • ways of arranging the multitasking of applications 15.1 - 15.n other than through a separate read-write memory 13 will be obvious to one skilled in the art.
  • sub-totalities in the device 10 can be, for example, various input/output means, such as keys, a camera module, a microphone, a GPS module, a loudspeaker and/or one or several data-transfer modules, through which the device 10 can be used to communicate, for example, wirelessly in one or several selected data-communications networks. These modules are marked with the reference number 27 in Figure 1.
  • the device 10 is able to display, for example, the functions 15.1 - 15.n loaded into the read-write memory 13 and run and performed using the processor 11.
  • at least one function being performed is shown in its entirety at a time on the touchscreen 14.
  • At least one of the functions 15.1 - 15.n is arranged to be run in the device 10, but two or more applications can also be run mainly simultaneously. It is then possible to speak of the multitasking of the functions 15.1 - 15.n.
  • the functions 15.1 - 15.n can be run in such a way that it is possible to switch between the functions 15.1 - 15.n from one function to another, without the running of the functions 15.1 - 15.n being actually terminated, in other words without interruption.
  • a function need not be started again from a so-called zero state in the traditional sense; instead, switching from one function to another is possible using simple commands.
  • Some of the functions 15.1 - 15.3 can be running as a default value the whole time, in such a way that they cannot be terminated by an action of the user. Such are, for example, the home view (connected timeline) 15.1, the text-input user interface 15.2, and the number-selection user interface 15.3.
  • the rest of the functions and tasks 15.4 - 15.n can be initiated for running in a manner described hereinafter.
  • of the functions 15.1 - 15.n being run, at least one function 15.1 is performed, i.e. is open, at a time.
  • performance of the function 15.1 refers to its active use, in which case operations, for example, are directed to the function 15.1 and in which case the function 15.1 can be visible on the screen 14 of the device 10.
  • active performance of a function can also mean that the function being performed is not necessarily visible on the device's 10 screen 14, but is, for example, available to the user in one way or another.
  • An example of such a situation is a music-player function, which can play a piece in the background, and an email application, which can be on the display 14 at the same time as the piece is played and which is used to read or write messages.
  • Figures 2 - 11 show situation images to describe various embodiments of the invention from the device's 10 touchscreen 14. It can be generally said of the situation images that numbered finger-control examples are also shown in them, reference to which will be made later in the description. In addition, numbering fitted into a drop shape is arranged outside each situation image, and shows the finger-control gesture leading to the situation image in question.
  • Figure 2 shows a schematic diagram of navigation between functions 15.1 - 15.n being multitasked by the processor 11 and displayed on the screen 14, as well as the arrangement on the device's 10 screen 14 of the function 15.1 being currently performed.
  • most of the device's 10 touchscreen 14 can be reserved for one function 15.1 being performed at a time.
  • the function 15.1 can be shown, for example, in the central area of the device's 10 screen 14, which can also be referred to as the task area. In that case, mainly the entire display surface can be effectively utilized for the use of the function 15.1 being performed and it will always be clear to the user to which function their operations on the screen 14 are directed.
  • reserving the screen 14 mainly for the function 15.1 being used permits a logical way of navigating between the functions 15.1 - 15.n being run, as will become clearer in the following.
  • the functions 15.1 - 15.n run on the device 10 are arranged, for example by the processor means 11, into a virtual row 26 to be shown on the touchscreen 14.
  • the view on the screen 14 is controlled using the control means 22, in such a way that, when navigating between the functions 15.1 - 15.n being run by the processor means 11, the functions are shown as a moving row formation on the screen 14.
  • navigation between the functions 15.1 - 15.n being run is possible by scrolling this virtual row 26 on the device's 10 screen 14.
  • the view on the device's 10 screen 14 can then be imagined as changing, with the device frame 10 according to Figure 2 moving, for example, as a continuous movement towards the right while the unified row 26 formed by the functions 15.1 - 15.n remains stationary.
  • Scrolling the view of the screen 14 is easy and quick, as will be described later in greater detail.
  • the virtual row arranged on the touchscreen 14 can be scrolled for navigation purposes directly from the currently open function 15.1 displayed on the screen 14, in other words from the screen area reserved for the function 15.1. There is therefore no special need to move away from the function 15.1 being performed in order to navigate.
  • the gestures 2a depict possible navigation touches (flicks) .
  • the virtual row 26 can be scrolled, for example, discretely horizontally, from function to function, or also as a continuous unbroken flow-past of the functions 15.1 - 15.n.
  • the functions 15.1 - 15.n that are active and being run at the time on the device 10 can be browsed one-dimensionally in a simple and rapid manner.
  • Scrolling requires only a small degree of precision, because the functions 15.1 - 15.n appear on the display 14 in a precisely-linked sequence and in a one-dimensional formation, which the user can easily understand and from which the next function desired can be easily opened.
  • the functions 15.1 - 15.n being run at any time can be arranged, on the basis of a set criterion, in a virtual endless row 26 of functions 15.1 - 15.n, in which navigation between the functions 15.1 - 15.n is performed.
  • a set criterion is the frequency of use of the functions 15.1 - 15.n, i.e. how often they are used.
  • the functions 15.1 - 15.n in the virtual row 26 are right next to each other and thus when scrolling they appear on the display 14 mainly one after the other, without any substantial gaps between the function frames, but nevertheless in such a way that the functions 15.1 - 15.n can be distinguished from each other.
  • when a new function 15.n is initiated, it can be placed, for example, at the end of the function queue 26, or in such a way as to take into account the previous use-frequency history of the function 15.n.
  • the functions 15.1 - 15.n in the virtual row 26 can be presented as a kind of card animation. If the virtual row 26 is endless, it can be effectively navigated in opposite directions, as the gestures 2a of Figure 2 show. Two finger touches (the finger gesture 2b in function 15.5 of Figure 2) can also be used to return to the home view 15.1, from any function whatever. It can be said that the finger gestures 2a and 2b are universal gestures, which create their defined function independently of the location where they are performed.
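  • one possible reading of these ordering rules, sketched purely for illustration (the frequency heuristic and all names below are our own assumptions, not a claimed implementation):

    use_counts = {"home": 42, "music player": 17, "dialer": 9}
    row = sorted(use_counts, key=use_counts.get, reverse=True)  # the virtual row 26

    def start_function(name):
        """Place a newly started function in the row 26: with no use
        history it goes to the end of the function queue, with history
        it is placed according to its earlier frequency of use."""
        if name in row:
            return
        count = use_counts.get(name, 0)
        if count == 0:
            row.append(name)      # end of the queue
        else:
            pos = next((i for i, f in enumerate(row)
                        if use_counts.get(f, 0) < count), len(row))
            row.insert(pos, name)

    start_function("camera")      # no history -> appended last
    use_counts["email"] = 20
    start_function("email")       # history -> placed after "home"
    print(row)  # ['home', 'email', 'music player', 'dialer', 'camera']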
  • Figures 3a - 3f show example situation images in greater detail of navigation from one function to another on the device's 10 touchscreen 14.
  • the function 15.3 being currently run can be transferred to background running using a circular finger gesture 2.
  • navigation between the functions 15.1 - 15.n is also possible without finger gesture 2.
  • the result of the gesture 2 is an operating state, in which navigation between the functions is possible ( Figure 3b) .
  • when navigating between the functions 15.1 - 15.n being run, the virtual row 26 can be scrolled kinetically horizontally as a continuous flow of the functions 15.1 - 15.n being run, forwards from function to function (Figures 3c and 3d). Scrolling is now initiated by an oriented finger tap 3 (Figure 3c).
  • the finger tap 3 is a control operation on the touchscreen 14 for changing the function 15.1 being shown, more generally for navigating between the functions 15.1 - 15.n being run.
  • the finger tap 3 is detected by the detection means 19, which are arranged to detect control operations directed to the touchscreen 14, which in this case is the scrolling of the function row 26 on the screen 14.
  • the finger tap 3 indicates the direction of the scrolling, in which it is desired for the function row 26 to progress on the screen 14 (towards the right) .
  • the finger tap 3 could also indicate the scrolling direction in such a way that it would be used to show the direction in which it is wished to move along the function row 26 (for example, in the direction of the tap 3, i.e. towards the right, in which case the function row 26 would appear to progress towards the left on the screen 14) .
  • in the finger tap 3, it is as if the surface of the screen 14 were brushed with a short movement of the tip of the finger (flick).
  • a finger tap 3 indicating the direction of scrolling is also extremely effective.
  • a single finger tap 3 is used not only to initiate navigation, but also to perform navigation between the functions being run (running the applications past on the screen 14) and thus to make the usability of the device 10 more efficient.
  • using a single tap gesture 3, at least two functions are obtained, so that navigation is simple and rapid.
  • the functions 15.1 - 15.n being currently multitasked by the processor 11 move in sequence on the screen 14 as a continuous stream of functions. Each function being run appears in turn in the queue 26 as clearly the main totality appearing on the screen 14. It can then be selected by a simple finger touch 4 (Figure 3e), which once again is detected by the detection means 19.
  • the control means 22 change the function 15.1 shown on the touchscreen 14, more generally the view on the touchscreen 14, on the basis of the navigation operations.
  • the control means 22 are thus arranged to scroll the row 26 on the touchscreen 14 on the basis of the detection. Because the functions 15.1 - 15.n are shown in a relatively large size during scrolling on the screen 14, in their essential performance size, they can be easily selected despite the continuous change of the view.
  • the function 15.2 once again reserves most of the task area of the display 14 ( Figure 3f ) .
  • navigation can be stopped by a finger touch 4.
  • the finger touch 4 is also detected by the detection means 19. It centres the selected function 15.2 on the screen 14 and permits working in the function 15.2 in question, or the directing of operations to the function 15.2 in question.
  • navigation between applications requires, at a minimum, only two finger touches (gestures 3 and 4) , which is a substantial improvement in multitasking navigation.
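  • the minimal two-gesture flow (flick 3 sets the row moving, touch 4 stops it and centres the selection) can be pictured as a decaying-velocity animation; the sketch below is illustrative only, with arbitrary constants of our own choosing:

    FUNCTION_WIDTH = 320.0   # assumed width of one function frame, in pixels
    FRICTION = 0.95          # assumed per-frame velocity decay

    functions = ["home", "music player", "dialer", "email"]
    offset, velocity = 0.0, 0.0

    def flick(speed):
        """Gesture 3: an oriented tap gives the row an initial velocity."""
        global velocity
        velocity = speed

    def step():
        """One animation frame of the kinetic, continuous flow of functions."""
        global offset, velocity
        offset += velocity
        velocity *= FRICTION

    def tap():
        """Gesture 4: stop the motion and centre the nearest function."""
        global offset, velocity
        velocity = 0.0
        index = round(offset / FUNCTION_WIDTH) % len(functions)
        offset = index * FUNCTION_WIDTH
        return functions[index]

    flick(40.0)              # start scrolling the row 26
    for _ in range(30):      # functions stream past on the screen
        step()
    print(tap())             # -> "dialer" with these example constants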
  • Figures 4a - 4c show situation images from the device's 10 touchscreen 14, illustrating navigation within a function 15.2.
  • a finger gesture 2 is used to move from the home view 15.1, being currently shown on the screen 14 in multitasking, to the next function, which is now a music player 15.2 ( Figure 4b) .
  • the function 15.2 being performed at the time and shown on the screen 14 can also be navigated in very many different ways. According to one embodiment, it is possible to navigate in the function 15.2 by scrolling vertically in the view of the touchscreen 14, when the view on the screen 14 will change correspondingly. This is indicated in Figures 4a and 4c by a downwards-facing finger gesture equipped with an arrow (for example, drag or flick), which is without a gesture number.
  • the function 15.2 can rotate virtually on the touchscreen 14, when a new view is made visible from it.
  • One way to rotate the function 15.2 is to use animation to invert it.
  • Both sides of the card 15.2 can be scrolled vertically.
  • using the finger gesture 1, it is possible to return from both sides 2a, 2b of the function card 15.2 to the home view 15.1 of the user interface (Figure 4a).
  • the function to be performed is a music player 15.2, which has two sides ( Figures 4b and 4c) .
  • on the other side of the card (Figure 4c) is a music library, which can be organized in functions that are, as such, known.
  • Figures 5a - 5c show one way to arrange the areas 20, 21 reserved for the functions and content, more generally the items of the device 10, on the user interface, more generally the screen 14, and next to each figure is its operating logic, from which the functions and content can be easily and logically found and initiated.
  • the references, i.e. in a technical sense the control areas, arranged in the user interface, relating to the functions of the device 10 and the content arranged in the device 10, can be arranged effectively and clearly, in terms of the usability of the device 10 and also of navigation in the functions 15.1 - 15.n being run.
  • the control areas concerning the functions and/or content arranged in the device 10 (figure series 6 and 7) can be arranged in two areas 20, 21 to be shown on the touchscreen 14.
  • the areas 20, 21 can be shown on the touchscreen 14 independently of each other, for example, in such a way that they never overlap each other.
  • the areas 20, 21 can be arranged in many different ways on the touchscreen 14.
  • the task area 23 'HomeScreen' of the user interface, which is reserved, for example, for the function being currently performed, is arranged in the centre of the touchscreen 14. More generally, reference can be made to the performance area, in which things happen and in which the currently running tasks, activities, open applications, and arriving notifications are found.
  • a shrinkable and expandable first area 20 and second area 21 are arranged on top of the task area 23.
  • the default location of the areas 20, 21 can be at the opposite extreme edges of the user interface, more generally the touchscreen 14, such as the upper and lower parts of the screen 14.
  • the areas 20, 21 can be easily accessed and controlled, because they are located in a fixed manner relative to the task area 23.
  • the areas 20, 21 can be always visible due to their location and size.
  • in their default locations, the areas 20, 21 can be shown in the vertical direction of the screen 14 as narrow horizontal bars, without actual control areas, as happens in, for example, Figures 2, 4, 5, and 8.
  • in their default locations, the areas 20, 21 can also be shown as horizontal bars at the height of one control-area row, as happens in, for example, Figures 3, 6, 7, and 9 - 11.
  • there can be control areas, for example, for user-defined or most-used items in the areas 20, 21 in question.
  • the areas 20, 21 in the upper and lower parts of the screen 14 can be expanded and shrunk, for example, steplessly on top of the task area 23 of the user interface.
  • a roller-blind analogy can be applied in the expansion and shrinking.
  • the size of the visible parts of the areas 20, 21 on the display 14 can be altered steplessly with gesture references 1 - 3, for example by touching and dragging the edge of the areas 20, 21 vertically backwards and forwards (drag finger gesture), or by an oriented finger tap (flick finger gesture).
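  • the roller-blind analogy can be read as the stepless resize sketched below (the screen and bar dimensions are assumptions for illustration, not from the patent); making the two areas yield to each other is one way to guarantee that they never overlap:

    SCREEN_HEIGHT = 480.0    # assumed screen height in pixels
    BAR_MIN = 24.0           # default narrow bar at a screen edge
    BAR_MAX = 456.0          # fully expanded over the task area 23

    class EdgeArea:
        """One shrinkable and expandable area (20 or 21) at a screen edge."""

        def __init__(self):
            self.height = BAR_MIN

        def drag(self, dy, other):
            """Drag gesture: resize steplessly; the other area yields
            so the two visible areas never overlap on the screen."""
            self.height = max(BAR_MIN, min(BAR_MAX, self.height + dy))
            other.height = min(other.height, SCREEN_HEIGHT - self.height)

        def flick(self, expand, other):
            """Flick gesture: snap fully open or back to the default bar."""
            self.drag(BAR_MAX if expand else -BAR_MAX, other)

    upper, lower = EdgeArea(), EdgeArea()   # e.g. areas 21 and 20
    upper.drag(120.0, lower)                # pull one blind partway down
    lower.flick(True, upper)                # snap the other fully open
    assert upper.height + lower.height <= SCREEN_HEIGHT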
  • Figures 6a - 6c show the arrangement of the control areas concerning the items of the device properties, for example, functions and content 16.1 - 16.n, in the area 20 and their activation in stages.
  • Figures 7a - 7c show the arrangement of the control areas concerning the items 17.1 - 17.n of the communications properties, for example, contacts and other communications content, in the area 21 and their activation in stages.
  • the control areas of the functions and content references of the device 10 can be arranged in such a way that, in the first area 20 arranged on the touchscreen 14, are control areas 16.1 - 16.n, 18.1 - 18.n concerning the device-specific items, i.e. references to the functions and content arranged in the device 10, which can be selected and activated by the user; these are, for example, the applications 16.1 - 16.n and the content 18.1 - 18.n in the memory of the device, such as images, music, and media.
  • in the second area 21, correspondingly, are control areas 17.1 - 17.n concerning the communications-specific items, for example contacts, and in general content relating to communications stored in the device 10.
  • the contacts 17.1 - 17.n can be, for example, user-defined contact information, which have control areas shown as icons.
  • the control areas 16.1 - 16.n, 17.1 - 17.n, 18.1 - 18.n in the first and second areas 20, 21 can be arranged, for example, according to their frequency of use. In that case, the control areas of the most-frequently-used items can come first in the areas 20, 21.
  • examples are the control areas of the contacts 17.1, 17.2 (Figure 7c). Another way to arrange the control areas of the items in the areas 20, 21 is, for example, in alphabetical order.
  • the arrangement can be activated from a selection button 28 shown on the screen 14 (figure series 6 and 7).
  • the location of the control areas 16.1 - 16.n, 17.1 - 17.n in the areas 20, 21 can change adaptively relative to each other, depending on the relative frequency of use of the items defined by the control areas.
  • the device 10 can adapt to, or learn from, the user's operating patterns in such a way that the control areas of frequently used items will be easier to locate, for example, at the beginning of the area 20, 21, or in the default location bars of the areas.
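  • this adaptive behaviour amounts to re-sorting the control areas by a running use count; a sketch of one such heuristic (the item names are invented for the example):

    from collections import Counter

    class AdaptiveArea:
        """Control areas whose order follows the user's operating patterns."""

        def __init__(self, items):
            self.items = list(items)
            self.uses = Counter()

        def select(self, item):
            """Record a use; frequently used items migrate towards the
            start of the area, i.e. into the always-visible default bar."""
            self.uses[item] += 1
            self.items.sort(key=lambda i: -self.uses[i])  # stable sort keeps ties in place
            return item

    area20 = AdaptiveArea(["browser", "camera", "player", "calendar"])
    for _ in range(3):
        area20.select("player")
    area20.select("camera")
    print(area20.items)   # -> ['player', 'camera', 'browser', 'calendar']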
  • the control areas 16.1, 16.5, 17.1, 17.2 of the most used items of the areas 20, 21 can also be distinguished visually from the other control areas on the screen 14; the distinction can appear, for example, as a default value, or activated on top of the control area 16.1, 16.5, 17.1, 17.2 of one or more of the most-usually-selected items.
  • it is precisely the most-usually-selected name or icon that is in the default location (e.g., in one embodiment, the initial presentation of the names or icons, once the area containing them has first been opened).
  • the control areas 16.1, 16.5, 17.1, 17.2 can also be given a larger size relative to their default presentation size, as has been done here.
  • the information 24 and/or functions 25 relating to an item defined by the control area 16.1, 16.5, 17.1, 17.2 can be arranged for visual distinction. In the web browser 16.1 of Figure 6c, these are link references 25 and weather information 24.
  • Figures 8a - 8c show examples of the activation of content 18.2 in stages and the queues 26 of the functions being run, in corresponding stages.
  • the home view 15.1 is on the device's 10 screen 14 and a finger gesture 2 is used to expand the area 20 on top of the home view.
  • the area 20 has been expanded.
  • by means of a finger gesture 3 (tap), the player application 16.4, and even more particularly the related content 18.2, is opened.
  • the player application 16.4 moves to the end of the list 26 of functions being run.
  • the player application 16.4 is open and is used to play a piece of music 18.2 activated as content.
  • in the area 21, it is possible to use a finger touch to activate a selected contact card 17.2 (Figure 7c), from which the information relating to the person can be seen, together with the currently available ways 25 of messaging with the person.
  • displayed on the card 17.2 can be a communications log, the most recent communications with the contact in question together with their contents, and status updates from the contact in question, as well as content distributed by them and shared with them. If it is wished, for example, to phone a contact, the control area with the handset reference is touched, thus opening the call application in the task area 23. At the same time, the area 21 can shrink to its default location in the upper part of the screen 14.
  • Figures 9a - 9c show a first example of the termination of a function 15.2 in stages, in which the function 15.2 is removed from the multitasking queue. It is possible to terminate the function 15.2 once one is in the navigation view ( Figure 9a), or also without the actual navigation view (for example, using a drag gesture or two finger flick gestures 3 according to Figure 2). According to Figure 9a, the function 15.2 can be terminated, for example, by an oriented tap gesture 2 (flick) . In that case, a finger flick is made at the location of the function 15.2 in the direction of the default location of the function 15.2 in the device's 10 user interface.
  • Figures 10a and 10b show a second example of the termination of a function 15.2 in stages.
  • the function is a media player 15.2 and the content is a piece of music or an album being played on it.
  • a downwards directed termination gesture 2 (Figure 10a) now causes a menu to appear on the screen 14 (Figure 10b). Commands suitable to the content in question and the flicking direction can then be performed. Because the content flicking direction was towards the area 20, the alternatives therefore relate to the device-function context, which in this case means, for example, file-management-type alternatives.
  • Figures 11a and lib show a third example of the termination of a function 15.2 in stages.
  • the function is a media player 15.2 and the content is a piece of music or an album being played on it.
  • the termination gesture 2 (Figures 11a and 11b) is now directed towards the area 21, in which case the alternatives offered relate to the communications context.
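  • reading figure series 9 - 11 together, the commands offered on termination follow the flicking direction towards one of the two areas; the mapping below is a sketch with invented menu entries, and the edge-to-area assignment (area 21 at the top, area 20 at the bottom) follows the figures as we read them:

    # flicking content towards an edge area opens commands from that context
    CONTEXT_MENUS = {
        "down": ["save to files", "move", "delete"],   # towards area 20: device functions
        "up":   ["send to contact", "share", "post"],  # towards area 21: communications
    }

    running = ["home", "music player", "dialer"]  # the multitasking queue 26

    def terminate(function, direction):
        """Remove the function from the queue and offer commands suited
        to its content and to the flicking direction."""
        running.remove(function)
        return CONTEXT_MENUS.get(direction, [])

    print(terminate("music player", "down"))  # device-function alternatives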
  • the invention relates not only to a method, but also to a device. Every operation relating to the method can equally concern the device 10 and can be created by using the processor 11 of the device 10 to perform one or more program codes corresponding to the operation in question.
  • the invention also relates to a computer program and computer-program product.
  • the product comprises computer-program code means stored on a storage medium readable by the computer 10, which code means are arranged to perform all the stages of the method when running the program in the computer 10.
  • the computer program comprises computer-program code means, which are arranged to perform all the stages of the method when running the program in the computer 10.
  • the invention also relates to a user interface, in which two or more functions 15.1 - 15.n are arranged to be run, at least one function 15.1 of the functions 15.1 - 15.n being run is arranged to be performed at a time, and navigation between the functions 15.1 - 15.n being run is arranged to take place when it is wished to change the function being performed.
  • the functions 15.1 - 15.n being run are arranged to be organized in a virtual row 26 to be shown in the user interface, and navigation between the functions 15.1 - 15.n is arranged to take place by scrolling the view of the user interface.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This invention concerns a method for operating a device, in which two or more functions (15.1 - 15.n) are run on the device. At least one (15.1) of the functions being run is performed at a time, navigation between the functions taking place by means of the device's touchscreen (14). In this method, the functions being run are arranged in a virtual row (26) displayed on the screen, navigation between the functions taking place by scrolling said row on the screen. The invention also concerns a device, a computer program product, a computer program, and a user interface.
PCT/FI2011/050131 2010-02-12 2011-02-11 Method for operating a device, computer program product, software package and user interface WO2011098675A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20105143A (fi) 2010-02-12 2010-02-12 Method for operating a device, device, computer program storage medium, computer program and user interface
FI20105143 2010-02-12

Publications (1)

Publication Number Publication Date
WO2011098675A1 true WO2011098675A1 (fr) 2011-08-18

Family

ID=41727691

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2011/050131 WO2011098675A1 (fr) Method for operating a device, computer program product, software package and user interface

Country Status (2)

Country Link
FI (1) FI20105143A (fr)
WO (1) WO2011098675A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11740727B1 (en) 2011-08-05 2023-08-29 P4Tents1 Llc Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080168368A1 (en) * 2007-01-07 2008-07-10 Louch John O Dashboards, Widgets and Devices
WO2009097555A2 (fr) * 2008-01-30 2009-08-06 Google Inc. Mobile device event notification
KR20100010072A (ko) * 2008-07-22 2010-02-01 LG Electronics Inc. User interface control method for multitasking of a mobile terminal


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11740727B1 (en) 2011-08-05 2023-08-29 P4Tents1 Llc Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback

Also Published As

Publication number Publication date
FI20105143A (fi) 2011-08-13
FI20105143A0 (fi) 2010-02-12

Similar Documents

Publication Publication Date Title
US20230359349A1 (en) Portable multifunction device with interface reconfiguration mode
US11947782B2 (en) Device, method, and graphical user interface for manipulating workspace views
US8214768B2 (en) Method, system, and graphical user interface for viewing multiple application windows
EP2570909B1 Electronic device, synchronization method and computer program product
US9665178B2 (en) Selective inbox access in homescreen mode on a mobile electronic device
KR101343479B1 Electronic device and control method thereof
US20100088632A1 (en) Method and handheld electronic device having dual mode touchscreen-based navigation
US20140235222A1 (en) Systems and method for implementing multiple personas on mobile technology platforms
US20120180001A1 (en) Electronic device and method of controlling same
EP4235385A2 Electronic device and method of displaying information in response to a gesture
KR20110113777A Information display
EP2629180A2 Mobile terminal having a multi-faceted graphical object and method for performing a display switching operation
WO2012097385A2 Electronic device and method of displaying information in response to a gesture
KR20130093043A User interface method and mobile device for touch and swipe navigation
CA2806348A1 Touch device and method for controlling folders by touch
WO2007076226A1 List scrolling using floating adjacent index symbols
JP5254399B2 Display device, user interface method, and program
EP2849045A2 Method and apparatus for controlling an application using keys or a combination thereof
KR20120132663A Method and apparatus for providing a carousel-type user interface
JP5943856B2 Mobile terminal having multi-faceted graphical objects and display switching method
WO2011098675A1 Method for operating a device, computer program product, software package and user interface
CN112148172A Operation control method and device
CN109656460A Electronic device and method for providing selectable keys of a keyboard
CN113196220A Method and apparatus for organizing and invoking commands for a computing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11741948

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11741948

Country of ref document: EP

Kind code of ref document: A1