WO2014207288A1 - User interfaces and associated methods for controlling user interface elements - Google Patents

User interfaces and associated methods for controlling user interface elements

Info

Publication number
WO2014207288A1
Authority
WO
WIPO (PCT)
Prior art keywords
user interface
dragging
hover
user
stylus
Application number
PCT/FI2013/050689
Other languages
French (fr)
Inventor
Jari Olavi SAUKKO
Leo Mikko Johannes KÄRKKÄINEN
Original Assignee
Nokia Corporation
Application filed by Nokia Corporation filed Critical Nokia Corporation
Priority to PCT/FI2013/050689 priority Critical patent/WO2014207288A1/en
Publication of WO2014207288A1 publication Critical patent/WO2014207288A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop

Definitions

  • the present disclosure relates to user interfaces and user interface elements, associated methods, computer programs and apparatus.
  • Certain disclosed aspects/embodiments relate to portable electronic devices, in particular, so-called hand-portable electronic devices which may be hand-held in use (although they may be placed in a cradle in use).
  • Such hand-portable electronic devices include so-called Personal Digital Assistants (PDAs), mobile telephones, smartphones and other smart devices, and tablet PCs.
  • the portable electronic devices/apparatus may provide one or more audio/text/video communication functions (e.g. tele-communication, video-communication, and/or text transmission (Short Message Service (SMS)/ Multimedia Message Service (MMS)/emailing) functions), interactive/non-interactive viewing functions (e.g. web-browsing, navigation, TV/program viewing functions), music recording/playing functions (e.g. MP3 or other format and/or (FM/AM) radio broadcast recording/playing), downloading/sending of data functions, image capture function (e.g. using a (e.g. in-built) digital camera), and gaming functions.
  • a user interface may enable a user to interact with an electronic device, for example, to open applications using application icons, enter commands, to select menu items from a menu, or to enter characters using a virtual keypad.
  • an apparatus comprising:
  • a memory including computer program code, the memory and the computer program code configured, with the processor, to cause the apparatus to perform at least the following:
  • enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
  • the dragging hover gesture provides for corresponding dragging of the one or more selected user interface elements.
  • the apparatus may be configured to enable selection of one or more user interface element in response to detecting a lift gesture interaction with at least one of the one or more selected user interface elements.
  • the lift gesture interaction may comprise a contact interaction with the at least one user interface element using the stylus followed by lifting the stylus away from the at least one user interface element.
  • the lift gesture interaction may comprise moving the stylus away from the at least one user interface element (e.g. with or without an initial contact interaction with the screen). That is, if the plane of the screen lies in an x-y plane in an orthogonal coordinate system, the lift gesture interaction may be aligned with the z-axis of the coordinate system. It will be appreciated that for embodiments with non-planar screens the lift gesture may be away from the screen.
  • the predetermined hover range may comprise a range of distances above the screen (e.g. between 0.1 and 3cm in the z-direction above the screen). It will be appreciated that a hover gesture interaction may be considered to exclude contact interactions (i.e. interactions where the stylus is touching the graphical user interface).
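  • As an illustrative sketch only (hypothetical function names and thresholds; the 0.1-3cm band is simply the example range quoted above), the following Python fragment shows one way such a hover range and lift gesture might be checked from stylus z-distance readings:
```python
# Hypothetical sketch: classifying stylus z-distance readings against the example
# hover band (0.1-3 cm) and detecting a simple lift gesture.

CONTACT_MAX_CM = 0.1   # at or below this distance the stylus is treated as touching
HOVER_MAX_CM = 3.0     # upper bound of the predetermined hover range (example value)

def classify(z_cm: float) -> str:
    """Return 'contact', 'hover' or 'out_of_range' for a stylus z-distance."""
    if z_cm <= CONTACT_MAX_CM:
        return "contact"       # contact interactions are excluded from hover gestures
    if z_cm <= HOVER_MAX_CM:
        return "hover"         # within the predetermined hover range
    return "out_of_range"

def is_lift_gesture(z_samples: list[float]) -> bool:
    """A lift gesture here: an initial contact followed by the stylus moving away
    along the z-axis, ending inside the hover range."""
    if len(z_samples) < 2:
        return False
    started_in_contact = classify(z_samples[0]) == "contact"
    moving_away = all(b >= a for a, b in zip(z_samples, z_samples[1:]))
    ends_in_hover = classify(z_samples[-1]) == "hover"
    return started_in_contact and moving_away and ends_in_hover

# Example: touch the element, then lift the finger about 1.5 cm above the screen.
assert is_lift_gesture([0.0, 0.4, 0.9, 1.5])
```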
  • the selection dragging hover gesture interaction may be enabled by configuring the graphical user interface to be in a dragging mode.
  • the apparatus may be configured to enable the dragging mode of the graphical user interface in response to the user interacting with a particular dragging mode user interface control element (e.g. a virtual 'glue pot') displayed on the graphical user interface.
  • the selection dragging hover gesture interaction may be enabled when a particular application is active.
  • the selection dragging hover gesture interaction may be enabled by the operating system (e.g. it may be available to multiple different applications and/or for user interface elements displayed on a home screen or in a locked mode).
  • the apparatus may be configured to enable dragging of a selected class of user interface elements, the selected class being an associated subset of the graphical user interface elements available for selection (e.g. allowing mobile telephone numbers to be dragged from a list of telephone numbers comprising mobile and land line telephone numbers).
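  • A minimal sketch of this class-based selection, assuming a hypothetical data model rather than the claimed implementation, might look as follows:
```python
# Hypothetical sketch: selecting an associated class of user interface elements.
from dataclasses import dataclass

@dataclass
class UiElement:
    label: str
    ui_class: str   # e.g. "mobile" or "landline"

def select_class(elements: list[UiElement], picked: UiElement) -> list[UiElement]:
    """Return the subset of elements that share the picked element's class."""
    return [e for e in elements if e.ui_class == picked.ui_class]

numbers = [
    UiElement("Alice mobile", "mobile"),
    UiElement("Alice home", "landline"),
    UiElement("Bob mobile", "mobile"),
]
# Lifting any mobile number selects all mobile numbers for dragging.
assert [e.label for e in select_class(numbers, numbers[0])] == ["Alice mobile", "Bob mobile"]
```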
  • the apparatus may be configured to enable the selected user interface elements to return to the positions in which they were selected in response to detecting the ceasing of the dragging hover gesture.
  • the apparatus may be configured, in response to detecting the ceasing of the dragging hover gesture, to enable the selected user interface elements to be positioned corresponding to the position of the stylus when the dragging hover gesture was detected to have ceased.
  • the apparatus may be configured to enable ceasing of the dragging hover gesture in response to one or more of: the user providing a particular dragging termination gesture; the user providing a contact interaction with the graphical user interface using the stylus; and the user moving the stylus outside the predetermined hover range.
  • the dragging termination gesture may be a particular waving action (e.g. a gentle back and forth or up and down motion), a particular wagging action (e.g. a jerky to and fro or up and down motion), or a particular flicking action (e.g. a light sharp jerky stroke or movement).
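  • Purely as a hedged illustration (the disclosure does not specify how these gestures are recognised), one rough heuristic for telling the three example termination gestures apart from a short series of displacement samples could be:
```python
# Hypothetical heuristic (not from the disclosure): distinguishing the three example
# termination gestures from a short series of 1-D stylus displacement samples.

def direction_reversals(samples: list[float]) -> int:
    deltas = [b - a for a, b in zip(samples, samples[1:])]
    return sum(1 for d1, d2 in zip(deltas, deltas[1:]) if d1 * d2 < 0)

def peak_speed(samples: list[float]) -> float:
    return max(abs(b - a) for a, b in zip(samples, samples[1:]))

def classify_termination(samples: list[float]) -> str:
    """Very rough split: a flick is one sharp stroke, a wave is a gentle back-and-forth,
    a wag is a jerky back-and-forth (thresholds are illustrative only)."""
    reversals = direction_reversals(samples)
    speed = peak_speed(samples)
    if reversals == 0 and speed > 1.0:
        return "flick"
    if reversals >= 2 and speed > 1.0:
        return "wag"
    if reversals >= 2:
        return "wave"
    return "none"

assert classify_termination([0.0, 2.0]) == "flick"              # single rapid stroke
assert classify_termination([0.0, 0.3, 0.0, 0.3, 0.0]) == "wave"
assert classify_termination([0.0, 1.5, 0.0, 1.5, 0.0]) == "wag"
```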
  • the apparatus/device may be configured to provide auditory and/or haptic feedback when the user is interacting with the user interface elements (e.g. to provide feedback when the one or more user interface elements are selected; being dragged; and/or when a dragging termination gesture is provided).
  • a user interface element may be a menu item, an icon, a tile, drawing user interface elements (e.g. shapes, lines or arrows), or a page in an e-book.
  • a stylus may comprise: a finger; a gloved finger; a hand; a gloved hand; a stylus tool; a pen; a pencil; a mechanical stylus; a substantially tubular object; and a substantially cylindrical object.
  • the apparatus may be a portable electronic device, a laptop computer, a mobile phone, a Smartphone, a tablet computer, a personal digital assistant, a digital camera, a watch, a server, a non-portable electronic device, a desktop computer, a monitor, a television, an automated teller machine (ATM), a server, a wand, a pointing stick, a touchpad, a touch-screen, a mouse, a joystick or a module/circuitry for one or more of the same.
  • a method comprising:
  • enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
  • a computer program comprising computer program code, the computer program code being configured to perform at least the following: enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
  • an apparatus comprising:
  • means for enabling configured to enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
  • an apparatus comprising:
  • an enabler configured to enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
  • the present disclosure includes one or more corresponding aspects, embodiments or features in isolation or in various combinations whether or not specifically stated (including claimed) in that combination or in isolation.
  • Corresponding means and corresponding function units e.g. a determiner, an enabler for performing one or more of the discussed functions are also within the present disclosure.
  • Corresponding computer programs (which may or may not be recorded on a carrier, such as a CD or other non-transitory medium) for implementing one or more of the methods disclosed herein are also within the present disclosure and encompassed by one or more of the described example embodiments.
  • Figure 1 depicts an example apparatus embodiment comprising a number of electronic components, including memory and a processor;
  • Figure 2 depicts an example apparatus embodiment comprising a number of electronic components, including memory, a processor and a communication unit;
  • Figure 3 depicts an example apparatus embodiment comprising a number of electronic components, including memory and a processor;
  • Figures 4a-4f depict an example embodiment wherein the user is interacting with an application;
  • Figures 5a-5c depict a further example embodiment wherein the user is interacting with a home screen;
  • Figures 6a-6c depict a further example embodiment wherein the user is interacting with multiple electronic devices;
  • Figures 7a-7b illustrate an example apparatus in communication with a remote server/cloud
  • Figure 8 illustrates a flowchart according to an example method of the present disclosure
  • Figure 9 illustrates schematically a computer readable medium providing a program.
  • an electronic device it is common for an electronic device to have a user interface (which may or may not be graphically based) to allow a user to interact with the device to enter and/or interact with information.
  • the user may use a keyboard user interface to enter text, or use icons to open applications.
  • Some user interfaces include displays, such as touch screens, which can display information to the user.
  • new user interface elements may be provided to notify the user that new voicemail has become available, or a pop-up window may be provided to indicate that a software update should be downloaded.
  • providing too much information in the form of additional user interface elements may cause the graphical user interface to become cluttered.
  • the present disclosure relates to enabling, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus. This may allow the user to better and more intuitively control the position of the user interface elements provided on a display of an electronic device.
  • the user may control the position of the user interface elements with, for example, a finger or a thumb.
  • This may be particularly advantageous for portable electronic devices where multi-touch interactions (such as a pinch gesture) may be more difficult to provide as the user is generally also holding the device. That is, certain embodiments may allow the user to drag user interface elements using the same hand that is holding the device.
  • dragging hover gesture interaction may be intuitive as it replicates the effect of physical glue. That is, similar to the real world interaction where a user might use a sticky stylus (e.g. one covered with glue) to lift objects and drag them to the desired position, presently disclosed embodiments may allow a user to 'lift' user interface elements (i.e. by selecting them using a lift gesture) and drag them to a desired position.
  • feature number 1 can also correspond to numbers 101, 201, 301 etc. These numbered features may appear in the figures but may not have been directly referred to within the description of these particular embodiments. These have still been provided in the figures to aid understanding of the further embodiments, particularly in relation to the features of similar earlier described embodiments.
  • Figure 1 shows an apparatus 101 comprising memory 107, a processor 108, input I and output O.
  • the apparatus 101 is an Application Specific Integrated Circuit (ASIC) for a portable electronic device with a touch sensitive display.
  • the apparatus 101 can be a module for such a device, or may be the device itself, wherein the processor 108 is a general purpose CPU of the device and the memory 107 is general purpose memory comprised by the device.
  • the input I allows for receipt of signalling to the apparatus 101 from further components, such as components of a portable electronic device (like a touch-sensitive display) or the like.
  • the output O allows for onward provision of signalling from within the apparatus 101 to further components.
  • the input I and output O are part of a connection bus that allows for connection of the apparatus 101 to further components.
  • the processor 108 is a general purpose processor dedicated to executing/processing information received via the input I in accordance with instructions stored in the form of computer program code on the memory 107.
  • the output signalling generated by such operations from the processor 108 is provided onwards to further components via the output O.
  • the memory 107 (not necessarily a single memory unit) is a computer readable medium (solid state memory in this example, but may be other types of memory such as a hard drive, ROM, RAM, Flash or the like) that stores computer program code.
  • This computer program code stores instructions that are executable by the processor 108, when the program code is run on the processor 108.
  • the internal connections between the memory 107 and the processor 108 can be understood to, in one or more example embodiments, provide an active coupling between the processor 108 and the memory 107 to allow the processor 108 to access the computer program code stored on the memory 107.
  • the input I, output O, processor 108 and memory 107 are all electrically connected to one another internally to allow for electrical communication between the respective components I, O, 108, 107.
  • the components are all located proximate to one another so as to be formed together as an ASIC, in other words, so as to be integrated together as a single chip/circuit that can be installed into an electronic device. In other examples one or more or all of the components may be located separately from one another.
  • Figure 2 depicts an apparatus 201 of a further example embodiment, such as a mobile phone.
  • the apparatus 201 may comprise a module for a mobile phone (or PDA or audio/video player), and may just comprise a suitably configured memory 207 and processor 208.
  • the apparatus in certain embodiments could be a portable electronic device, a laptop computer, a mobile phone, a Smartphone, a tablet computer, a personal digital assistant, a digital camera, a watch, a server, a non-portable electronic device, a desktop computer, a monitor, a server, a wand, a pointing stick, a touchpad, a touch-screen, a mouse, a joystick or a module/circuitry for one or more of the same.
  • the example embodiment of figure 2 comprises a display device 204 such as, for example, a Liquid Crystal Display (LCD) or touch-screen user interface.
  • the apparatus 201 of figure 2 is configured such that it may receive, include, and/or otherwise access data.
  • this example embodiment 201 comprises a communications unit 203, such as a receiver, transmitter, and/or transceiver, in communication with an antenna 202 for connecting to a wireless network and/or a port (not shown) for accepting a physical connection to a network, such that data may be received via one or more types of networks.
  • This example embodiment comprises a memory 207 that stores data, possibly after being received via antenna 202 or port or after being generated at the user interface 205.
  • the processor 208 may receive data from the user interface 205, from the memory 207, or from the communication unit 203. It will be appreciated that, in certain example embodiments, the display device 204 may incorporate the user interface 205.
  • these data may be outputted to a user of apparatus 201 via the display device 204, and/or any other output devices provided with the apparatus.
  • the processor 208 may also store the data for later use in the memory 207.
  • the memory 207 may store computer program code and/or applications which may be used to instruct/enable the processor 208 to perform functions (e.g. read, write, delete, edit or process data).
  • Figure 3 depicts a further example embodiment of an electronic device 301, such as a tablet personal computer, a portable electronic device, a portable telecommunications device, a server or a module for such a device, the device comprising the apparatus 101 of figure 1.
  • the apparatus 101 can be provided as a module for device 301 , or even as a processor/memory for the device 301 or a processor/memory for a module for such a device 301.
  • the device 301 comprises a processor 308 and a storage medium 307, which are connected (e.g. electrically and/or wirelessly) by a data bus 380.
  • This data bus 380 can provide an active coupling between the processor 308 and the storage medium 307 to allow the processor 308 to access the computer program code.
  • the components (e.g. memory, processor) of the device/apparatus may be linked via cloud computing architecture.
  • the storage device may be a remote server accessed via the internet by the processor.
  • the apparatus 101 in figure 3 is connected (e.g. electrically and/or wirelessly) to an input/output interface 370 that receives the output from the apparatus 101 and transmits this to the device 301 via data bus 380.
  • Interface 370 can be connected via the data bus 380 to a display 304 (touch-sensitive or otherwise) that provides information from the apparatus 101 to a user.
  • Display 304 can be part of the device 301 or can be separate.
  • the device 301 also comprises a processor 308 configured for general control of the apparatus 101 as well as the device 301 by providing signalling to, and receiving signalling from, other device components to manage their operation.
  • the storage medium 307 is configured to store computer code configured to perform, control or enable the operation of the apparatus 101.
  • the storage medium 307 may be configured to store settings for the other device components.
  • the processor 308 may access the storage medium 307 to retrieve the component settings in order to manage the operation of the other device components.
  • the storage medium 307 may be a temporary storage medium such as a volatile random access memory.
  • the storage medium 307 may also be a permanent storage medium such as a hard disk drive, a flash memory, a remote server (such as cloud storage) or a nonvolatile random access memory.
  • the storage medium 307 could be composed of different combinations of the same or different memory types.
  • Figures 4a-4d depict a user holding an example embodiment comprising a portable electronic communications device 401, e.g. a mobile phone, with a user interface comprising a touch screen user interface 405, 404, a memory (not shown), a processor (not shown) and an antenna (not shown) for transmitting and/or receiving data (e.g. emails, textual messages, phone calls, information corresponding to web pages).
  • the user is using an augmented reality application to view the surrounding scene.
  • the augmented reality application is configured to take live image data collected by a camera (not shown) of the electronic device 401 and display it on a touch sensitive screen 404, 405.
  • the augmented reality application is also configured to overlay place user interface element icons 421a-421d on the displayed image, the place user interface element icons 421a-421d being associated with a place of interest (e.g. a theatre, shop or train station) and positioned on the screen with respect to the image so as to indicate the direction of the place of interest from the point of view of the user.
  • place icons corresponding to places which are closer to the user are configured to overlie place icons which are in the same direction but correspond to places which are farther away.
  • each user interface element icon is categorised as belonging to a particular class (e.g. a restaurant class comprising, in this case, two restaurant user interface element icons 421b, 421c).
  • the user is also provided with two dragging mode user interface element control icons 431, 432 which allow the user to control how his interactions with the place user interface element icons are interpreted by the device/apparatus.
  • selection of one of the dragging mode user interface elements 431, 432 may be considered to enable configuring the graphical user interface to be in a corresponding respective dragging mode. That is, a dragging mode is enabled in response to the user interacting with a particular dragging mode user interface element displayed on the graphical user interface.
  • when the single item dragging mode icon 431 is selected, the user can select and drag a single icon, whereas when the class dragging mode icon 432 is selected, the user can select and drag a class of user interface elements (e.g. all of the restaurant user interface element icons 421b, 421c), the selected class being an associated subset of the graphical user interface elements 421a-421d available for selection.
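  • As a rough sketch of how these two modes might dispatch a selection (data structures and mode names are assumptions, not taken from the disclosure):
```python
# Hypothetical sketch: dispatching selection on the active dragging mode.
def elements_to_drag(mode: str, picked: dict, all_icons: list[dict]) -> list[dict]:
    """Mode 431 ('single'): drag only the picked icon; mode 432 ('class'): drag its class."""
    if mode == "single":
        return [picked]
    if mode == "class":
        return [icon for icon in all_icons if icon["cls"] == picked["cls"]]
    raise ValueError(f"unknown dragging mode: {mode}")

icons = [{"id": "421b", "cls": "restaurant"}, {"id": "421c", "cls": "restaurant"},
         {"id": "421d", "cls": "station"}]
assert len(elements_to_drag("class", icons[0], icons)) == 2   # both restaurant icons
assert len(elements_to_drag("single", icons[0], icons)) == 1
```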
  • the user is looking for a museum user interface element. From the displayed image, as shown in figure 4a, the user can see that there is a user interface element icon 421d partially obscured by the two restaurant user interface elements 421b, 421c.
  • the user wishes to move the restaurant user interface element icons 421b, 421c away from their current position. That is, the user wishes to remove all of the place icons of the restaurant class. Therefore the user selects the class dragging mode icon 432 to enable interactions with classes of the user interface elements.
  • the apparatus/device is configured to enable selection of one or more user interface elements for dragging in response to detecting a lift gesture interaction with at least one of the one or more selected user interface elements, the lift gesture interaction comprising a contact interaction with the at least one user interface element using the stylus (in this case, the user's thumb 491) followed by lifting the stylus away from the at least one user interface element.
  • the apparatus/device is configured to enable selection of a class of place user interface elements in response to detecting a lift gesture interaction with one of the place user interface elements of that class.
  • the apparatus is configured to enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range (e.g. between 0.5-3cm above the screen) of the graphical user interface using a stylus (in this case, the user's thumb 491).
  • the apparatus is also configured to reduce the size of the dragged one or more user interface elements 421b, 421c. This allows more of the surrounding scene to be viewed.
  • the selection of the dragging mode, the selection of the user interface elements for dragging and the dragging of the selected user interface elements may be performed using a single gesture interaction 441. That is, this may be an efficient way of controlling the position of the user interface elements.
  • the apparatus is configured to enable ceasing the dragging hover gesture 441y in response to the user providing a particular dragging termination gesture.
  • the user can provide a number of different dragging termination gestures depending on how the user wishes to control the one or more dragged user interface elements.
  • the user recognises that the previously obscured place user interface element icon 421d indicates the direction of the train station rather than the museum he is looking for. The user therefore simply wishes to return the dragged restaurant place user interface element icons 421b, 421c back to their original positions.
  • the user provides a particular wagging gesture 442 by moving his finger back and forth in an x-y plane above, and parallel to, the screen.
  • the apparatus/device is configured to enable the selected user interface elements to return to the positions in which they were selected in response to detecting the ceasing of the dragging hover gesture. This is shown in figure 4c.
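  • One hypothetical way to support this return-to-origin behaviour is a drag session that remembers where each selected element was picked up (a sketch, not the disclosed implementation):
```python
# Hypothetical sketch: a drag session that remembers where each selected element was
# picked up, so a 'wag' termination can put everything back (figure 4c behaviour).
class DragSession:
    def __init__(self, positions: dict[str, tuple[float, float]]):
        self.origin = dict(positions)          # positions at selection time
        self.current = dict(positions)

    def drag_to(self, x: float, y: float) -> None:
        # All selected elements follow the stylus; per-element offsets ignored here.
        self.current = {name: (x, y) for name in self.current}

    def terminate(self, how: str) -> dict[str, tuple[float, float]]:
        if how == "wag":                        # return elements to their original positions
            return dict(self.origin)
        if how == "flick":                      # remove the dragged elements from the screen
            return {}
        return dict(self.current)               # e.g. contact: fix at the current position

session = DragSession({"421b": (10.0, 40.0), "421c": (12.0, 42.0)})
session.drag_to(50.0, 5.0)
assert session.terminate("wag") == {"421b": (10.0, 40.0), "421c": (12.0, 42.0)}
```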
  • example embodiments may be configured to allow the user to select multiple user interface elements for hover dragging by successively selecting each of the multiple user interface elements (e.g. using successive lifting gestures) prior to the dragging of the user interface elements by hover dragging.
  • the user then moves the phone so that the camera is pointing more towards the left to see if the museum is in that direction.
  • the augmented reality application is then configured to update the live display collected by a camera (not shown) of the electronic device 401 and overlay place icons 421a, 421e-g on the updated image, the icons being associated with places of interest (e.g. a theatre, shop or train station).
  • each user interface element icon 421a, 421e-g is categorised as belonging to a particular class (e.g. a shop class comprising, in this case, two shop user interface element icons).
  • the user is still looking for the museum user interface element icon and can see that there is a user interface element icon 421f partially obscured by a shop user interface element 421g.
  • the user wishes to move the obscuring shop user interface element icon 421g away from its current position. That is, the user wishes to remove a single place user interface element icon. Therefore the user selects the single item dragging mode icon 431 to enable interactions with single place user interface element icons.
  • the apparatus/device is configured to enable selection of the shop user interface element place icon 421g in response to detecting a lift gesture interaction 443x with the shop user interface element 421g, the lift gesture interaction 443x comprising a contact interaction with the shop user interface element 421g using the stylus (the user's thumb 491 in this case) followed by lifting the stylus 491 away from the shop user interface element 421g.
  • That is, when the user provides a contact interaction with one of the shop place user interface elements 421g (the one obscuring the museum place icon, in this case) and lifts his thumb 491 away from the screen (e.g. in the z direction), only that user interface element 421g is selected.
  • the apparatus is configured to enable, during a dragging hover gesture interaction 443y, corresponding dragging of the selected user interface element, the dragging hover gesture interaction 443y being provided within a predetermined hover range (e.g. between 0.5-3cm above the screen) of the graphical user interface using a stylus 491 (in this case, the stylus is the user's thumb). That is, during the time that the user provides the dragging hover gesture interaction 443y towards the top of the screen, the selected shop place user interface element icon 421g is correspondingly dragged towards the top of the screen (within the x-y plane of the screen).
  • the apparatus is configured to reduce the size of the dragged user interface element place icon 421g (although resizing may not occur in other embodiments). This allows more of the surrounding scene to be viewed.
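  • A small sketch of this shrink-while-dragging behaviour (the scale factor and structure are assumptions):
```python
# Hypothetical sketch: an icon follows the stylus x-y position while hover-dragged
# and is drawn smaller so that more of the underlying scene stays visible.
DRAG_SCALE = 0.5   # assumed reduction factor while an icon is being dragged

def render_state(icon_pos, stylus_xy, dragging: bool):
    """Return (position, scale) to draw the icon with on the next frame."""
    if dragging:
        return stylus_xy, DRAG_SCALE     # track the stylus, reduced size
    return icon_pos, 1.0                 # settled: original size at its own position

assert render_state((10, 40), (55, 8), dragging=True) == ((55, 8), 0.5)
assert render_state((10, 40), (55, 8), dragging=False) == ((10, 40), 1.0)
```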
  • the apparatus is configured to enable ceasing the dragging hover gesture 443y in response to the user providing a particular dragging termination gesture.
  • the user can provide a number of different dragging termination gestures depending on how the user wishes to control the dragged icons.
  • the user recognises that the previously obscured place user interface element icon indicates the direction of the desired museum.
  • the user therefore wishes to remove the dragged shop place user interface element icon 421g from the display 404 such that it no longer obscures the desired museum place user interface element icon 421f.
  • the user provides a particular flicking gesture 444 (e.g. a single rapid movement away from the screen).
  • the apparatus/device is configured to enable the dragged user interface element 421g to be removed from the screen 404. This is shown in figure 4f.
  • example embodiments may be configured to detect other particular dragging termination gestures (e.g. a particular wagging action). It will be appreciated that other example embodiments may be configured to enable ceasing of the dragging hover gesture in response to one or more of: the user providing a contact interaction with the graphical user interface using the stylus; and the user moving the stylus outside the predetermined hover range.
  • example embodiments may allow the user to control the dragging mode of the hover gestures in different ways.
  • other example embodiments may be configured to associate particular dragging modes with different styli (e.g. different fingers). For example, when the user uses his thumb to select items, the single selection dragging mode may be used, whereas when the user uses his (e.g. index) finger as the stylus, the class dragging mode may be used.
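  • For example, assuming the sensing layer can report which stylus is in use (see the discussion of figure 5b below), such a mapping might be as simple as:
```python
# Hypothetical sketch: associating particular styli with particular dragging modes.
STYLUS_MODES = {
    "thumb": "single",        # thumb as stylus -> single-element dragging mode
    "index_finger": "class",  # index finger as stylus -> class dragging mode
}

def dragging_mode_for(stylus_id: str) -> str:
    return STYLUS_MODES.get(stylus_id, "single")   # assumed default mode

assert dragging_mode_for("index_finger") == "class"
assert dragging_mode_for("mechanical_stylus") == "single"
```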
  • the apparatus/device may be configured such that the size of the user interface elements selected for dragging (e.g. the area, and/or number of user interface elements selected) may correspond to the size of the user interaction (e.g. the contact area between the stylus and the screen).
  • a bigger size of user interface elements may be selected for dragging by a bigger size of selecting user interaction. For example, a user interaction with a smaller size (e.g. where the finger is oriented to be approximately normal to the screen such that there is a small contact area) may select fewer or smaller user interface elements than a user interaction with a larger size (e.g. where the finger stylus is oriented to be at an angle to the screen to provide a larger contact area).
  • Another example embodiment may be configured such that a multi-touch user interaction with a particular user interface element selects the class of user interface elements to which the particular user interface element belongs, whereas a single touch user interaction would just select the particular user interface element itself.
  • the user may control which user interface elements are selected for dragging by adjusting the size of the selecting user interaction. This may be intuitive for the user as it may replicate the action of physical glue. That is, the larger the size of interaction, the larger the adhesive force of the 'glue' (which allows larger and/or more user interface elements to be dragged).
  • the apparatus/device may be configured such that the size of the user interface elements being dragged (e.g. the area and/or number of user interface elements selected) may correspond to the size of the dragging hover user interaction (e.g. the number or volume of styli within the hover detection range). For example, if the user was using a multi-finger dragging hover interaction to drag multiple user interface elements, and then they removed some of the fingers from the predetermined hover range, the apparatus/device may be configured to cease dragging some of the multiple user interface elements.
  • user interface elements may be associated with an effective size (e.g. based on the type and/or content associated with the user interface element). For example, a document icon associated with a longer document may be associated with a larger effective size than a document icon associated with a shorter document. Similarly, different styli may be associated with different effective sizes. For example, a mechanical stylus may be associated with a larger effective size than a finger stylus.
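  • Tying the last few paragraphs together, a hedged sketch of size-dependent selection might treat the interaction size as a 'glue' budget and each element's effective size as the amount of budget it consumes (all values are illustrative assumptions):
```python
# Hypothetical sketch of the 'glue' idea: a larger contact area gives a larger
# selection budget, and each element consumes budget according to an effective size.
def select_by_interaction_size(contact_area_cm2: float,
                               candidates: list[tuple[str, float]]) -> list[str]:
    """candidates: (element name, effective size). Elements nearest the stylus would be
    tried first in practice; here they are simply taken in list order."""
    budget = contact_area_cm2 * 4.0          # assumed scaling from area to 'adhesive force'
    picked, used = [], 0.0
    for name, effective_size in candidates:
        if used + effective_size <= budget:
            picked.append(name)
            used += effective_size
    return picked

docs = [("short_doc", 1.0), ("long_doc", 3.0), ("another_short_doc", 1.0)]
assert select_by_interaction_size(0.5, docs) == ["short_doc", "another_short_doc"]   # fingertip
assert select_by_interaction_size(1.5, docs) == ["short_doc", "long_doc", "another_short_doc"]
```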
  • Figures 5a-5c depict an example embodiment comprising a portable electronic communications device 501, e.g. a mobile phone, with a user interface comprising a touch screen user interface 505, 504, a memory (not shown), a processor (not shown) and an antenna (not shown) for transmitting and/or receiving data (e.g. emails, textual messages, phone calls, information corresponding to web pages).
  • the device is displaying the home screen.
  • the home screen comprises a number of tile user interface elements including a navigation tile user interface element 521a which may be used to open a navigation application.
  • the apparatus is configured to enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range (e.g. between 1-4cm above the screen) of the graphical user interface using a stylus (in this case, the stylus is the user's finger 591).
  • the apparatus/device is configured to enable selection of the navigation tile user interface element 521a for dragging in response to detecting a lift gesture interaction 545x with the navigation user interface element tile, the lift gesture interaction 545x comprising a contact interaction with the at least one user interface element using the stylus followed by lifting the stylus away from the contacted user interface element (in the z direction).
  • the apparatus is configured to enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range (e.g. between 1-4cm above the screen) of the graphical user interface using a stylus (in this case, the stylus is the user's thumb). That is, during the time that the user provides a dragging hover gesture 545y, the user interface elements selected for dragging are correspondingly dragged along the screen (within the x-y plane of the screen). Unlike the previous embodiment, the apparatus/device is not configured to change the size of the icons as they are dragged along. It will be appreciated that the selection of user interface elements for dragging and the dragging of the user interface elements may be performed using a single user interaction 545.
  • the apparatus is configured to enable ceasing the dragging hover gesture interaction 545y in response to the user providing a particular dragging termination gesture.
  • Figure 5b illustrates the way in which the user's dragging hover gesture interaction is detected in this case.
  • the apparatus/device 501 is shown from the side with the display/user interface 504 facing to the viewer's left.
  • the apparatus/device 501 is configured to determine hover gesture interactions using a 3-D capacitive sensing user interface.
  • a 3-D capacitive sensing user interface may be considered to be 3-D map capable sensor technology, which can detect objects hovering within a predetermined distance within a capacitive field 506 of the display, to make 3-D capacitive heatmaps/scanning images.
  • the predetermined working distance in which objects may be detected may be up to 3cm away from the surface of the display screen in certain examples. It will be appreciated that other example embodiments may have different predetermined working distances (e.g. between 0.5 and 5cm).
  • This sensing technology is also able to sense and determine the position of a user's hand at a distance from the screen even if he is wearing gloves, thereby providing an advantage over, for example, a touch-sensitive screen which requires skin contact with the display screen.
  • the sensing technology may be able to distinguish between different styli (e.g. different mechanical styli, different fingers and thumbs).
  • the capacitive sensing technology may be called 3-D touch, hovering touch or touchless touch, and may comprise the capacitive display 504 of the device 501 in communication with a host computer/processor. Even in the case of a flat display (e.g. one which may be considered to have one exposed plane as the screen of the device 501), the capacitive field 506 can detect objects such as fingers and thumbs at the edges/sides of the device 501 and interface 504 as it can detect objects at a distance away from a direction perpendicular to the exposed plane (at a distance in the z direction away from the plane).
  • the 3-D capacitive sensor need not be part of a display screen 504, and may be integrated at any place of the device 501 , such as at the sides or on the back.
  • Touch controller algorithms can detect the user's hand/finger position in relation to the display 504 from changes in the capacitive field 506.
  • Another method is to transfer capacitive raw data to a host computer/processor from the apparatus/device 501 and run a hand detection algorithm at the host.
  • the capacitive raw data information can be transferred to the host from a touch controller interface of the apparatus/device 501.
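  • As a hypothetical host-side sketch (the actual touch controller algorithms are not disclosed here), the hover position might be estimated from such raw capacitive data roughly as follows:
```python
# Hypothetical host-side sketch: estimating the stylus x-y position and a rough
# hover strength from raw capacitive heatmap data transferred from the touch controller.
def estimate_hover(heatmap, noise_floor=0.5):
    """Return ((row, col) centroid of above-threshold cells, peak signal) or None.
    A weaker peak loosely corresponds to a larger hover distance in this sketch."""
    cells = [(r, c, v) for r, row in enumerate(heatmap)
             for c, v in enumerate(row) if v > noise_floor]
    if not cells:
        return None                       # nothing within the capacitive field
    total = sum(v for _, _, v in cells)
    row = sum(r * v for r, _, v in cells) / total
    col = sum(c * v for _, c, v in cells) / total
    peak = max(v for _, _, v in cells)
    return (row, col), peak

frame = [[0, 1, 0],
         [1, 8, 1],
         [0, 1, 0]]
pos, peak = estimate_hover(frame)
assert pos == (1.0, 1.0) and peak == 8    # finger hovering over the centre of the sensor
```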
  • the user therefore wishes to fix the position of the dragged user interface element 521a.
  • the user provides a contact gesture by touching the stylus onto the screen.
  • the apparatus/device is configured to enable the selected user interface element to be fixed in the position where it was when the contact gesture was provided. This is shown in figure 5c.
  • some embodiments may be configured to enable corresponding hover dragging of one or more selected user interface elements using a non-portable electronic device with a hover sensitive screen (e.g. a tablet computer or laptop).
  • Figures 6a-6c depict two example embodiments, each comprising a portable electronic communications device 601, 602, e.g. a mobile phone, with a user interface comprising a touch screen user interface 605, 604, a memory (not shown), a processor (not shown) and an antenna (not shown) for transmitting and/or receiving data (e.g. emails, textual messages, phone calls, information corresponding to web pages).
  • the apparatus/devices are configured to allow the user to drag user interface elements to other connected devices.
  • Devices connected in this way may be considered to be separate components of the same composite device.
  • the first electronic device 601 is the user's home phone and the second electronic device 602 is the user's work phone.
  • the two electronic devices have been connected (e.g. via a Bluetooth connection).
  • the connection between the devices allows the devices to share information regarding user provided interactions.
  • each individual device is providing a respective list of contacts, the list of contacts comprising a number of contact user interface elements.
  • the user wishes to move the 'Dan' contact user interface element 621a from his home phone electronic device 601 to his work phone electronic device 602.
  • Figure 6a shows the situation when the 'Dan' contact user interface element 621a has been selected.
  • the apparatus is configured to enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range (e.g. between 0.5-3cm above the screen) of the graphical user interface using a stylus (in this case, the stylus is the user's thumb). That is, during the time that the user provides a dragging hover gesture, the selected user interface elements are correspondingly dragged along the screen (within the x-y plane of either of the screens).
  • Figure 6b shows the situation as the user is dragging 646y the contact user interface element between the two user electronic devices.
  • the second electronic device is configured to display at least a portion of the dragged user interface element based on information detected by the capacitive touch screen user interface of the first electronic device and transmitted to the second electronic device via the Bluetooth connection. That is, the two devices 601, 602 are configured to share information relating to the position of the stylus above the two screens.
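  • A speculative sketch of such position sharing between the two devices (the message format and side-by-side screen geometry are assumptions, not part of the disclosure):
```python
# Hypothetical sketch: the device sensing the stylus shares its hover position
# (normalised to its own screen) so the neighbouring device can decide whether the
# dragged element has crossed onto its display. Message format is assumed.
import json

def make_hover_message(element_id: str, x_norm: float, y_norm: float) -> str:
    """x_norm runs 0..1 across the sensing device's screen; >1 means past its right edge."""
    return json.dumps({"type": "hover_drag", "element": element_id,
                       "x": x_norm, "y": y_norm})

def portion_on_second_device(message: str) -> float:
    """Fraction of the dragged element the receiving (second) device should display,
    assuming side-by-side screens and an element one screen-tenth wide."""
    data = json.loads(message)
    overlap = data["x"] + 0.05 - 1.0          # how far the element's right half crosses the edge
    return max(0.0, min(1.0, overlap / 0.1))

msg = make_hover_message("contact_dan_621a", x_norm=0.98, y_norm=0.4)
assert 0.0 < portion_on_second_device(msg) < 1.0   # element straddles the two screens
```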
  • the apparatus is configured to enable ceasing the dragging hover gesture in response to the user providing a particular dragging termination gesture.
  • the user wishes to fix the position of the dragged user interface element on the second electronic device.
  • the user provides a release gesture by moving the stylus outside the predetermined hover range. This is shown in figure 6c.
  • the apparatus/device is configured to enable the selected user interface element to be fixed in the position where it was when the release gesture was provided. This is shown in figure 6c.
  • Figure 7a shows an example embodiment of an apparatus in communication with a remote server.
  • Figure 7b shows an example embodiment of an apparatus in communication with a "cloud" for cloud computing.
  • apparatus 701 (which may be apparatus 101, 201 or 301) is in communication with a display 704.
  • the apparatus 701 and display 704 may form part of the same apparatus/device, although they may be separate as shown in the figures.
  • the apparatus 701 is also in communication with a remote computing element. Such communication may be via a communications unit, for example.
  • Figure 7a shows the remote computing element to be a remote server 795, with which the apparatus may be in wired or wireless communication.
  • the apparatus 701 is in communication with a remote cloud 796 (which may, for example, be the Internet, or a system of remote computers configured for cloud computing).
  • the enabling of a selection and/or dragging a user interface element may be performed at the remote computing element 795, 796.
  • the apparatus 701 may actually form part of the remote server 795 or remote cloud 796.
  • Figure 8 illustrates the process flow according to an example embodiment of the present disclosure.
  • the process comprises enabling 881, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
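  • The overall flow of figure 8, together with the selection and termination steps described earlier, can be summarised as a small state machine (states and event names are assumptions used only to tie the steps together):
```python
# Hypothetical sketch: the overall flow of figure 8 as a tiny state machine.
TRANSITIONS = {
    ("idle", "lift_gesture"): "selected",           # element(s) picked up
    ("selected", "enter_hover_range"): "dragging",  # block 881: dragging follows the hover gesture
    ("dragging", "termination_gesture"): "idle",
    ("dragging", "contact"): "idle",
    ("dragging", "left_hover_range"): "idle",
}

def step(state: str, event: str) -> str:
    return TRANSITIONS.get((state, event), state)   # unknown events leave the state unchanged

state = "idle"
for event in ["lift_gesture", "enter_hover_range", "termination_gesture"]:
    state = step(state, event)
assert state == "idle"
```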
  • Figure 9 illustrates schematically a computer/processor readable medium 900 providing a program according to an embodiment.
  • the computer/processor readable medium is a disc such as a Digital Versatile Disc (DVD) or a compact disc (CD).
  • the computer readable medium may be any medium that has been programmed in such a way as to carry out the functionality herein described.
  • the computer program code may be distributed between multiple memories of the same type, or multiple memories of a different type, such as ROM, RAM, flash, hard disk, solid state, etc.
  • Any mentioned apparatus/device/server and/or other features of particular mentioned apparatus/device/server may be provided by apparatus arranged such that they become configured to carry out the desired operations only when enabled, e.g. switched on, or the like. In such cases, they may not necessarily have the appropriate software loaded into the active memory in the non-enabled (e.g. switched off state) and only load the appropriate software in the enabled (e.g. on state).
  • the apparatus may comprise hardware circuitry and/or firmware.
  • the apparatus may comprise software loaded onto memory.
  • Such software/computer programs may be recorded on the same memory/processor/functional units and/or on one or more memories/processors/ functional units.
  • a particular mentioned apparatus/device/server may be pre-programmed with the appropriate software to carry out desired operations, and wherein the appropriate software can be enabled for use by a user downloading a "key", for example, to unlock/enable the software and its associated functionality.
  • Advantages associated with such embodiments can include a reduced requirement to download data when further functionality is required for a device, and this can be useful in examples where a device is perceived to have sufficient capacity to store such pre-programmed software for functionality that may not be enabled by a user.
  • Any mentioned apparatus/circuitry/elements/processor may have other functions in addition to the mentioned functions, and that these functions may be performed by the same apparatus/circuitry/elements/processor.
  • One or more disclosed aspects may encompass the electronic distribution of associated computer programs and computer programs (which may be source/transport encoded) recorded on an appropriate carrier (e.g. memory, signal).
  • Any "computer” described herein can comprise a collection of one or more individual processors/processing elements that may or may not be located on the same circuit board, or the same region/position of a circuit board or even the same device.
  • one or more of any mentioned processors may be distributed over a plurality of devices.
  • the same or different processor/processing elements may perform one or more functions described herein.
  • signal may refer to one or more signals transmitted as a series of transmitted and/or received electrical/optical signals.
  • the series of signals may comprise one, two, three, four or even more individual signal components or distinct signals to make up said signalling. Some or all of these individual signals may be transmitted/received by wireless or wired communication simultaneously, in sequence, and/or such that they temporally overlap one another.
  • processors and memory may comprise a computer processor, Application Specific Integrated Circuit (ASIC), field-programmable gate array (FPGA), and/or other hardware components that have been programmed in such a way as to carry out the inventive function.

Abstract

An apparatus comprising: a processor; and a memory including computer program code, the memory and the computer program code configured, with the processor, to cause the apparatus to perform at least the following: enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.

Description

USER INTERFACES AND ASSOCIATED METHODS FOR CONTROLLING USER INTERFACE ELEMENTS
Technical Field
The present disclosure relates to user interfaces and user interface elements, associated methods, computer programs and apparatus. Certain disclosed aspects/embodiments relate to portable electronic devices, in particular, so-called hand-portable electronic devices which may be hand-held in use (although they may be placed in a cradle in use). Such hand-portable electronic devices include so-called Personal Digital Assistants (PDAs), mobile telephones, smartphones and other smart devices, and tablet PCs.
The portable electronic devices/apparatus according to one or more disclosed aspects/embodiments may provide one or more audio/text/video communication functions (e.g. tele-communication, video-communication, and/or text transmission (Short Message Service (SMS)/ Multimedia Message Service (MMS)/emailing) functions), interactive/non-interactive viewing functions (e.g. web-browsing, navigation, TV/program viewing functions), music recording/playing functions (e.g. MP3 or other format and/or (FM/AM) radio broadcast recording/playing), downloading/sending of data functions, image capture function (e.g. using a (e.g. in-built) digital camera), and gaming functions.
Background
It is common for electronic devices to provide a user interface (e.g. a graphical user interface). A user interface may enable a user to interact with an electronic device, for example, to open applications using application icons, enter commands, to select menu items from a menu, or to enter characters using a virtual keypad.
The listing or discussion of a prior-published document or any background in this specification should not necessarily be taken as an acknowledgement that the document or background is part of the state of the art or is common general knowledge. One or more aspects/embodiments of the present disclosure may or may not address one or more of the background issues.
Summary
In a first aspect there is provided an apparatus comprising:
a processor; and
a memory including computer program code, the memory and the computer program code configured, with the processor, to cause the apparatus to perform at least the following:
enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
Thus the dragging hover gesture provides for corresponding dragging of the one or more selected user interface elements.
The apparatus may be configured to enable selection of one or more user interface element in response to detecting a lift gesture interaction with at least one of the one or more selected user interface elements. The lift gesture interaction may comprise a contact interaction with the at least one user interface element using the stylus followed by lifting the stylus away from the at least one user interface element. The lift gesture interaction may comprise moving the stylus away from the at least one user interface element (e.g. with or without an initial contact interaction with the screen). That is, if the plane of the screen lies in an x-y plane in an orthogonal coordinate system, the lift gesture interaction may be aligned with the z-axis of the coordinate system. It will be appreciated that for embodiments with non-planar screens the lift gesture may be away from the screen.
The predetermined hover range may comprise a range of distances above the screen (e.g. between 0.1 and 3cm in the z-direction above the screen). It will be appreciated that a hover gesture interaction may be considered to exclude contact interactions (i.e. interactions where the stylus is touching the graphical user interface).
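By way of illustration only, the following Python sketch shows one possible way of classifying a stylus height into contact, hover and out-of-range bands and of recognising a lift gesture from a short trace of stylus samples. The thresholds, sample structure and function names are assumptions introduced here for clarity; they are not prescribed by the embodiments.

```python
from dataclasses import dataclass

# Illustrative hover band: stylus heights (in cm above the screen) treated as "hovering".
HOVER_MIN_CM = 0.1
HOVER_MAX_CM = 3.0

@dataclass
class StylusSample:
    x: float      # x position on/over the screen (screen units)
    y: float      # y position on/over the screen
    z_cm: float   # height above the screen surface; 0.0 means contact

def classify_height(z_cm: float) -> str:
    """Map a stylus height to 'contact', 'hover' or 'out_of_range'."""
    if z_cm <= 0.0:
        return "contact"
    if HOVER_MIN_CM <= z_cm <= HOVER_MAX_CM:
        return "hover"
    return "out_of_range"

def is_lift_gesture(samples: list[StylusSample]) -> bool:
    """A lift gesture here means: a contact sample followed by samples that
    rise into the hover band over roughly the same screen location."""
    if len(samples) < 2:
        return False
    start, end = samples[0], samples[-1]
    touched = classify_height(start.z_cm) == "contact"
    now_hovering = classify_height(end.z_cm) == "hover"
    stayed_put = abs(end.x - start.x) < 5 and abs(end.y - start.y) < 5  # small x-y drift
    return touched and now_hovering and stayed_put

if __name__ == "__main__":
    trace = [StylusSample(100, 200, 0.0), StylusSample(101, 201, 0.8), StylusSample(100, 202, 1.5)]
    print(classify_height(trace[-1].z_cm))  # hover
    print(is_lift_gesture(trace))           # True -> select the element under (100, 200)
```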
The selection dragging hover gesture interaction may be enabled by configuring the graphical user interface to be in a dragging mode. The apparatus may be configured to enable the dragging mode of the graphical user interface in response to the user interacting with a particular dragging mode user interface control element (e.g. a virtual 'glue pot') displayed on the graphical user interface. There may be different dragging mode user interface elements associated with different functions. For example, one dragging mode user interface control element may allow user interface elements associated with one or more of: people, numbers or specific topics to be selected. For multiple devices which are connected, the dragging hover gesture interaction may allow the user to drag the one or more selected user interface elements between displays of the respective multiple devices. The selection dragging hover gesture interaction may be enabled when a particular application is active. The selection dragging hover gesture interaction may be enabled by the operating system (e.g. it may be available to multiple different applications and/or for user interface elements displayed on a home screen or in a locked mode). The apparatus may be configured to enable dragging of a selected class of user interface elements, the selected class being an associated subset of the graphical user interface elements available for selection (e.g. allowing mobile telephone numbers to be dragged from a list of telephone numbers comprising mobile and land line telephone numbers). The apparatus may be configured to enable the selected user interface elements to return to the positions in which they were selected in response to detecting the ceasing of the dragging hover gesture.
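A minimal sketch, assuming a simple in-memory list of elements, of how single-item and class-based dragging modes might map a stylus position onto a set of selected elements; the element model, hit radius and mode names are illustrative assumptions rather than features taken from the embodiments.

```python
from dataclasses import dataclass

@dataclass
class UIElement:
    name: str
    cls: str          # e.g. "restaurant", "theatre", "station"
    x: float
    y: float

def elements_under_stylus(elements, x, y, radius=20.0):
    """Elements whose centre lies within `radius` of the stylus position."""
    return [e for e in elements if (e.x - x) ** 2 + (e.y - y) ** 2 <= radius ** 2]

def select_for_dragging(elements, x, y, mode):
    """mode == 'single': only the uppermost element hit by the stylus is selected.
    mode == 'class': every element sharing that element's class is selected."""
    hit = elements_under_stylus(elements, x, y)
    if not hit:
        return []
    top = hit[0]                      # assume list order encodes z-order (first = uppermost)
    if mode == "single":
        return [top]
    return [e for e in elements if e.cls == top.cls]

if __name__ == "__main__":
    icons = [UIElement("theatre", "theatre", 40, 40),
             UIElement("restaurant A", "restaurant", 120, 80),
             UIElement("restaurant B", "restaurant", 130, 90),
             UIElement("station", "station", 125, 85)]
    print([e.name for e in select_for_dragging(icons, 121, 81, "class")])
    # -> ['restaurant A', 'restaurant B']
```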
The apparatus may be configured, in response to detecting the ceasing of the dragging hover gesture, to enable the selected user interface elements to be positioned corresponding to the position of the stylus when the dragging hover gesture was detected to have ceased.
The apparatus may be configured to enable ceasing of the dragging hover gesture in response to one or more of:
the user providing a particular dragging termination gesture;
the user providing a contact interaction with the graphical user interface using the stylus; and
the user moving the stylus outside the predetermined hover range.
The dragging termination gesture may be a particular waving action (e.g. a gentle back and forth or up and down motion), a particular wagging action (e.g. a jerky to and fro or up and down motion), or a particular flicking action (e.g. a light sharp jerky stroke or movement). The apparatus/device may be configured to provide auditory and/or haptic feedback when the user is interacting with the user interface elements (e.g. to provide feedback when the one or more user interface elements are selected; being dragged; and/or when a dragging termination gesture is provided). A user interface element may be a menu item, an icon, a tile, a drawing user interface element (e.g. a shape, line or arrow), or a page in an e-book.
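The following sketch illustrates, under assumed thresholds, how recent stylus samples might be classified into the termination events listed above (a contact interaction, leaving the hover range, a wagging action or a flicking action). The sample format and numeric limits are placeholders chosen only for this illustration.

```python
def classify_termination(samples, hover_max_cm=3.0):
    """Given recent stylus samples as (x, y, z_cm) tuples, decide whether the
    dragging hover gesture has ceased and, if so, how.  Thresholds are illustrative."""
    if not samples:
        return None
    _, _, z = samples[-1]
    if z <= 0.0:
        return "contact"            # stylus touched the screen: fix the element here
    if z > hover_max_cm:
        return "out_of_range"       # stylus left the predetermined hover range
    if len(samples) < 2:
        return None
    # A wagging action shows up as repeated reversals of x-direction within the hover band.
    reversals, last_dx = 0, 0.0
    for (x0, _, _), (x1, _, _) in zip(samples, samples[1:]):
        dx = x1 - x0
        if dx * last_dx < 0:
            reversals += 1
        if dx != 0:
            last_dx = dx
    if reversals >= 3:
        return "wag"                # e.g. return elements to their original positions
    # A flicking action shows up as one large jump between the last two samples.
    (xp, yp, _), (xl, yl, _) = samples[-2], samples[-1]
    if ((xl - xp) ** 2 + (yl - yp) ** 2) ** 0.5 > 60:
        return "flick"              # e.g. remove the dragged element from the screen
    return None                     # gesture still in progress
```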
A stylus may comprise: a finger; a gloved finger; a hand; a gloved hand; a stylus tool; a pen; a pencil; a mechanical stylus; a substantially tubular object; and a substantially cylindrical object.
The apparatus may be a portable electronic device, a laptop computer, a mobile phone, a Smartphone, a tablet computer, a personal digital assistant, a digital camera, a watch, a server, a non-portable electronic device, a desktop computer, a monitor, a television, an automated teller machine (ATM), a server, a wand, a pointing stick, a touchpad, a touch-screen, a mouse, a joystick or a module/circuitry for one or more of the same.
According to a further aspect, there is provided a method, the method comprising:
enabling, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
According to a further aspect, there is provided a computer program comprising computer program code, the computer program code being configured to perform at least the following: enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
According to a further aspect there is provided an apparatus comprising:
means for enabling configured to enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
According to a further aspect there is provided an apparatus comprising:
an enabler configured to enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
The present disclosure includes one or more corresponding aspects, embodiments or features in isolation or in various combinations whether or not specifically stated (including claimed) in that combination or in isolation. Corresponding means and corresponding function units (e.g. a determiner, an enabler) for performing one or more of the discussed functions are also within the present disclosure.
Corresponding computer programs (which may or may not be recorded on a carrier, such as a CD or other non-transitory medium) for implementing one or more of the methods disclosed herein are also within the present disclosure and encompassed by one or more of the described example embodiments.
The above summary is intended to be merely exemplary and non-limiting.
Brief Description of the Figures
A description is now given, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 depicts an example apparatus embodiment comprising a number of electronic components, including memory and a processor;
Figure 2 depicts an example apparatus embodiment comprising a number of electronic components, including memory, a processor and a communication unit;
Figure 3 depicts an example apparatus embodiment comprising a number of electronic components, including memory and a processor;
Figures 4a-4f depict an example embodiment wherein the user is interacting with an application;
Figures 5a-5c depict a further example embodiment wherein the user is interacting with a home screen;
Figures 6a-6c depict a further example embodiment wherein the user is interacting with multiple electronic devices;
Figures 7a-7b illustrate an example apparatus in communication with a remote server/cloud;
Figure 8 illustrates a flowchart according to an example method of the present disclosure; and
Figure 9 illustrates schematically a computer readable medium providing a program.
Description of Example Aspects/Embodiments
It is common for an electronic device to have a user interface (which may or may not be graphically based) to allow a user to interact with the device to enter and/or interact with information. For example, the user may use a keyboard user interface to enter text, or use icons to open applications. Some user interfaces include displays, such as touch screens, which can display information to the user.
On an electronic device, it may be advantageous if new information is provided to the user. For example, new user interface elements may be provided to notify the user that new voicemail has become available, or a pop-up window may be provided to indicate that a software update should be downloaded. However, providing too much information in the form of additional user interface elements may cause the graphical user interface to become cluttered.
The present disclosure relates to enabling, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus. This may allow the user to better and more intuitively control the position of the user interface elements provided on a display of an electronic device. In addition, by allowing the user to hover drag the user interface elements with a stylus, the user may control the position of the user interface elements with, for example, a finger or a thumb. This may be particularly advantageous for portable electronic devices where multi-touch interactions (such as a pinch gesture) may be more difficult to provide as the user is generally also holding the device. That is, certain embodiments may allow the user to drag user interface elements using the same hand that is holding the device.
It will be appreciated that the dragging hover gesture interaction may be intuitive as it replicates the effect of physical glue. That is, similar to the real world interaction where a user might use a sticky stylus (e.g. one covered with glue) to lift objects and drag them to the desired position, presently disclosed embodiments may allow a user to 'lift' user interface elements (i.e. by selecting them using a lift gesture) and drag them to a desired position.
Other embodiments depicted in the figures have been provided with reference numerals that correspond to similar features of earlier described embodiments. For example, feature number 1 can also correspond to numbers 101, 201, 301 etc. These numbered features may appear in the figures but may not have been directly referred to within the description of these particular embodiments. These have still been provided in the figures to aid understanding of the further embodiments, particularly in relation to the features of similar earlier described embodiments.
Figure 1 shows an apparatus 101 comprising memory 107, a processor 108, input I and output O. In this embodiment only one processor and one memory are shown but it will be appreciated that other embodiments may utilise more than one processor and/or more than one memory (e.g. same or different processor/memory types). In this embodiment the apparatus 101 is an Application Specific Integrated Circuit (ASIC) for a portable electronic device with a touch sensitive display. In other embodiments the apparatus 101 can be a module for such a device, or may be the device itself, wherein the processor 108 is a general purpose CPU of the device and the memory 107 is general purpose memory comprised by the device.
The input I allows for receipt of signalling to the apparatus 101 from further components, such as components of a portable electronic device (like a touch-sensitive display) or the like. The output O allows for onward provision of signalling from within the apparatus 101 to further components. In this embodiment the input I and output O are part of a connection bus that allows for connection of the apparatus 101 to further components.
The processor 108 is a general purpose processor dedicated to executing/processing information received via the input I in accordance with instructions stored in the form of computer program code on the memory 107. The output signalling generated by such operations from the processor 108 is provided onwards to further components via the output O.
The memory 107 (not necessarily a single memory unit) is a computer readable medium (solid state memory in this example, but may be other types of memory such as a hard drive, ROM, RAM, Flash or the like) that stores computer program code. This computer program code stores instructions that are executable by the processor 108, when the program code is run on the processor 108. The internal connections between the memory 107 and the processor 108 can be understood to, in one or more example embodiments, provide an active coupling between the processor 108 and the memory 107 to allow the processor 108 to access the computer program code stored on the memory 107.
In this example the input I, output O, processor 108 and memory 107 are all electrically connected to one another internally to allow for electrical communication between the respective components I, O, 108, 107. In this example the components are all located proximate to one another so as to be formed together as an ASIC, in other words, so as to be integrated together as a single chip/circuit that can be installed into an electronic device. In other examples one or more or all of the components may be located separately from one another.
Figure 2 depicts an apparatus 201 of a further example embodiment, such as a mobile phone. In other example embodiments, the apparatus 201 may comprise a module for a mobile phone (or PDA or audio/video player), and may just comprise a suitably configured memory 207 and processor 208. The apparatus in certain embodiments could be a portable electronic device, a laptop computer, a mobile phone, a Smartphone, a tablet computer, a personal digital assistant, a digital camera, a watch, a server, a non-portable electronic device, a desktop computer, a monitor, a server, a wand, a pointing stick, a touchpad, a touch-screen, a mouse, a joystick or a module/circuitry for one or more of the same. The example embodiment of figure 2, in this case, comprises a display device 204 such as, for example, a Liquid Crystal Display (LCD) or touch-screen user interface. The apparatus 201 of figure 2 is configured such that it may receive, include, and/or otherwise access data. For example, this example embodiment 201 comprises a communications unit 203, such as a receiver, transmitter, and/or transceiver, in communication with an antenna 202 for connecting to a wireless network and/or a port (not shown) for accepting a physical connection to a network, such that data may be received via one or more types of networks. This example embodiment comprises a memory 207 that stores data, possibly after being received via antenna 202 or port or after being generated at the user interface 205. The processor 208 may receive data from the user interface 205, from the memory 207, or from the communication unit 203. It will be appreciated that, in certain example embodiments, the display device 204 may incorporate the user interface 205. Regardless of the origin of the data, these data may be outputted to a user of apparatus 201 via the display device 204, and/or any other output devices provided with the apparatus. The processor 208 may also store the data for later use in the memory 207. The memory 207 may store computer program code and/or applications which may be used to instruct/enable the processor 208 to perform functions (e.g. read, write, delete, edit or process data).
Figure 3 depicts a further example embodiment of an electronic device 301, such as a tablet personal computer, a portable electronic device, a portable telecommunications device, a server or a module for such a device, the device comprising the apparatus 101 of figure 1. The apparatus 101 can be provided as a module for device 301, or even as a processor/memory for the device 301 or a processor/memory for a module for such a device 301. The device 301 comprises a processor 308 and a storage medium 307, which are connected (e.g. electrically and/or wirelessly) by a data bus 380. This data bus 380 can provide an active coupling between the processor 308 and the storage medium 307 to allow the processor 308 to access the computer program code. It will be appreciated that the components (e.g. memory, processor) of the device/apparatus may be linked via cloud computing architecture. For example, the storage device may be a remote server accessed via the internet by the processor. The apparatus 101 in figure 3 is connected (e.g. electrically and/or wirelessly) to an input/output interface 370 that receives the output from the apparatus 101 and transmits this to the device 301 via data bus 380. Interface 370 can be connected via the data bus 380 to a display 304 (touch-sensitive or otherwise) that provides information from the apparatus 101 to a user. Display 304 can be part of the device 301 or can be separate. The device 301 also comprises a processor 308 configured for general control of the apparatus 101 as well as the device 301 by providing signalling to, and receiving signalling from, other device components to manage their operation.
The storage medium 307 is configured to store computer code configured to perform, control or enable the operation of the apparatus 101. The storage medium 307 may be configured to store settings for the other device components. The processor 308 may access the storage medium 307 to retrieve the component settings in order to manage the operation of the other device components. The storage medium 307 may be a temporary storage medium such as a volatile random access memory. The storage medium 307 may also be a permanent storage medium such as a hard disk drive, a flash memory, a remote server (such as cloud storage) or a nonvolatile random access memory. The storage medium 307 could be composed of different combinations of the same or different memory types.
Figures 4a-4f depict a user holding an example embodiment comprising a portable electronic communications device 401, such as a mobile phone, with a user interface comprising a touch screen user interface 405, 404, a memory (not shown), a processor (not shown) and an antenna (not shown) for transmitting and/or receiving data (e.g. emails, textual messages, phone calls, information corresponding to web pages).
In the present example, as shown in figure 4a, the user is using an augmented reality application to view the surrounding scene. The augmented reality application is configured to collect live image data using a camera (not shown) of the electronic device 401 and display it on a touch sensitive screen 404, 405. The augmented reality application is also configured to overlay place user interface element icons 421 a-421 d on the displayed image, the place user interface element icons 421 a-421 d being associated with a place of interest (e.g. a theatre, shop or train station) and positioned on the screen with respect to the image so as to indicate the direction of the place of interest from the point of view of the user. In this case, place icons corresponding to places which are closer to the user are configured to overlie place icons which are in the same direction but correspond to places which are farther away.
In the situation shown in figure 4a, there are four place icons displayed: a theatre user interface element icon 421 a; a train station user interface element icon 421 d; and two restaurant user interface element icons 421 b, 421 c. That is, in this embodiment, each user interface element icon is categorised as belonging to a particular class (e.g. a restaurant class comprising, in this case, two restaurant user interface element icons 421 b, 421 c). In this case, the user is also provided with two dragging mode user interface element control icons 431, 432 which allow the user to control how his interactions with the place user interface element icons are interpreted by the device/apparatus. That is, selection of one of the dragging mode user interface elements 431, 432 may be considered to enable configuring the graphical user interface to be in a corresponding respective dragging mode. That is, a dragging mode is enabled in response to the user interacting with a particular dragging mode user interface element displayed on the graphical user interface. In particular, when the single item dragging mode icon 431 is selected, the user can select and drag a single icon, whereas when the class dragging mode icon 432 is selected, the user can select and drag a class of user interface elements (e.g. all of the restaurant user interface element icons 421 b, 421 c), the selected class being an associated subset of the graphical user interface elements 421 a-421 d available for selection.
In this case, the user is looking for a museum user interface element. From the displayed image, as shown in figure 4a, the user can see that there is a user interface element icon 421 d partially obscured by the two restaurant user interface elements 421 b, 421 c.
To see the obscured user interface control icon 421 d (e.g. to determine whether it is a museum user interface element icon), the user wishes to move the restaurant user interface element icons 421 b, 421 c away from their current positions. That is, the user wishes to remove all of the place icons of the restaurant class. Therefore the user selects the class dragging mode icon 432 to enable interactions with classes of the user interface elements.
In this case, the apparatus/device is configured to enable selection of one or more user interface elements for dragging in response to detecting a lift gesture interaction with at least one of the one or more selected user interface elements, the lift gesture interaction comprising a contact interaction with the at least one user interface element using the stylus (in this case, the user's thumb 491) followed by lifting the stylus away from the at least one user interface element. In the situation depicted in figure 4a, when the class dragging mode icon 432 has been selected, the apparatus/device is configured to enable selection of a class of place user interface elements in response to detecting a lift gesture interaction with one of the place user interface elements of that class. That is, when the user provides a contact interaction with one of the restaurant place user interface elements 421 b, 421 c and lifts his finger stylus 491 away from the screen 404, 405 (e.g. in the z direction), the other members of that user interface element class are also selected. This is shown in figure 4a, where the user provides a lifting gesture 441 x with one of the restaurant user interface element icons 421 b to select both of the restaurant user interface element icons 421 b, 421 c.
In this case, the apparatus is configured to enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range (e.g. between 0.5-3cm above the screen) of the graphical user interface using a stylus (in this case, the stylus is the user's thumb). That is, during the time that the user provides a dragging hover gesture 441 y towards the top of the screen (as shown in figure 4a), the selected user interface elements 421 b, 421 c are correspondingly dragged towards the top of the screen (within the x-y plane of the screen). In this case, as the selected one or more user interface elements 421 b, 421 c are dragged away from their original positions, the apparatus is also configured to reduce the size of the dragged one or more user interface elements 421 b, 421 c. This allows more of the surrounding scene to be viewed. In other embodiments, of course, resizing may not occur. It will be appreciated, as shown in figure 4a, that the selection of the dragging mode, the selection of the user interface elements for dragging and the dragging of the selected user interface elements may be performed using a single gesture interaction 441. That is, this may be an efficient way of controlling the position of the user interface elements.
In this case, the apparatus is configured to enable ceasing the dragging hover gesture 441 y in response to the user providing a particular dragging termination gesture. In this case, the user can provide a number of different dragging termination gestures depending on how the user wishes to control the one or more dragged user interface elements. In the situation depicted in figure 4b (which shows the situation after the icons have been dragged to the top of the screen), the user recognises that the previously obscured place user interface element icon 421 d indicates the direction of the train station rather than the museum he is looking for. The user therefore simply wishes to return the dragged restaurant place user interface element icons 421 b, 421 c back to their original positions.
To do this, the user provides a particular wagging gesture 442 by moving his finger back and forth in an x-y plane above, and parallel to, the screen. In response to detecting the particular wagging gesture 442, and hence the ceasing of the dragging hover gesture, the apparatus/device is configured to enable the selected user interface elements to return to the positions in which they were selected. This is shown in figure 4c.
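The dragging and optional shrinking behaviour described for figures 4a-4c could be modelled roughly as below; the scaling rule, the dictionary-based element model and the helper names are assumptions made for illustration only, not values taken from the embodiments.

```python
def hover_drag(selected, dx, dy, drag_scale=True):
    """Move each selected element by the stylus x-y delta and, optionally,
    shrink it as it travels away from where it was picked up."""
    for e in selected:
        e["x"] += dx
        e["y"] += dy
        e["travel"] = e.get("travel", 0.0) + (dx * dx + dy * dy) ** 0.5
        if drag_scale:
            # Shrink gradually with distance travelled, never below half size.
            e["scale"] = max(0.5, 1.0 - 0.002 * e["travel"])
    return selected

def restore_positions(selected, originals):
    """On a 'wag' termination, snap elements back to where they were selected."""
    for e in selected:
        e["x"], e["y"] = originals[e["name"]]
        e["scale"] = 1.0
    return selected

if __name__ == "__main__":
    icons = [{"name": "restaurant A", "x": 120, "y": 80},
             {"name": "restaurant B", "x": 130, "y": 90}]
    saved = {e["name"]: (e["x"], e["y"]) for e in icons}
    hover_drag(icons, 0, -60)                  # drag towards the top of the screen
    print(icons[0]["x"], icons[0]["y"], round(icons[0]["scale"], 2))  # 120 20 0.88
    restore_positions(icons, saved)
    print(icons[0]["x"], icons[0]["y"])        # 120 80
```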
It will be appreciated that other example embodiments may be configured to allow the user to select multiple user interface elements for hover dragging by successively selecting each of the multiple user interface elements (e.g. using successive lifting gestures) prior to the dragging of the user interface elements by hover dragging.
The user then moves the phone so that the camera is pointing more towards the left to see if the museum is in that direction. The augmented reality application is then configured to update the displayed live image collected by the camera (not shown) of the electronic device 401 and to overlay place icons 421 a, 421 e-g on the updated image, the icons being associated with a place of interest (e.g. a theatre, shop or train station). In the situation shown in figure 4d, there are four place icons displayed: the theatre user interface element icon 421 a (this is the same icon as shown in figure 4a but moved with respect to the screen to reflect the new relative direction with respect to the user/camera); a museum user interface element icon 421f; and two shop user interface element icons 421 e, 421 g. As before, in this embodiment, each user interface element icon 421 a, 421 e-g is categorised as belonging to a particular class (e.g. a shop class comprising, in this case, two shop user interface element icons).
The user is still looking for the museum user interface element icon 421f and can see that there is a user interface element icon 421f partially obscured by a shop user interface element 421 g.
To see the obscured user interface control icon 421f (e.g. to determine whether it is a museum user interface element icon), the user wishes to move the obscuring shop user interface element icon 421 g away from its current position. That is, the user wishes to remove a single place user interface element icon. Therefore the user selects the single item dragging mode icon 431 to enable interactions with single place user interface element icons.
In this case, the apparatus/device is configured to enable selection of the shop user interface element place icon 421 g in response to detecting a lift gesture interaction 443x with the shop user interface element 421 g, the lift gesture interaction 443x comprising a contact interaction with the shop user interface element 421 g using the stylus (the user's thumb 491 in this case) followed by lifting the stylus 491 away from the shop user interface element 421 g. This is shown in figure 4d. That is, when the user provides a contact interaction with one of the shop place user interface elements 421 g (the one obscuring the museum place icon, in this case) and lifts his thumb 491 away from the screen (e.g. in the z direction), only that user interface element 421 g is selected.
In this case, the apparatus is configured to enable, during a dragging hover gesture interaction 443y, corresponding dragging of the selected shop user interface element of the graphical user interface, the dragging hover gesture interaction 443y being provided within a predetermined hover range (e.g. between 0.5-3cm above the screen) of the graphical user interface using a stylus 491 (in this case, the stylus is the user's thumb). That is, during the time that the user provides the dragging hover gesture interaction 443y towards the top of the screen, the selected shop place user interface element icon 421 g is correspondingly dragged towards the top of the screen (within the x-y plane of the screen). In this case, as the shop user interface element place icon 421 g is dragged away from its original position, the apparatus is configured to reduce the size of the dragged user interface element place icon 421 g (although resizing may not occur in other embodiments). This allows more of the surrounding scene to be viewed.
In this case, the apparatus is configured to enable ceasing the dragging hover gesture 443y in response to the user providing a particular dragging termination gesture. As noted above, the user can provide a number of different dragging termination gestures depending on how the user wishes to control the dragged icons.
In the situation depicted in figure 4e (which shows the situation after the icons have been dragged to the top of the screen), the user recognises that the previously obscured place user interface element icon indicates the direction of the desired museum. The user therefore wishes to remove the dragged shop place user interface element icon 421 g from the display 404 such that it no longer obscures the desired museum place user interface element icon 421f. To do this, the user provides a particular flicking gesture 444 (e.g. a single rapid movement away from the screen). In response to detecting the particular flicking gesture 444, the apparatus/device is configured to enable the dragged user interface element 421 g to be removed from the screen 404. This is shown in figure 4f.
It will be appreciated that other example embodiments may be configured to detect other particular dragging termination gestures (e.g. a particular wagging action). It will be appreciated that other example embodiments may be configured to enable ceasing of the dragging hover gesture in response to one or more of: the user providing a contact interaction with the graphical user interface using the stylus; and the user moving the stylus outside the predetermined hover range.
It will be appreciated that other example embodiments may allow the user to control the dragging mode of the hover gestures in different ways. For example, other example embodiments may be configured to associate particular dragging modes with different styli (e.g. different fingers). For example, when the user uses his thumb to select items, the single selection dragging mode may be used, whereas when the user uses his (e.g. index) finger as the stylus, the class dragging mode may be used. It will be appreciated that the apparatus/device may be configured such that the size of the user interface elements selected for dragging (e.g. the area, and/or number of user interface elements selected) may correspond to the size of the user interaction (e.g. the contact area for a contact interaction, the area delimited by the contact points for a multi-touch interaction, the number of contact points and/or volume of styli within the hover detection range). For example, a bigger size of user interface elements may be selected for dragging by a bigger size of selecting user interaction. For example, if there were a plurality of overlapping user interface elements, a user interaction with a smaller size (e.g. where the finger is oriented to be approximately normal to the screen such that there is a small contact area) may be used to select the uppermost layer, whereas a user interaction with a larger size (e.g. where the finger stylus is oriented to be at an angle to the screen to provide a larger contact area) may be used to select underlying layers in addition to the uppermost layer. Another example embodiment may be configured such that a multi-touch user interaction with a particular user interface element selects the class of user interface elements to which the particular user interface element belongs, whereas a single touch user interaction would just select the particular user interface element itself. In this way, the user may control which user interface elements are selected for dragging by adjusting the size of the selecting user interaction. This may be intuitive for the user as it may replicate the action of physical glue. That is, the larger the size of interaction, the larger the adhesive force of the 'glue' (which allows larger and/or more user interface elements to be dragged).
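As a rough illustration of this size-dependent selection (bigger touches reaching deeper into a stack of overlapping elements), the following sketch maps an assumed contact-patch area onto a number of layers; the base area and the simple proportional rule are placeholder choices, not values from the embodiments.

```python
def layers_for_contact_area(area_mm2, base_area_mm2=40.0):
    """Rough mapping from contact-patch area to how many overlapping layers are
    picked up: a near-normal fingertip (~40 mm^2) takes one layer, a flatter
    finger takes proportionally more.  The numbers are placeholders."""
    return max(1, int(area_mm2 // base_area_mm2))

def select_layers(stacked_elements, area_mm2):
    """`stacked_elements` is ordered uppermost-first; bigger touches reach deeper."""
    depth = layers_for_contact_area(area_mm2)
    return stacked_elements[:depth]

if __name__ == "__main__":
    stack = ["top icon", "middle icon", "bottom icon"]
    print(select_layers(stack, 35.0))    # ['top icon']
    print(select_layers(stack, 95.0))    # ['top icon', 'middle icon']
```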
It will be appreciated that the apparatus/device may be configured such that the size of the user interface elements being dragged (e.g. the area and/or number of user interface elements selected) may correspond to the size of the dragging hover user interaction (e.g. the number or volume of styli within the hover detection range). For example, if the user was using a multi-finger dragging hover interaction to drag multiple user interface elements, and then they removed some of the fingers from the predetermined hover range, the apparatus/device may be configured to cease dragging some of the multiple user interface elements.
It will be appreciated that user interface elements may be associated with an effective size (e.g. based on the type and/or content associated with the user interface element). For example, a document icon associated with a longer document may be associated with a larger effective size than a document icon associated with a shorter document. Similarly, different styli may be associated with different effective sizes. For example, a mechanical stylus may be associated with a larger effective size than a finger stylus.
Figures 5a-5c depict an example embodiment comprising a portable electronic communications device 501, such as a mobile phone, with a user interface comprising a touch screen user interface 505, 504, a memory (not shown), a processor (not shown) and an antenna (not shown) for transmitting and/or receiving data (e.g. emails, textual messages, phone calls, information corresponding to web pages).
In figure 5a the device is displaying the home screen. The home screen comprises a number of tile user interface elements including a navigation tile user interface element 521 a which may be used to open a navigation application.
In this case, the user wishes to drag the navigation tile user interface element 521 a to a new position on the home screen. In this case, the apparatus is configured to enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range (e.g. between 1-4cm above the screen) of the graphical user interface using a stylus (in this case, the stylus is the user's finger 591).
In this case, the apparatus/device is configured to enable selection of the navigation tile user interface element 521 a for dragging in response to detecting a lift gesture interaction 545x with the navigation user interface element tile, the lift gesture interaction 545x comprising a contact interaction with the at least one user interface element using the stylus followed by lifting the stylus away from the contacted user interface element (in the z direction).
In this case, the apparatus is configured to enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range (e.g. between 1-4cm above the screen) of the graphical user interface using a stylus (in this case, the stylus is the user's finger 591). That is, during the time that the user provides a dragging hover gesture 545y, the user interface elements selected for dragging are correspondingly dragged along the screen (within the x-y plane of the screen). Unlike the previous embodiment, the apparatus/device is not configured to change the size of the icons as they are dragged along. It will be appreciated that the selection of user interface elements for dragging and the dragging of the user interface elements may be performed using a single user interaction 545.
In this case, the apparatus is configured to enable ceasing the dragging hover gesture interaction 545y in response to the user providing a particular dragging termination gesture.
Figure 5b illustrates the way in which the user's dragging hover gesture interaction is detected in this case. The apparatus/device 501 is shown from the side with a display/user interface 504 facing to the viewer's left. The apparatus/device 501 is configured to determine hover gesture interactions using a 3-D capacitive sensing user interface. A 3-D capacitive sensing user interface may be considered to be 3-D map capable sensor technology, which can detect objects hovering within a predetermined distance within a capacitive field 506 of the display, to make 3-D capacitive heatmaps/scanning images. These images can be used to identify the shape, size and location of, for example, a hand grip in 3-D space within the detection field 506 (e.g. a 3D mesh). The predetermined working distance in which objects may be detected may be up to 3cm away from the surface of the display screen in certain examples. It will be appreciated that other example embodiments may have different predetermined working distances (e.g. between 0.5 and 5cm). This sensing technology is also able to sense and determine the position of a user's hand at a distance from the screen even if he is wearing gloves, thereby providing an advantage over, for example, a touch-sensitive screen which requires skin contact with the display screen. The sensing technology may be able to distinguish between different styli (e.g. different mechanical styli, different fingers and thumbs).
The capacitive sensing technology may be called 3-D touch, hovering touch or touchless touch, and may comprise the capacitive display 504 of the device 501 in communication with a host computer/processor. Even in the case of a flat display (e.g. one which may be considered to have one exposed plane as the screen of the device 501), the capacitive field 506 can detect objects such as fingers and thumbs at the edges/sides of the device 501 and interface 504 as it can detect objects at a distance away from a direction perpendicular to the exposed plane (at a distance in the z direction away from the plane). Thus, a particular finger/thumb stylus may readily be identified, because the user's palm, fingers and thumb may be detectable by the capacitive field 506 even if they are at the edges and even at the back of the device 501. In certain examples, the 3-D capacitive sensor need not be part of a display screen 504, and may be integrated at any place of the device 501, such as at the sides or on the back. When a user grips a device 501 with such a 3-D capacitive sensing user interface 506, the capacitive field 506 changes. Touch controller algorithms can detect the user's hand/finger position in relation to the display 504 from changes in the capacitive field 506. Another method is to transfer capacitive raw data to a host computer/processor from the apparatus/device 501 and run a hand detection algorithm at the host. The capacitive raw data information can be transferred to the host from a touch controller interface of the apparatus/device 501.
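For illustration only, a host-side detection step of the kind mentioned above might reduce a frame of capacitive raw data to an estimated hover position using a weighted centroid; real touch controller algorithms are considerably more sophisticated, and the grid pitch, noise floor and field names below are assumptions introduced here.

```python
def estimate_hover_position(heatmap, cell_pitch_mm=4.0, noise_floor=5.0):
    """Estimate the stylus x-y position (and a crude 'closeness' value) from a 2-D
    grid of capacitive readings using a weighted centroid over above-noise cells."""
    total = 0.0
    cx = cy = 0.0
    peak = 0.0
    for row_idx, row in enumerate(heatmap):
        for col_idx, value in enumerate(row):
            signal = value - noise_floor
            if signal <= 0:
                continue
            total += signal
            cx += signal * col_idx
            cy += signal * row_idx
            peak = max(peak, signal)
    if total == 0:
        return None                      # nothing within the detection field
    return {"x_mm": (cx / total) * cell_pitch_mm,
            "y_mm": (cy / total) * cell_pitch_mm,
            "closeness": peak}           # stronger peak ~ stylus nearer the screen

if __name__ == "__main__":
    frame = [[5, 6, 5, 5],
             [5, 9, 30, 8],
             [5, 8, 25, 7],
             [5, 5, 6, 5]]
    print(estimate_hover_position(frame))
```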
In this case, after the dragging hover gesture 545y has dragged the navigation tile user interface element 521 a to the desired location, the user wishes to fix the position of the dragged user interface element 521 a. To do this, the user provides a contact gesture by touching the stylus onto the screen. In response to detecting the contact gesture, the apparatus/device is configured to enable the selected user interface element to be fixed in the position it occupied when the contact gesture was provided. This is shown in figure 5c.
Although the above described embodiments are portable electronic devices, it will be appreciated that some embodiments may be configured to enable corresponding hover dragging of one or more selected user interface elements using a non-portable electronic device with a hover sensitive screen (e.g. a tablet computer or laptop).
Figures 6a-6c depict two example embodiments, each comprising a portable electronic communications device 601, 602, such as a mobile phone, with a user interface comprising a touch screen user interface 605, 604, a memory (not shown), a processor (not shown) and an antenna (not shown) for transmitting and/or receiving data (e.g. emails, textual messages, phone calls, information corresponding to web pages).
In this case, the apparatus/devices are configured to allow the user to drag user interface elements to other connected devices. Devices connected in this way may be considered to be separate components of the same composite device. In this case, the first electronic device 601 is the user's home phone and the second electronic device 602 is the user's work phone. In this case, the two electronic devices have been connected (e.g. via a Bluetooth connection). The connection between the devices allows the devices to share information regarding user provided interactions. In this case, each individual device is providing a respective list of contacts, the list of contacts comprising a number of contact user interface elements. In this example, the user wishes to move the 'Dan' contact user interface element 621 a from his home phone electronic device 601 to his work phone electronic device 602. To do this, the user selects the 'Dan' contact user interface element 621 a (e.g. using a lift gesture or otherwise) and provides a dragging hover gesture. Figure 6a shows the situation when the 'Dan' contact user interface element 621 a has been selected.
In this case, the apparatus is configured to enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range (e.g. between 0.5-3cm above the screen) of the graphical user interface using a stylus (in this case, the stylus is the user's thumb). That is, during the time that the user provides a dragging hover gesture, the selected user interface elements are correspondingly dragged along the screen (within the x-y plane of either of the screens).
Figure 6b shows the situation as the user is dragging 646y the contact user interface element between the two electronic devices. In this case, the second electronic device is configured to display at least a portion of the dragged user interface element based on information detected by the capacitive touch screen user interface of the first electronic device and transmitted to the second electronic device via the Bluetooth connection. That is, the two devices 601, 602 are configured to share information relating to the position of the stylus above the two screens.
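One possible, purely illustrative way of sharing such drag state between the two connected devices is a small serialised message plus a rule for deciding which display currently owns the dragged element; the message fields, screen width and side-by-side layout below are assumptions, and the actual transport (e.g. the Bluetooth connection) is not shown.

```python
import json

SCREEN_WIDTH = 1080   # assumed pixel width of each display

def encode_hover_update(element_id, x, y, z_cm):
    """Serialise the dragged element and stylus position for the paired device;
    the field names are made up for this sketch."""
    return json.dumps({"element": element_id, "x": x, "y": y, "z_cm": z_cm})

def split_between_screens(update_json, gap_px=0):
    """Decide which display currently owns the dragged element, treating the two
    screens as laid side by side along the x axis."""
    msg = json.loads(update_json)
    x = msg["x"]
    if x < SCREEN_WIDTH:
        return {"first_device_x": x, "second_device_x": None, "element": msg["element"]}
    return {"first_device_x": None,
            "second_device_x": x - SCREEN_WIDTH - gap_px,
            "element": msg["element"]}

if __name__ == "__main__":
    update = encode_hover_update("contact:Dan", 1300, 420, 1.2)
    print(split_between_screens(update))
    # -> element drawn 220 px into the second device's screen
```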
In this case, the apparatus is configured to enable ceasing the dragging hover gesture in response to the user providing a particular dragging termination gesture. In this case, the user wishes to fix the position of the dragged user interface element on the second electronic device. To do this, the user provides a release gesture 647 by moving the stylus outside the predetermined hover range. In response to detecting the release gesture 647, the apparatus/device is configured to enable the selected user interface element to be fixed in the position it occupied when the release gesture was provided. This is shown in figure 6c.
Figure 7a shows an example embodiment of an apparatus in communication with a remote server. Figure 7b shows an example embodiment of an apparatus in communication with a "cloud" for cloud computing. In figures 7a and 7b, apparatus 701 (which may be apparatus 101, 201 or 301) is in communication with a display 704. Of course, the apparatus 701 and display 704 may form part of the same apparatus/device, although they may be separate as shown in the figures. The apparatus 701 is also in communication with a remote computing element. Such communication may be via a communications unit, for example. Figure 7a shows the remote computing element to be a remote server 795, with which the apparatus may be in wired or wireless communication (e.g. via the internet, Bluetooth, a USB connection, or any other suitable connection as known to one skilled in the art). In figure 7b, the apparatus 701 is in communication with a remote cloud 796 (which may, for example, be the Internet, or a system of remote computers configured for cloud computing). The enabling of a selection and/or dragging of a user interface element may be performed at the remote computing element 795, 796. The apparatus 701 may actually form part of the remote server 795 or remote cloud 796.
Figure 8 illustrates the process flow according to an example embodiment of the present disclosure. The process comprises enabling 881, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
Figure 9 illustrates schematically a computer/processor readable medium 900 providing a program according to an embodiment. In this example, the computer/processor readable medium is a disc such as a Digital Versatile Disc (DVD) or a compact disc (CD). In other embodiments, the computer readable medium may be any medium that has been programmed in such a way as to carry out the functionality herein described. The computer program code may be distributed between the multiple memories of the same type, or multiple memories of a different type, such as ROM, RAM, flash, hard disk, solid state, etc.
Any mentioned apparatus/device/server and/or other features of particular mentioned apparatus/device/server may be provided by apparatus arranged such that they become configured to carry out the desired operations only when enabled, e.g. switched on, or the like. In such cases, they may not necessarily have the appropriate software loaded into the active memory in the non-enabled (e.g. switched off state) and only load the appropriate software in the enabled (e.g. on state). The apparatus may comprise hardware circuitry and/or firmware. The apparatus may comprise software loaded onto memory. Such software/computer programs may be recorded on the same memory/processor/functional units and/or on one or more memories/processors/ functional units.
In some embodiments, a particular mentioned apparatus/device/server may be pre-programmed with the appropriate software to carry out desired operations, and wherein the appropriate software can be enabled for use by a user downloading a "key", for example, to unlock/enable the software and its associated functionality. Advantages associated with such embodiments can include a reduced requirement to download data when further functionality is required for a device, and this can be useful in examples where a device is perceived to have sufficient capacity to store such pre-programmed software for functionality that may not be enabled by a user.
Any mentioned apparatus/circuitry/elements/processor may have other functions in addition to the mentioned functions, and these functions may be performed by the same apparatus/circuitry/elements/processor. One or more disclosed aspects may encompass the electronic distribution of associated computer programs and computer programs (which may be source/transport encoded) recorded on an appropriate carrier (e.g. memory, signal). Any "computer" described herein can comprise a collection of one or more individual processors/processing elements that may or may not be located on the same circuit board, or the same region/position of a circuit board or even the same device. In some embodiments one or more of any mentioned processors may be distributed over a plurality of devices. The same or different processor/processing elements may perform one or more functions described herein.
The term "signalling" may refer to one or more signals transmitted as a series of transmitted and/or received electrical/optical signals. The series of signals may comprise one, two, three, four or even more individual signal components or distinct signals to make up said signalling. Some or all of these individual signals may be transmitted/received by wireless or wired communication simultaneously, in sequence, and/or such that they temporally overlap one another.
With reference to any discussion of any mentioned computer and/or processor and memory (e.g. including ROM, CD-ROM etc), these may comprise a computer processor, Application Specific Integrated Circuit (ASIC), field-programmable gate array (FPGA), and/or other hardware components that have been programmed in such a way to carry out the inventive function.
The applicant hereby discloses in isolation each individual feature described herein and any combination of two or more such features, to the extent that such features or combinations are capable of being carried out based on the present specification as a whole, in the light of the common general knowledge of a person skilled in the art, irrespective of whether such features or combinations of features solve any problems disclosed herein, and without limitation to the scope of the claims. The applicant indicates that the disclosed aspects/embodiments may consist of any such individual feature or combination of features. In view of the foregoing description it will be evident to a person skilled in the art that various modifications may be made within the scope of the disclosure.
While there have been shown and described and pointed out fundamental novel features as applied to example embodiments thereof, it will be understood that various omissions and substitutions and changes in the form and details of the devices and methods described may be made by those skilled in the art without departing from the spirit of the disclosure. For example, it is expressly intended that all combinations of those elements and/or method steps which perform substantially the same function in substantially the same way to achieve the same results are within the scope of the disclosure. Moreover, it should be recognized that structures and/or elements and/or method steps shown and/or described in connection with any disclosed form or embodiments may be incorporated in any other disclosed or described or suggested form or embodiment as a general matter of design choice. Furthermore, in the claims means-plus-function clauses are intended to cover the structures described herein as performing the recited function and not only structural equivalents, but also equivalent structures. Thus although a nail and a screw may not be structural equivalents in that a nail employs a cylindrical surface to secure wooden parts together, whereas a screw employs a helical surface, in the environment of fastening wooden parts, a nail and a screw may be equivalent structures.

Claims

1. An apparatus comprising:
a processor; and
a memory including computer program code,
the memory and the computer program code configured, with the processor, to cause the apparatus to perform at least the following:
enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
2. The apparatus of claim 1, wherein the apparatus is configured to enable selection of one or more user interface element for dragging in response to detecting a lift gesture interaction with at least one of the one or more selected user interface elements, the lift gesture interaction comprising a contact interaction with the at least one user interface element using the stylus followed by lifting the stylus away from the at least one user interface element.
3. The apparatus of claim 1, wherein the selection dragging hover gesture interaction is enabled by configuring the graphical user interface to be in a dragging mode.
4. The apparatus of claim 3, wherein the apparatus is configured to enable the dragging mode of the graphical user interface in response to the user interacting with a particular dragging mode user interface element displayed on the graphical user interface.
5. The apparatus of claim 1, wherein the apparatus is configured to enable dragging of a selected class of user interface elements, the selected class being an associated subset of the graphical user interface elements available for selection.
6. The apparatus of claim 1, wherein the apparatus is configured to enable the selected user interface elements to return to the positions in which they were selected in response to detecting the ceasing of the dragging hover gesture.
7. The apparatus of claim 1, wherein the apparatus is configured, in response to detecting the ceasing of the dragging hover gesture, to enable the selected user interface elements to be positioned corresponding to the position of the stylus when the dragging hover gesture was detected to have ceased.
8. The apparatus of claim 1, wherein the apparatus is configured to enable ceasing of the dragging hover gesture in response to one or more of:
the user providing a particular dragging termination gesture;
the user providing a contact interaction with the graphical user interface using the stylus; and
the user moving the stylus outside the predetermined hover range.
9. The apparatus of claim 8, wherein the dragging termination gesture is a particular waving action, a particular wagging action, or a particular flicking action.
10. The apparatus of any preceding apparatus claim, wherein the apparatus is a portable electronic device, a laptop computer, a mobile phone, a Smartphone, a tablet computer, a personal digital assistant, a digital camera, a watch, a server, a non-portable electronic device, a desktop computer, a monitor, a television, an automated teller machine (ATM), a server, a wand, a pointing stick, a touchpad, a touch-screen, a mouse, a joystick or a module/circuitry for one or more of the same.
11. A method, the method comprising:
enabling, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
12. A computer program comprising computer program code, the computer program code being configured to perform at least the following:
enable, during a dragging hover gesture interaction, corresponding dragging of one or more selected user interface elements of a graphical user interface, the dragging hover gesture interaction being provided within a predetermined hover range of the graphical user interface using a stylus.
PCT/FI2013/050689 2013-06-24 2013-06-24 User interfaces and associated methods for controlling user interface elements WO2014207288A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/FI2013/050689 WO2014207288A1 (en) 2013-06-24 2013-06-24 User interfaces and associated methods for controlling user interface elements

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2013/050689 WO2014207288A1 (en) 2013-06-24 2013-06-24 User interfaces and associated methods for controlling user interface elements

Publications (1)

Publication Number Publication Date
WO2014207288A1 true WO2014207288A1 (en) 2014-12-31

Family

ID=48874326

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2013/050689 WO2014207288A1 (en) 2013-06-24 2013-06-24 User interfaces and associated methods for controlling user interface elements

Country Status (1)

Country Link
WO (1) WO2014207288A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090237371A1 (en) * 2008-03-21 2009-09-24 Lg Electronics Inc. Mobile terminal and screen displaying method thereof
US20100134409A1 (en) * 2008-11-30 2010-06-03 Lenovo (Singapore) Pte. Ltd. Three-dimensional user interface
EP2530571A1 (en) * 2011-05-31 2012-12-05 Sony Ericsson Mobile Communications AB User equipment and method therein for moving an item on an interactive display
EP2590064A2 (en) * 2011-10-28 2013-05-08 LG Electronics Mobile terminal and method of controlling mobile terminal

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018057944A1 (en) * 2016-09-23 2018-03-29 Apple Inc. Devices, methods, and user interfaces for interacting with user interface objects via proximity-based and contact-based inputs
CN109643217A (en) * 2016-09-23 2019-04-16 苹果公司 By based on equipment, method and user interface close and interacted based on the input of contact with user interface object
US10318034B1 (en) 2016-09-23 2019-06-11 Apple Inc. Devices, methods, and user interfaces for interacting with user interface objects via proximity-based and contact-based inputs
CN109643217B (en) * 2016-09-23 2020-07-24 苹果公司 Apparatus and method for proximity-based interaction with user interface objects
US10852868B2 (en) 2016-09-23 2020-12-01 Apple Inc. Devices, methods, and user interfaces for interacting with a position indicator within displayed text via proximity-based inputs
US11243627B2 (en) 2016-09-23 2022-02-08 Apple Inc. Devices, methods, and user interfaces for interacting with a position indicator within displayed text via proximity-based inputs
US11644917B2 (en) 2016-09-23 2023-05-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a position indicator within displayed text via proximity-based inputs
US11947751B2 (en) 2016-09-23 2024-04-02 Apple Inc. Devices, methods, and user interfaces for interacting with a position indicator within displayed text via proximity-based inputs
CN112394811A (en) * 2019-08-19 2021-02-23 华为技术有限公司 Interaction method for air-separating gesture and electronic equipment
CN112394811B (en) * 2019-08-19 2023-12-08 华为技术有限公司 Interaction method of air-separation gestures and electronic equipment
WO2023078286A1 (en) * 2021-11-08 2023-05-11 维沃移动通信有限公司 Video display method and apparatus, electronic device, and readable storage medium

Similar Documents

Publication Publication Date Title
US10401964B2 (en) Mobile terminal and method for controlling haptic feedback
US9665177B2 (en) User interfaces and associated methods
EP2946265B1 (en) Portable terminal and method for providing haptic effect to input unit
EP2508972B1 (en) Portable electronic device and method of controlling same
US20130050143A1 (en) Method of providing of user interface in portable terminal and apparatus thereof
KR102091028B1 (en) Method for providing user's interaction using multi hovering gesture
KR102020345B1 (en) The method for constructing a home screen in the terminal having touchscreen and device thereof
KR102010955B1 (en) Method for controlling preview of picture taken in camera and mobile terminal implementing the same
US10205873B2 (en) Electronic device and method for controlling a touch screen of the electronic device
KR102264444B1 (en) Method and apparatus for executing function in electronic device
KR102044826B1 (en) Method for providing function of mouse and terminal implementing the same
US20140368442A1 (en) Apparatus and associated methods for touch user input
KR102080146B1 (en) Operating Method associated with connected Electronic Device with External Display Device and Electronic Device supporting the same
EP3557398A1 (en) Method, apparatus, storage medium, and electronic device of processing split screen display
US9658762B2 (en) Mobile terminal and method for controlling display of object on touch screen
CA2846482A1 (en) Method of providing of user interface in portable terminal and apparatus thereof
EP2770422A2 (en) Method for providing a feedback in response to a user input and a terminal implementing the same
US20160224221A1 (en) Apparatus for enabling displaced effective input and associated methods
US20140168098A1 (en) Apparatus and associated methods
KR20130123696A (en) System and method for providing usability among touch-screen devices
WO2014207288A1 (en) User interfaces and associated methods for controlling user interface elements
KR102118091B1 (en) Mobile apparatus having fuction of pre-action on object and control method thereof
KR101432483B1 (en) Method for controlling a touch screen using control area and terminal using the same
KR101165388B1 (en) Method for controlling screen using different kind of input devices and terminal unit thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 13740327
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 13740327
    Country of ref document: EP
    Kind code of ref document: A1