WO2012098360A2 - Electronic device and method with improved lock management and user interaction - Google Patents

Electronic device and method with improved lock management and user interaction

Info

Publication number
WO2012098360A2
Authority
WO
WIPO (PCT)
Prior art keywords
gui object
boundary
screen area
screen
user
Prior art date
Application number
PCT/GB2012/000053
Other languages
English (en)
Other versions
WO2012098360A3 (fr)
Inventor
Gaetano Vitolo
Diego RUEGA
Justin Buck
Original Assignee
Inq Enterprises Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inq Enterprises Limited filed Critical Inq Enterprises Limited
Publication of WO2012098360A2 publication Critical patent/WO2012098360A2/fr
Publication of WO2012098360A3 publication Critical patent/WO2012098360A3/fr

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present invention relates to apparatus, methods and computer programs providing improved user interaction with electronic devices and efficient activation of device operations.
  • Many mobile telephones, PDAs and other portable electronic devices feature touch-sensitive display screens ("touchscreens") that provide ease of use by enabling intuitive user interaction with the device and avoid the need to reserve a large portion of the device for a separate keyboard.
  • touchscreens are implemented to allow users to select an item by finger touch, and to initiate an operation by finger gestures, avoiding the need for separate data input apparatus.
  • Various technologies have been used to implement touch-sensitivity, including capacitive touch sensors which measure a change in capacitance resulting from the effect of touching the screen, resistive touchscreens that measure a change in electrical current resulting from pressure on the touchscreen reducing the gap between conductive layers, and other technologies.
  • a problem with touchscreens is the increased likelihood of accidentally invoking an operation when pressure is applied to the screen, such as accidentally making calls or switching the handset off.
  • This problem has been solved by 'locking' devices after a period of non-use or in response to an explicit lock request by the user, and then requiring users to carry out a predefined user-interaction as an 'unlock' operation before they use any device functions.
  • the unlock operation may involve a predefined touchscreen interaction, such as an identifiable gesture or using a drag operation to move an element of an interactive GUI object along a fixed path, although a dedicated hardware key for lock/unlock operations is provided on some mobile telephones.
  • Another problem is that, despite the increased screen size of recent mobile telephones compared with earlier generations of mobile phones, the screen size of many handheld electronic devices constrains the user's interaction with the device.
  • the inventors of the present invention have identified a number of constraints on device users who wish to perform desired operations on an electronic device, such as a touch-enabled mobile telephone, and invented a solution that mitigates the problems.

Summary
  • An aspect of the invention provides an electronic device comprising:
  • graphical user interface (GUI) objects displayed within a GUI on the display screen, including means for detecting selection and user-controlled movement of GUI objects;
  • processing unit is adapted to:
  • each of a first set of applications has an associated GUI object that is user-selectable from an unlock screen of the device, and wherein each of a second set of applications has an associated GUI object that is not user-selectable from the unlock screen and becomes user-selectable following an unlock operation performed by movement across the boundary of a GUI object representing the unlock operation.
  • the invention according to this aspect provides a differentiated lock management for different functions of the device. For example, a camera, calling or typing application can be made easily accessible, whereas other applications such as emails and contact list management can be protected by the lock screen.
  • the display screen and user input mechanism are integrated as a touch-sensitive display screen.
  • a visual representation of the boundary is changed as the GUI object is moved across the boundary.
  • the invoking or unlocking of applications is preferably only performed in response to completion of movement of a respective GUI object across the boundary, wherein said completion comprises release of the GUI object when the movement of said respective GUI object satisfies a predefined boundary-crossing threshold condition.
  • An electronic device comprises:
  • graphical user interface (GUI) objects displayed within a GUI on the display screen, including means for detecting selection and user-controlled movement of GUI objects;
  • processing unit is adapted to:
  • the processing unit is adapted to change the visual representation of the boundary in response to the movement of the selected GUI object satisfying a predefined threshold condition so as to move the GUI object into the second screen area.
  • the predefined threshold condition may be a threshold position of the GUI object relative to the visual representation of the boundary, or a threshold movement taking account of the speed and direction of the movement.
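As a minimal sketch of such a threshold condition (the boundary position, threshold values and velocity convention below are illustrative assumptions, not values from the specification), the test might combine the icon's position relative to the boundary with the speed and direction of its movement:

```python
# Hedged sketch of the predefined threshold condition described above.
# BOUNDARY_Y, POSITION_THRESHOLD and SPEED_THRESHOLD are illustrative
# values; positive y and positive velocity point towards the second area.

BOUNDARY_Y = 300          # y coordinate of the undeformed boundary (pixels)
POSITION_THRESHOLD = 40   # distance past the boundary that counts as a crossing
SPEED_THRESHOLD = 900     # pixels/second towards the second screen area

def crossing_threshold_met(icon_y, velocity_y):
    """Return True when the icon's movement satisfies the threshold condition.

    The condition may be purely positional (a threshold position relative to
    the boundary) or may take account of the speed and direction of the
    movement (e.g. a fast flick part-way through the boundary).
    """
    past_boundary = icon_y - BOUNDARY_Y
    if past_boundary >= POSITION_THRESHOLD:
        return True                       # threshold position reached
    if past_boundary > 0 and velocity_y >= SPEED_THRESHOLD:
        return True                       # fast flick towards the second area
    return False
```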
  • the processing unit is adapted to invoke an operation that is associated with the selected GUI object in response to the movement of the selected GUI object satisfying a predefined threshold condition.
  • the GUI object may be an icon representing an unlock function, and the invoked operation may be an unlock operation.
  • the boundary is controlled to behave as a flexible membrane that stretches in response to movement of the icon.
  • a portion of the boundary in the vicinity of a selected icon is constrained to move with the selected icon, to give the appearance of stretching.
  • the icon moves through the boundary, to give the appearance of the stretched membrane breaking.
  • Another aspect of the invention provides a method for controlling invocation of an operation on an electronic device, comprising: changing the visual representation of a boundary between first and second screen areas in response to a selected GUI object being moved by a user from the first screen area towards or into the second screen area; and invoking an operation in response to the movement of the selected GUI object satisfying a predefined threshold condition.
  • Another aspect of the invention provides a method for controlling the state of a function of an electronic device by movement of a GUI object, which represents that function, from a first screen area of the electronic device to a second screen area, comprising:
  • detecting user selection of a GUI object detecting user interaction with a selected GUI object, and moving the GUI object on the electronic device screen in response to the detected user interaction;
  • a further aspect of the invention provides a method for controlling an operation on an electronic device comprising:
  • GUI graphical user interface
  • the invention according to various of the above-described aspects provides visual feedback to the device user as a GUI object is moved into close proximity with the boundary or through the boundary, indicating how close the GUI object is to a boundary-crossing condition and/or showing when the GUI object has passed through the boundary. This is achieved by the visual representation of the boundary changing in response to the movement of an icon relative to that boundary. This can be implemented to give a clear and intuitive visual indication of progress towards completion of a movement to the second screen area, and potentially also movement of the GUI object back into the first screen area, even when the distances moved by the GUI object are small.
  • the ability to provide clear and intuitive visual feedback in response to small movements of icons on a screen is very helpful to end users.
  • This feedback can help the user to avoid unintended invocations, and give more control in the performance of desired operations.
  • the solution is especially advantageous when used as a consistent mechanism for invoking multiple different operations (e.g. for unlocking or otherwise activating device functions, in response to a user interacting with any of a set of icons displayed on a screen), because it scales much better than known alternatives such as progress bars.
  • the invention provides an error-tolerant activation mechanism, by only invoking an operation when movement of a selected GUI object satisfies a predefined threshold condition and the GUI object is then released.
  • the user may move an icon towards the boundary or partially through the boundary, and they will receive a visual indication that they have done so without any operation being invoked.
  • the user may move an icon completely through the boundary and yet the operation is only invoked if the icon is then released.
  • an operation associated with a GUI object is only invoked if the GUI object is moved and then released when the movement satisfies a predefined condition.
  • the processing unit comprises a data processor and computer program code implementing instructions for controlling the performance of operations by the processor, the computer program code adapting the processing unit to provide a changed visual representation of the boundary in response to the GUI object being moved.
  • the processing unit comprises hardware logic circuits, software-implemented logic or a combination of hardware and software-implemented logic for changing the visual representation of the boundary and invoking an operation.
  • logic for changing the visual representation of the boundary comprises an animation controller that, in response to the selected GUI object being moved within a first defined region proximate the visual representation of the boundary, moves a part of the visual representation of the boundary that is proximate the moved GUI object. This may appear as a flexing of the boundary when a selected icon approaches within a set number of pixels of the boundary. This movement of a part of the boundary ensures that the visual representation of the boundary is not crossed by the GUI object, while the GUI object remains within the first defined region, and the user is provided with a visual feedback of the GUI object's position relative to the boundary.
  • the animation controller logic is responsive to the selected GUI object being moved across the first defined region to a threshold position within the second screen area (e.g. a predefined distance from the original position of the boundary), to change the visual representation of the boundary in a way that shows that the GUI object has moved through the boundary.
  • a selected operation is performed
  • this threshold position is a first threshold distance from the original position of the visual representation of the boundary.
  • another threshold condition of the movement of the GUI object may be the speed of user-controlled movement of an icon across the device's screen (e.g. by a flick gesture applied to an icon) or a combination of position and speed of movement.
  • the behaviour of the boundary varies according to the different functions represented by GUI objects. This can take account of the level of risk or inconvenience to the user associated with an accidental invocation of an operation.
  • a camera icon may pass easily through the membrane as activation of the camera is low risk, whereas activation of an operation to change a phone's settings or to delete data may be given a different response. If the threshold distance that a settings icon must be moved is larger than the distance for a camera, the boundary will appear to stretch further before breaking when a settings icon is moved.
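This risk-dependent behaviour can be sketched as a simple mapping from function to crossing distance; the specific functions and distances below are hypothetical examples, not values from the specification:

```python
# Illustrative per-function crossing thresholds: low-risk operations break
# through the membrane after a short drag, while higher-risk operations
# require a longer stretch before the membrane breaks. All values are
# hypothetical.
CROSSING_DISTANCE = {
    "camera":   30,   # low risk: passes easily through the membrane
    "unlock":   50,
    "settings": 90,   # higher risk: membrane stretches further
    "delete":  120,
}

def membrane_breaks(function_name, distance_dragged):
    """True once the icon for the named function has been dragged far
    enough past the boundary for the membrane to 'break'."""
    return distance_dragged >= CROSSING_DISTANCE[function_name]
```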
  • Figure 1 is a schematic representation of a mobile telephone, as a first example of an electronic device in which the invention may be implemented;
  • Figure 2 is a representation of the software architecture of an electronic device running the Android operating system
  • Figure 3 is a representation of points in a spatial grid, as used to compute the displacement of the boundary at each sampling instant;
  • Figure 4 is an unlock screen of a mobile telephone, in accordance with an embodiment of the present invention.
  • Figure 5 represents the unlock screen of Figure 4 reacting to a user-implemented gesture;
  • Figure 6 is a schematic flow diagram representing a first method according to the invention
  • Figure 7 is a visual representation of the unlock screen when a user interaction is taking place, in accordance with an embodiment of the present invention
  • Figure 8 is a visual representation of the unlock screen just after the user has successfully completed an unlock gesture, according to an embodiment of the present invention;
  • Figure 9 is a schematic diagram illustrating a ballistic unlock motion in accordance with an embodiment of the present invention.
  • Figure 10 is a visual representation of a multi-tiered unlock system of an embodiment of the present invention;
  • Figures 11A and 11B are schematic representations of device screens for an embodiment of the invention for controlling an alarm function;
  • Figures 12A and 12B are schematic representations of device screens for an embodiment of the invention for controlling call handling on a mobile telephone.

Description of Embodiments
  • Touchscreen technology is useful in phones since screen size is limited, and touch screen input provides direct manipulation of the items on the display screen, such that the area normally required by separate keyboards or numerical keypads is saved and can be taken up by the touch screen instead.
  • Figure 1 shows an exemplary electronic device.
  • the device in this example is a mobile telephone handset, comprising a wireless communication unit having an antenna 101 and a radio signal transceiver 102 for two-way communications, such as for GSM and UMTS telephony, and a wireless module 103 for other wireless communications such as WiFi.
  • An input unit includes a microphone 104 and a touchscreen 105.
  • An output unit includes a speaker 106 and a display 107 for presenting iconic and/or textual representations of the device's functions.
  • Electronic control circuitry includes amplifiers 108, a number of dedicated chips providing ADC/DAC signal conversion 109, compression/decompression 110, and encoding and modulation functions 111, circuitry providing connections between these various components, and a microprocessor 112 for handling command and control signalling.
  • Associated with the specific processors is memory, generally shown as memory unit 113. Random access memory (in some cases SDRAM) is provided for storing data to be processed, and ROM and Flash memory for storing the phone's operating system and other instructions to be executed by each processor.
  • a power supply 114 in the form of a rechargeable battery provides power to the phone's functions.
  • the touchscreen 105 provides both an input mechanism and a display for presenting iconic or textual representations of the phone's functions, and is coupled to the microprocessor 112 to enable input via the touchscreen to be interpreted by the processor. There may be a number of separate microprocessors for performing different operations on the electronic device. These features are well known in the art and will not be described in more detail herein.
  • a typical mobile telephone using UMTS telephony also has significant storage capacity within the Universal Integrated Circuit Card (UICC) on which the Universal Subscriber Identity Module (USIM) runs.
  • a smaller amount of storage capacity is provided by the telephone handset's UICC (commonly referred to as the SIM card), which stores the user's service-subscriber key (IMSI) that is needed by GSM telephony service providers for identifying the subscriber and handling authentication.
  • the UICC providing the USIM or SIM typically stores the user's phone contacts and can store additional data specified by the user, as well as an identification of the user's permitted services and network information.
  • the software architecture on a mobile telephone using the Android operating system comprises object oriented (Java and some C and C++) applications 200 running on a Java-based application framework 210 and supported by a set of libraries 220 (including Java core libraries 230) and the register-based Dalvik virtual machine.
  • the Dalvik Virtual Machine is optimized for resource-constrained devices - i.e. battery powered devices with limited memory and processor speed. Java class files are converted into the compact Dalvik Executable (.dex) format before execution by an instance of the virtual machine.
  • the Dalvik VM relies on the Linux operating system kernel for underlying functionality, such as threading and low level memory management.
  • the Android operating system provides support for touchscreens, GPS navigation, cameras (still and video) and other hardware, as well as including an integral Web browser and graphics support and support for media playback in various formats. Android supports various connectivity technologies
  • CDMA Code Division Multiple Access
  • WiFi Wireless Fidelity
  • UMTS Universal Mobile Telecommunications System
  • WiMax Worldwide Interoperability for Microwave Access
  • SMS text messaging and MMS messaging as well as the Android Cloud to Device Messaging (C2DM) framework.
  • Support for media streaming is provided by various plug-ins, and a lightweight relational database (SQLite) provides structured storage management.
  • Currently available Android phones include a wide variety of screen sizes, processor types and memory provision, from a large number of manufacturers. Which features of the operating system are exploited depends on the particular mobile device hardware.
  • the Android operating system provides a screen lock feature to prevent accidental initiation of communications and other operations, and applications are typically inaccessible until the user has performed the required unlock operation. This is additional to the pattern of user interactions that can be set up as a security code. As mentioned above, screen lock features have been considered desirable to prevent unintended activation of device functions (starting a call or another application, or switching off the device).
  • a first embodiment of the present invention that is described in detail below provides logic that is responsive to user-controlled movement of GUI objects to change the visual representation of a boundary between a first screen area and a second screen area.
  • the boundary is represented as a flexible membrane.
  • a portion of the 'membrane' that is proximate a GUI object in the first screen area flexes in response to that GUI object being moved towards the second screen area, such that the boundary remains between the GUI object and the second screen area for a range of movement of the GUI object.
  • This provides visual feedback to the user, showing movement of the GUI object relative to the boundary, and a clear indication of which side of the boundary the GUI object is deemed to be on (i.e. clearly showing when the GUI object is currently in the first screen area).
  • Digital hand-held electronic devices have small display screens and relatively small icons or other GUI objects tend to be used to represent selectable functions. In view of the small screen area, it is necessary to take account of small movements of each GUI object. It can be difficult for users to visualise how close a GUI object is to crossing a boundary, resulting in accidental invocations and delays before desired results are achieved.
  • the present invention uses movement of a GUI object through a boundary to invoke certain operations. GUI objects are movable within a first screen area and operations are invoked by moving the GUI object to a second screen area and then releasing the GUI object while in the second screen area. By providing the user with visual feedback on the progress of their movement of the GUI object, it is possible to reduce the likelihood of unintentional invocation of operations.
  • the icon itself may be obscured by the very same user's finger that is controlling the icon's movement. This is especially true in small touch screen devices that rely on finger gestures as an input mechanism, if each icon is smaller than the user's fingertip.
  • changes to the visual representation of the boundary in response to movement of the icon towards and/or through the boundary can indicate when an icon is close to or has reached the correct position to be released.
  • the boundary is represented visually as a flexible membrane, which stretches in response to movement of an icon, until the movement satisfies the boundary-crossing condition.
  • One or more selectable icons are placed on one side of the membrane.
  • In order to unlock the device screen or to unlock a specific device function, the user must drag one of the icons through the membrane.
  • the membrane is stretched as the selected icon moves part way through the membrane, until a threshold boundary-crossing condition is achieved.
  • When the breaking point is reached, the visual representation of the boundary changes such that the boundary moves to lie on the other side of the icon.
  • This provides a clearer indication of progress towards completion of a boundary crossing operation than conventional display solutions, and provides a clearer indication of when to release the icon.
  • the behaviour of the icons in response to user-controlled movements can be modelled using equations of classical inertial movement with friction.
  • the movement vector is determined by the speed and direction of the user's swiping finger (or a separate input device) while the finger (or input device) is in contact with the screen, and the icon's subsequent movement is determined by equations representing frictional inertial movement, with an initial velocity equal to the velocity at the moment the finger loses contact with the screen.
  • the inertial movement of the icon can be set to continue such that the icon bounces at the edge of the screen - modelled by equations representing elastic collisions. This provides an intuitive user experience because it corresponds to the user's experience of the movement of real world objects.
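A minimal sketch of this frictional inertial movement with an elastic bounce at the screen edge might look as follows; the friction coefficient, restitution factor and screen width are illustrative assumptions, not values from the specification:

```python
# Sketch of frictional inertial movement with elastic collisions at the
# screen edges, as described above. All constants are illustrative.

FRICTION = 2.0      # exponential velocity decay rate (1/s)
RESTITUTION = 0.8   # fraction of speed kept after an elastic bounce
SCREEN_WIDTH = 480  # pixels

def step(x, vx, dt):
    """Advance the icon by one timestep of inertial movement with friction,
    bouncing elastically at the screen edges."""
    x += vx * dt
    vx *= max(0.0, 1.0 - FRICTION * dt)   # friction slows the icon
    if x < 0:                             # elastic collision at left edge
        x, vx = -x, -vx * RESTITUTION
    elif x > SCREEN_WIDTH:                # elastic collision at right edge
        x, vx = 2 * SCREEN_WIDTH - x, -vx * RESTITUTION
    return x, vx

# After the finger is released, the icon coasts, slows and bounces; the
# initial velocity is the velocity at the moment the finger left the screen.
x, vx = 470.0, 800.0
for _ in range(10):
    x, vx = step(x, vx, 1 / 60)
```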
  • the inventors of the present invention have developed a solution that mitigates the problem of unintentional invocation while providing clear and intuitive feedback to the user regarding how close an icon is to crossing a boundary between the first and second screen areas.
  • the intuitive visual feedback of the flexible membrane guides the user when intentionally moving an icon across the boundary from the first screen area to the second screen area, and when touching icons without wishing to cross the boundary, giving the user a greater sense of control.
  • the solution is particularly useful for touch screen devices and the embodiment described in detail herein is implemented on a touch screen device.
  • the membrane is modelled as a unidimensional vibrating string with losses.
  • the algorithm that controls the movement of the membrane uses a wave equation with a damping factor.
  • the membrane is constrained by the icon; whenever the icon is dragged and put in contact with a part of the membrane, that part of the membrane will be constrained to follow the icon until the breaking point is reached.
  • Once the breaking point is passed, the icon will continue its movement according to the physics engine described above and the membrane position will evolve according to the wave equation with damping.
  • the wave equation is solved numerically using a Finite Difference Time Domain (FDTD) method.
  • Second derivatives are approximated using a central difference approximation.
  • the second derivative with respect to space is approximated by the central difference ( y(i+1, t_j) - 2*y(i, t_j) + y(i-1, t_j) ) / dx^2, where y(i, t_j) denotes the displacement at point i of the spatial grid at sampling instant t_j.
  • the value of the displacement at t_(j-1) is only needed for the displacement at the current point in the spatial grid and therefore it does not need to be stored in memory once the current point has been computed.
  • This allows highly efficient processing, which can be implemented using program code instructions, since only two vectors need to be defined: one containing the next positions of the membrane ( Y[i] = y(i, t_(j+1)) ) and one containing the previous positions of the membrane ( Z[i] = y(i, t_j) ).
  • the current membrane position can be computed as:
  • Y[i] = F( Y[i], Z[i-1], Z[i], Z[i+1] )
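The two-vector update described above can be sketched in code as follows. This is one illustrative reading of the scheme, with hypothetical values for the grid size, Courant number and damping constant; the `constrained` argument models the part of the membrane that is pinned to a dragged icon:

```python
# Sketch of one timestep of the FDTD scheme for the damped vibrating string
# that models the membrane. Grid size, Courant number and damping constant
# are illustrative assumptions, not values from the specification.

N = 64        # points in the spatial grid
C = 0.3       # Courant number c*dt/dx (kept below 1 for stability)
GAMMA = 0.05  # damping term gamma*dt/2 from the wave equation with losses

def fdtd_step(Y, Z, constrained=None):
    """Advance the membrane by one sampling instant.

    On entry Y[i] holds the previous displacement y(i, t_j-1) and Z[i] the
    current displacement y(i, t_j); Y is overwritten in place with
    y(i, t_j+1), so only two vectors are ever stored, as described above.
    Grid points listed in `constrained` (the part of the membrane in
    contact with a dragged icon) are pinned to the icon's displacement.
    """
    constrained = constrained or {}
    for i in range(1, N - 1):
        if i in constrained:
            Y[i] = constrained[i]  # membrane follows the icon here
            continue
        # damped leapfrog update: central differences in time and space
        Y[i] = (2 * Z[i] - (1 - GAMMA) * Y[i]
                + C * C * (Z[i + 1] - 2 * Z[i] + Z[i - 1])) / (1 + GAMMA)
    Y[0] = Y[N - 1] = 0.0  # ends of the membrane are fixed
    # the vectors swap roles for the next step: Y is now current, Z previous
    return Z, Y
```

Once the icon breaks through the membrane, the constraint is dropped and repeated unconstrained calls let the displacement decay under the damping term.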
  • Mobile electronic devices such as mobile telephones usually implement some form of user interface lock mechanism that differentiates between a locked and unlocked condition, in order to prevent the user from inadvertently pressing buttons on their device when it is not in use.
  • an unlock screen is commonly presented to the user before other operations can be performed.
  • the unlock screen is typically presented to the user after awakening the device from a sleep state. In the sleep state, most user interface functionality of the device is disabled, involving for example the screen being turned off and being non- responsive to tactile feedback.
  • the unlock screen facilitates the 'unlocking' of the device through a predefined user interaction, such that upon performing this action (and possibly a subsequent security action) the user regains full control of all functionality of the device.
  • a dedicated hardware key for lock/unlock operation is provided.
  • Locks may be used to prevent various functionalities of the device from being accidentally activated or, when used in conjunction with a security mechanism such as a passcode entry, to prevent unauthorized access to certain functions.
  • this unlock screen is divided into two portions: a first screen area 10 and a second screen area 20, divided by a boundary 30 which may comprise a straight horizontal line across the width of the screen.
  • the first screen area contains one or more icons 40 representing functions that the user can work with.
  • the device ignores all user input when in a locked state except unlock requests and interactions with selectable icon or icons in the first screen area of the device.
  • unlock requests can include pressing of dedicated hardware function keys or sequences of key presses, as well as the unlock operation described in detail below.
  • user input is directed to the first screen area to activate a chosen function, examples of which include but are not limited to an unlock function, a camera function activating the device's built-in camera, and a data capture function that presents a data entry field for displaying user typed or spoken information.
  • the activation of any functions represented by icons in the first screen area involves the user executing a gesture.
  • the gesture may involve movement of a user's finger or another input device while in contact with the touch screen of the device.
  • An example of such a gesture is a user tapping the screen with a finger in the location of an icon.
  • Gestures may incorporate, but are not limited to, discrete touches of the touch screen, continuous motion along the touch screen (e.g. touch-and-drag operations that move an icon) or a combination thereof.
  • the user executes an 'activation gesture' 42 to activate one of the selectable functions represented by icons in the first screen area.
  • This gesture involves touching the screen with an input device or the user's finger in the location of the desired icon 40; while remaining in contact with the screen, moving the icon by a drag operation from the first screen area 10 towards the second screen area 20, and subsequently removing the input device or finger from the screen after the icon has crossed through the boundary 30 and reached the second screen area 20.
  • Selection of an icon causes visual cues designed to assist the user in successfully executing the activation gesture 42 to be presented in the second screen region 20.
  • text 50 is displayed in the second screen region 20, informing the user of the motion required to activate the gesture.
  • the boundary deforms 44 in response to the presence of the desired icon 40, such that the user is able to gauge the degree to which the activation gesture has been completed.
  • cues are merely indicative of the motion the user should execute to complete an activation gesture and do not constrain the path of the gesture.
  • Other types of cue may also be employed, such as providing the user with auditory or tactile feedback indicating the transition through the boundary 30 and/or success or failure of an unlock operation.
  • when an icon is selected by the user, the icon is moved automatically to position its center directly below the point at which the user contacts the screen.
  • the user's view of the icon is at least partially obscured while the icon is being moved but, unlike in prior art solutions, visual feedback is still available to the user via the flexing of the boundary.
  • the logic underlying the gesture recognition and unlock process is shown in Figure 6.
  • the user input gesture 310 is detected 320 and compared to a list of predefined actions 330 stored within the device. If this comparison operation 340 results in the finding that the user input gesture 310 corresponds to the predefined general unlock gesture then the screen is unlocked 350, restoring full functionality to the device.
  • if comparison operation 340 results in the finding that the user input gesture does not correspond to the predefined general unlock gesture, a second comparison operation 360 is invoked in order to determine whether the user input gesture 310 matches one of a plurality of predefined function-specific unlock gestures. If such a match is found, the device is unlocked and the function to which the identified gesture corresponds is invoked 380. Finally, if no match is found by either comparison operation 340 or 360, an unlock reminder 370 is displayed on the screen of the device.
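The two-stage matching logic of Figure 6 can be sketched as follows. This is an illustrative outline only; the gesture identifiers, the function table, and the `handle_gesture` helper are hypothetical names, not taken from the application.

```python
# Hypothetical gesture identifiers standing in for recognized input gestures (310).
GENERAL_UNLOCK = "drag_unlock_icon_through_boundary"
FUNCTION_GESTURES = {
    "drag_camera_icon_through_boundary": "camera",
    "drag_phone_icon_through_boundary": "phone",
}

def handle_gesture(gesture):
    """Return the action taken for a detected user input gesture."""
    if gesture == GENERAL_UNLOCK:        # first comparison operation (340)
        return ("unlock", None)          # unlock the screen (350)
    if gesture in FUNCTION_GESTURES:     # second comparison operation (360)
        # unlock and invoke the matching function-specific action (380)
        return ("unlock_and_launch", FUNCTION_GESTURES[gesture])
    return ("show_reminder", None)       # unlock reminder (370)
```

The two comparisons fall through in the order described: general unlock first, then the function-specific gestures, and only then the reminder.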
  • Figure 7 illustrates an example of the preferred method of indication, using the example of an unlock icon. That is, the icon represents a general unlock function to enable the user to work with various screen functions. In this example, there are other icons presented in the first screen area, including a camera icon. These other available icons represent functions that do not require a separate unlock operation before they are activated.
  • a selected icon 60 is highlighted on the screen by displaying a coloured bubble 70 containing the icon 60. This creates a larger GUI object than the original icon alone and so makes subsequent user-interaction with the icon easier to visualize.
  • the larger GUI object causes the boundary between the first and second screen areas to be distorted in the region proximate the selected icon 60. Further differentiation from the unselected remaining icons is provided by redrawing the larger GUI object in colour, as opposed to the black and white representation that is used for the GUI objects corresponding to unselected icons.
  • An activation gesture will only activate the corresponding device functionality if completed.
  • the user interface state just after the completion point of an activation gesture is shown in Figure 8. Note that the boundary has sprung back from its stretched condition, so that its behaviour more closely approximates that of a flexible membrane on a fluid surface.
  • An activation gesture is defined as being completed if it results in an icon fully penetrating the interface 30, such that the extents of the icon are fully within the second screen region 20.
  • the implementation can involve identifying when an outer extremity of an icon reaches a threshold point that is defined to be a specific number of pixels beyond the visual representation of the boundary, such that the icon must distort the boundary by a specific distance before the movement of the icon is recognized as completion of an activation event.
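The threshold test described above reduces to a simple coordinate comparison. A minimal sketch, assuming screen y-coordinates grow downward with the second screen area below the boundary; the 12-pixel threshold is an assumed placeholder value.

```python
def activation_complete(icon_leading_edge_y, boundary_y, threshold_px=12):
    """True once the icon's outer extremity has passed the boundary
    by at least threshold_px pixels, i.e. the icon has distorted the
    boundary far enough for the activation event to be recognized."""
    return icon_leading_edge_y >= boundary_y + threshold_px
```

With a boundary at y=400 and the default threshold, an icon edge at y=412 completes the activation while one at y=405 does not.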
  • the choice between different implementations may vary according to the device type, to ensure a suitable visual feedback to users regardless of the device type.
  • the icon that the user attempted to activate returns to the first screen area. This is implemented as a smooth movement taking a short but finite period of time (typically a fraction of a second).
  • the icon's highlighting (e.g. the coloured bubble) is removed, and the icon and boundary thus return to their original states 40, 30 (i.e. their states before user selection) over a short time period. The user may then select a different icon, or reselect the same icon.
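The smooth return movement can be modelled as a time-based interpolation from the release point back to the icon's home position. The linear easing and the 0.25 s duration are assumptions standing in for the "fraction of a second" described above.

```python
def return_position(start, home, t, duration=0.25):
    """Position of a returning icon t seconds after release: linear
    interpolation from the release point `start` back to the icon's
    `home` position in the first screen area."""
    f = min(max(t / duration, 0.0), 1.0)  # clamp progress to [0, 1]
    return (start[0] + (home[0] - start[0]) * f,
            start[1] + (home[1] - start[1]) * f)
```

A real implementation would typically use an eased (non-linear) curve for a more natural spring-back, but the clamping ensures the icon rests exactly at its home position once the duration elapses.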
  • the response of the interface to the motion of an icon during an activation gesture is to flexibly deform around the icon.
  • the extent of deformation is dependent on the extent to which the icon occupies the second screen area 20 (where increased occupation of the second screen area 20 corresponds to increased deformation), in addition to the speed at which the user executes the activation gesture.
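A deformation amount with both of these dependencies could be computed as below. The linear weighting, the coefficient values, and the cap are illustrative assumptions; the application only states that deformation increases with occupation of the second screen area and with gesture speed.

```python
def boundary_deflection(overlap_px, speed_px_s, k_overlap=1.0,
                        k_speed=0.01, max_deflection=80):
    """Deflection of the boundary (in pixels): grows with how far the
    icon has entered the second screen area (overlap_px) and with the
    drag speed (speed_px_s), capped so the visual effect stays bounded."""
    d = k_overlap * max(overlap_px, 0) + k_speed * max(speed_px_s, 0)
    return min(d, max_deflection)
```

The cap keeps the rendered membrane from stretching off-screen on very fast gestures, which matters on the small displays the invention targets.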
  • the user is alerted to the point during an activation gesture at which the selected icon has moved fully into the second screen area by the appearance of the interface, which is redrawn at this stage 46 to appear as a bounding line excluding the selected icon from the first screen area 10.
  • subsequent removal of the user's finger or input device from the touch screen will cause the device functionality corresponding to the selected icon to become activated.
  • in other embodiments, the progress towards completion of an activation is presented visually by changing the visual representation of the boundary in other ways.
  • completion of the activation operation results in immediate display of a screen of the activated function, without redrawing the boundary to show a completed state.
  • Another embodiment of the invention involves a setup as depicted in the first embodiment but employs the alternative ('ballistic') activation gesture of Figure 9, comprising: touching the screen with an appendage or input device 80 in the location of the desired icon 90 and subsequently making a rapid, short flicking motion of the appendage or input device towards the second screen area 20, ending in the appendage or input device losing contact with the touch screen of the device.
  • This gesture causes the selected icon to behave as if momentum were imparted to it, causing it to move in a short time period through the boundary and into the second screen area.
  • the motion of the icon is along the trajectory 100, which is defined by the user's motion. If the trajectory is such that the icon contacts the edges of the screen 48, it rebounds elastically and follows an accordingly calculated trajectory 52.
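The elastic rebound can be implemented by reflecting the icon's position and reversing the corresponding velocity component when it crosses a screen edge. A sketch for the vertical edges only; a full implementation would handle the top and bottom edges the same way, and all names here are hypothetical.

```python
def step_with_rebound(pos, vel, dt, width):
    """Advance the flicked icon one time step along its trajectory;
    if it crosses a vertical screen edge it rebounds elastically
    (position mirrored about the edge, horizontal velocity reversed)."""
    x = pos[0] + vel[0] * dt
    y = pos[1] + vel[1] * dt
    vx, vy = vel
    if x < 0:
        x, vx = -x, -vx               # reflect off the left edge
    elif x > width:
        x, vx = 2 * width - x, -vx    # reflect off the right edge
    return (x, y), (vx, vy)
```

Elastic reflection preserves the icon's speed, so a flick strong enough to cross the boundary remains strong enough after a rebound, matching the momentum model described below.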
  • the amount of momentum imparted to the icon is a function of the gesture input from the user, with a threshold momentum value being set, above which the icon is able to cross the boundary and move from the first to second screen area. An icon with a momentum of less than this threshold value will not cross the interface and hence remains in the first screen area, such that the icon's corresponding functionality is not activated.
  • the threshold value is chosen such that it is unlikely to be reached without intentional user input, hence allowing the activation gesture to serve as a means for differentiating between accidental and intended user input.
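The momentum threshold might be realized as below. The notional unit mass and the 900 px/s threshold are assumed placeholder values; the application specifies only that momentum is a function of the gesture input and that the threshold should be hard to reach accidentally.

```python
def flick_momentum(release_speed_px_s, icon_mass=1.0):
    """Pseudo-momentum imparted to the icon by the flick gesture."""
    return icon_mass * release_speed_px_s

def crosses_boundary(release_speed_px_s, threshold=900.0):
    """The icon crosses into the second screen area (activating its
    function) only if the flick momentum exceeds the threshold;
    weaker flicks leave the icon in the first screen area."""
    return flick_momentum(release_speed_px_s) > threshold
```

Tuning the threshold trades responsiveness against accidental activation: a pocket brush rarely produces a fast, directed release velocity, while a deliberate flick easily does.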
  • an icon with its full extent occupying the second screen area will cause the device functionality corresponding to said icon to become active.
  • This form of user input does not depend on setting a predefined path for the user to follow, as it can take account of gesture recognition including direction, speed and the length of the arc defined between a first touch and cessation of touch of a finger or input device with the touchscreen.
  • the screen area in which the gesture needs to be enacted is not predefined by a separate gesture-responsive progress bar, allowing the user to carry out a simple and intuitive gesture to activate a required device function.
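The direction, speed, and arc-length attributes mentioned above can all be derived from the sampled touch positions and timestamps of a free-form gesture. A sketch of that kind of feature extraction, not the actual implementation:

```python
import math

def gesture_features(points, times):
    """Direction (radians), mean speed (px/s), and arc length (px)
    of a gesture, from sampled (x, y) touch positions and their
    timestamps between first touch and cessation of touch."""
    arc = sum(math.dist(points[i], points[i + 1])
              for i in range(len(points) - 1))
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    direction = math.atan2(dy, dx)      # net direction of travel
    duration = times[-1] - times[0]
    speed = arc / duration if duration > 0 else 0.0
    return direction, speed, arc
```

Because recognition works from these measured attributes rather than from a fixed track, the user's finger is free to take any path toward the second screen area.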
  • the user can invoke any of a first set of functions using the method of dragging a respective icon through a boundary from a first screen area to a second screen area, as described in the embodiments above.
  • a first set of selectable icons corresponding to these selectable functions is placed in the first screen region, and the user can activate the respective functions even when the device is in a locked state.
  • This provides the user with a fast route to activating certain functions, circumventing the more usual activation sequence of performing a predefined unlock sequence, locating the application or widget of interest and separately activating the application or widget.
  • access to functionalities that are unnecessarily protected via passcode entry in conventional devices can be provided to users without knowledge of device security codes via the circumvention of the device unlock sequence.
  • a second set of functions require an unlock icon to be moved through the boundary as a first step before the user can work with the second set of functions. This provides a multilevel unlock capability.
  • a tiered unlock functionality is provided such that a plurality of unlock levels offering different user access rights may be assigned to device functions by the device user.
  • two such unlock levels are provided: a limited unlock mode, offering limited device functionality but requiring no passcode to complete an unlock operation, and a full unlock mode, offering full functionality but requiring passcode entry to complete an unlock operation and gain access to said functionality.
  • the logic of such a structure is shown in Figure 10. Secured items 110 require passcode entry before allowing the user to access their functionality, whereas unsecured items 120 are accessible directly from the unlock screen 230 without passcode entry.
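The Figure 10 access logic can be summarized in a few lines. The item names used here are made up for illustration; which functions are secured versus unsecured is assignable by the device user, per the tiered-unlock description above.

```python
SECURED = {"email", "contacts"}    # items 110: passcode required
UNSECURED = {"camera", "music"}    # items 120: direct from unlock screen

def open_item(item, passcode_ok=False):
    """Unsecured items open directly from the unlock screen;
    secured items open only after successful passcode entry."""
    if item in UNSECURED:
        return "opened"
    if item in SECURED:
        return "opened" if passcode_ok else "passcode_required"
    return "unknown_item"
```

This captures the two unlock levels: the limited mode reaches only the unsecured set, while the full mode (after passcode entry) reaches everything.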
  • a first set of functions is provided via icons in the first screen area, which can be activated by moving an icon into the second screen area.
  • the invention provides a more general selection and feedback mechanism that is not limited to the lock screen example given above.
  • Figures 11A and 11B show the example of an alarm screen on a mobile telephone, with an icon representing an operation to switch off the alarm sound and a second icon representing a "snooze" function (i.e. silence the alarm but repeat in a set period of time).
  • the mechanism of moving an icon through the boundary can be used to switch off the alarm's sound and to activate "snooze", or an alternative simple touch-to-activate mechanism may be used for the "snooze" function.
  • another example is shown in Figures 11A and 11B, where a call silence function is selectable via an icon, which is movable from a first to a second screen area and can be released in the second screen area to silence the call.
  • two additional icons are represented. These correspond to a call accept function and a call reject function, both of which can be activated by dragging the relevant icon through the boundary.
  • This shows that the present invention can be used as a very general activation mechanism, and/or in combination with other activation mechanisms.
  • the invention can also be used as a general switch.
  • a settings page on an electronic device can be provided with icons that are switchable between 'On' and 'Off' states by moving the icons between screen areas through a boundary.
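Used as a switch, the boundary crossing simply flips a setting each time its icon is dragged through it. A minimal sketch, with the state modelled as a dict of hypothetical setting names to booleans:

```python
def move_through_boundary(state, icon):
    """Dragging a settings icon through the boundary moves it from
    the 'On' area to the 'Off' area or vice versa, toggling its
    state; returns the new state of that setting."""
    state[icon] = not state[icon]
    return state[icon]
```

The icon's current screen area then doubles as the visual indication of the setting's state, with the boundary separating the two groups.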
  • a single boundary can thus be used to visually separate two groups of GUI objects, indicating a different status or state between the two groups.
  • the invention is not limited to a single boundary, and there may be multiple boundaries between multiple screen areas into which GUI objects can be moved.
  • the multiple screen areas may be associated with multiple different levels of protection from accidental invocation.
  • the boundaries can be located at any required position on the screen, either fixed or movable, without being limited to horizontal surfaces. Functions can thus be unlocked on an electronic device by moving GUI objects associated with selectable functions to the current screen location of a target unlock screen area.
  • the boundary crossing condition is evaluated with reference to the current position of the boundary, and the crossing of the boundary is indicated visually to the user.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method, apparatus, and logic implemented in software or hardware for responding to user interaction with graphical user interface (GUI) objects by modifying the visual appearance of a boundary to which a GUI object is proximate. This provides the user with feedback as the GUI object approaches and crosses the boundary, which is particularly advantageous on touch-screen devices and on electronic devices, such as mobile telephones, that have small screen areas. Operations can be invoked by gestures that move a GUI object from a first to a second screen area, and the visual feedback of a flexing boundary reduces the likelihood of unintended invocations.
PCT/GB2012/000053 2011-01-21 2012-01-20 Dispositif électronique et procédé à gestion de verrouillage et interaction de l'utilisateur améliorées WO2012098360A2 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB1101137.6 2011-01-21
GB201101137A GB201101137D0 (en) 2011-01-21 2011-01-21 Electronic device and method with improved lock management and user interaction
GB1102586.3 2011-02-14
GB201102586A GB2487440A (en) 2011-01-21 2011-02-14 Changing the boundary between screen areas when a GUI object is moved between the screen areas.

Publications (2)

Publication Number Publication Date
WO2012098360A2 true WO2012098360A2 (fr) 2012-07-26
WO2012098360A3 WO2012098360A3 (fr) 2012-11-29

Family

ID=43769476

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2012/000053 WO2012098360A2 (fr) 2011-01-21 2012-01-20 Dispositif électronique et procédé à gestion de verrouillage et interaction de l'utilisateur améliorées

Country Status (2)

Country Link
GB (2) GB201101137D0 (fr)
WO (1) WO2012098360A2 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014126948A1 (fr) * 2013-02-14 2014-08-21 Facebook, Inc. Écran de verrouillage doté d'applications socialisées
CN104346039A (zh) * 2014-08-06 2015-02-11 深圳市金立通信设备有限公司 一种终端

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014060129A1 (fr) * 2012-10-19 2014-04-24 Telefonica Digital España, S.L.U. Procédé et dispositif de déverrouillage de dispositif électronique amélioré
EP2747288A1 (fr) * 2012-12-19 2014-06-25 Siemens Aktiengesellschaft Dispositif de commande et procédé de reconnaissance de la manipulation de manoeuvre de libération
CN103106034A (zh) * 2013-02-05 2013-05-15 中标软件有限公司 一种电子设备及其屏幕或应用的解锁方法及系统

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6545669B1 (en) * 1999-03-26 2003-04-08 Husam Kinawi Object-drag continuity between discontinuous touch-screens
US7657849B2 (en) * 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
US20070192719A1 (en) * 2006-02-10 2007-08-16 Microsoft Corporation Hover indicator for objects
KR20070113018A (ko) * 2006-05-24 2007-11-28 엘지전자 주식회사 터치스크린 장치 및 그 실행방법
JP4759743B2 (ja) * 2006-06-06 2011-08-31 国立大学法人 東京大学 オブジェクト表示処理装置、オブジェクト表示処理方法、およびオブジェクト表示処理用プログラム
US8619038B2 (en) * 2007-09-04 2013-12-31 Apple Inc. Editing interface
KR20100010860A (ko) * 2008-07-23 2010-02-02 엘지전자 주식회사 이동 단말기 및 그의 이벤트 제어방법
US8863038B2 (en) * 2008-09-08 2014-10-14 Qualcomm Incorporated Multi-panel electronic device
US8539382B2 (en) * 2009-04-03 2013-09-17 Palm, Inc. Preventing unintentional activation and/or input in an electronic device
US8863007B2 (en) * 2009-04-03 2014-10-14 International Business Machines Corporation Programmatic information transfer
KR101564222B1 (ko) * 2009-05-26 2015-11-06 삼성전자주식회사 휴대단말의 잠금 모드 해제 방법 및 장치
TWI402741B (zh) * 2009-05-27 2013-07-21 Htc Corp 解除螢幕鎖定的方法、行動電子裝置及電腦程式產品

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014126948A1 (fr) * 2013-02-14 2014-08-21 Facebook, Inc. Écran de verrouillage doté d'applications socialisées
US9124545B2 (en) 2013-02-14 2015-09-01 Facebook, Inc. Lock screen with socialized applications
JP2016511900A (ja) * 2013-02-14 2016-04-21 フェイスブック,インク. ソーシャル化されたアプリケーションを備えるロック画面
KR20160089538A (ko) * 2013-02-14 2016-07-27 페이스북, 인크. 소셜형 애플리케이션을 가진 잠금 스크린
US10241645B2 (en) 2013-02-14 2019-03-26 Facebook, Inc. Lock screen with socialized applications
KR102118393B1 (ko) 2013-02-14 2020-06-03 페이스북, 인크. 소셜형 애플리케이션을 가진 잠금 스크린
CN104346039A (zh) * 2014-08-06 2015-02-11 深圳市金立通信设备有限公司 一种终端

Also Published As

Publication number Publication date
GB201102586D0 (en) 2011-03-30
GB201101137D0 (en) 2011-03-09
WO2012098360A3 (fr) 2012-11-29
GB2487440A (en) 2012-07-25

Similar Documents

Publication Publication Date Title
US10162478B2 (en) Delay of display event based on user gaze
US10156980B2 (en) Toggle gesture during drag gesture
US11307758B2 (en) Single contact scaling gesture
TWI536243B (zh) 電子裝置、此電子裝置的控制方法以及電腦程式產品
EP2434388B1 (fr) Dispositif électronique portable et son procédé de commande
US9594504B2 (en) User interface indirect interaction
EP2434387B1 (fr) Dispositif électronique portable et son procédé
US9158399B2 (en) Unlock method and mobile device using the same
KR101598737B1 (ko) 터치 제어 단말 및 터치 제어 언로크 방법 및 장치
US20140152585A1 (en) Scroll jump interface for touchscreen input/output device
US9632694B2 (en) Initiation of actions by a portable computing device from a locked state
WO2014046855A1 (fr) Protection pour des entrées involontaires
JP2012194842A (ja) 情報処理装置、情報処理方法およびプログラム
KR20160027775A (ko) 터치 입력 처리 방법 및 그 장치
WO2012098360A2 (fr) Dispositif électronique et procédé à gestion de verrouillage et interaction de l'utilisateur améliorées
KR20150012396A (ko) 입력 처리 방법 및 그 전자 장치
EP2869167A1 (fr) Dispositif de traitement, procédé et programme de commande de fonctionnement
EP3211510B1 (fr) Dispositif électronique portatif et procédé pour fournir une rétroaction haptique
JP5872111B2 (ja) 画像上でのジェスチャを使用するプログラマブル機器上のアプリケーションの起動
CN102707869B (zh) 电子装置以及其控制方法
EP2741194A1 (fr) Interface de saut de défilement pour dispositif d'écran tactile entrée/sortie
WO2012098361A1 (fr) Appareil et procédé pour interaction d'utilisateur améliorée dans des dispositifs électroniques

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12704099

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12704099

Country of ref document: EP

Kind code of ref document: A2
