GB2524781A - Hidden user interface for a mobile computing device


Info

Publication number
GB2524781A
GB2524781A
Authority
GB
United Kingdom
Prior art keywords
mobile computing
computing device
touch
sensitive display
input gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1405942.2A
Other versions
GB201405942D0 (en)
Inventor
Mark Hawkins
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to GB1405942.2A
Publication of GB201405942D0
Publication of GB2524781A
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Abstract

A method for controlling a mobile computing device having a touch-sensitive display which displays a graphical user interface (GUI) comprises: detecting a predetermined input touch gesture being applied to the display, where no visual prompt for the gesture is displayed by the device; and changing a state of the mobile device in response to detecting the gesture. Since no visual prompt or clue about the gesture is displayed, the associated functionality could be taken to be part of a hidden supplementary user interface (HUI) disposed over the GUI. The change of state might comprise executing an application, changing an operating mode, locking the mobile device or switching off/turning on the mobile device. The predetermined input gesture might comprise a drawn shape, a drag, a pinch or a swipe/stroke. The user might provide input which defines the predetermined gesture.

Description

HIDDEN USER INTERFACE FOR A MOBILE COMPUTING DEVICE
FIELD OF THE INVENTION
[0001] This invention relates to mobile computing devices. In particular, the invention relates to the provision of a hidden user interface for a mobile computing device.
BACKGROUND
[0002] This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present techniques, which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this
light, and not as admissions of prior art.
[0003] Mobile computing devices, such as personal digital assistants and smartphones, have become increasingly popular and are now commonplace. Mobile computing devices have many uses, from consuming content (e.g., textual and video content) to performing a variety of tasks (e.g., performing a search, composing email, etc.). However, the typical size of a mobile computing device provides limited space for displaying content. Further, for touch screen (or touch-sensitive) mobile computing devices, the available display space is even more limited since the content must share the display with controls for interacting with the content. For example, a typical mobile application includes controls, such as buttons and/or menus, that allow the user to navigate and manipulate content displayed in the mobile application. However, these controls occupy space that could otherwise be used for displaying content of the mobile application.
[0004] Also, users may find it difficult to perform tasks using a mobile computing device and/or navigate between multiple mobile applications when various buttons and/or menus must be used or navigated in order to perform the task(s). For example, a user may need to navigate between multiple web browsers and/or applications to access desired information or perform a particular task.
SUMMARY
[0005] According to the invention, there is provided a mobile computing device comprising: a touch-sensitive display adapted to display a graphic user interface and to control the mobile computing device based on touches sensed by the touch-sensitive display; and a hidden supplementary user interface adapted to detect a predetermined input gesture applied to the touch-sensitive display and to change a state of the mobile computing device in response to detecting the predetermined input gesture, wherein, in use, no visual prompt as to the hidden supplementary user interface is displayed by the touch-sensitive display at least prior to detecting the predetermined input gesture.
[0006] Embodiments may therefore provide for the implementation of a hidden or invisible user interface in addition to a Graphic User Interface (GUI) (which is conventionally displayed by a touch-sensitive display of a mobile computing device). Such a Hidden User Interface (HUI) may be considered to be a supplementary user interface that exists concurrently with the graphic interface of the mobile computing device. The HUI may therefore be visualized as being transparent such that no visual prompt as to the existence of the HUI is displayed to a user of the mobile computing device, thus leaving the visual appearance of the GUI unchanged. The HUI may therefore rely on a user's prior knowledge of its existence or activation.
[0007] In embodiments, the HUI may detect one of a set of predetermined input gestures applied to the touch-sensitive display of a mobile computing device and then change a state of the mobile computing device in response to detecting the predetermined input gesture. For example, the operating mode of a mobile computing device may be changed as a result of detecting a specific user input gesture. The HUI may therefore intercept certain gestures and control or alter the operation of the mobile computing device before they are provided to the operating system of the mobile computing device. Embodiments may therefore provide an invisible control interface disposed above the normal GUI of the mobile computing device.
Such a HUI may therefore act like a filter layer above the graphic user interface, wherein the hidden or invisible user interface is adapted to detect certain input gestures applied to a touch-sensitive display. If an input gesture is determined not to be one of a set of predetermined input gestures, the HUI may simply ignore the input gesture and it is passed to the GUI or operating system of the mobile computing device for normal processing. If, however, an input gesture is determined to be one of a set of predetermined input gestures, the HUI may prevent the gesture from being passed to the GUI or operating system of the mobile computing device (i.e. intercept the input gesture) and instead cause the mobile computing device to perform one or more actions/operations associated with the gesture (that would not have otherwise been performed by the operating system of the mobile computing device).
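A minimal sketch, in Java, of how such a filter layer might be structured. The HiddenUserInterface class, its string gesture identifiers and the pass-through callback are hypothetical stand-ins introduced purely for illustration; the embodiments do not prescribe any particular API or implementation.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of a HUI acting as a filter layer above the GUI.
class HiddenUserInterface {
    // Predetermined gestures recognised by the HUI, mapped to the
    // state-changing actions they trigger.
    private final Map<String, Runnable> predeterminedGestures = new HashMap<>();

    void register(String gestureId, Runnable action) {
        predeterminedGestures.put(gestureId, action);
    }

    // Called for every gesture sensed by the touch-sensitive display,
    // before the gesture reaches the GUI or the operating system.
    // Returns true when the gesture was intercepted by the HUI.
    boolean onGesture(String gestureId, Runnable passToGui) {
        Runnable action = predeterminedGestures.get(gestureId);
        if (action == null) {
            passToGui.run();  // not predetermined: pass through for normal processing
            return false;
        }
        action.run();         // predetermined: intercept and change the device state
        return true;
    }
}
```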
[0008] The set of predetermined input gestures may comprise one or more gestures that are not recognized by the operating system of the mobile computing device. Further, a predetermined input gesture may be defined by a user of the mobile computing device and associated with one or more actions or instructions that, when executed, change a state of the mobile computing device.
[0009] Gestures are ways to invoke an action or operation, similar to clicking a toolbar button or typing a keyboard shortcut. Gestures may be performed with a pointing device (including but not limited to a mouse, stylus, hand and/or finger). A gesture typically has a shape, pose or movement associated with it and may comprise a unique physical touch applied to a touch-sensitive surface. Such a gesture may be as simple as a stationary pose or a straight line movement or as complicated as a series of movements or poses. Robust detection of user inputs and/or gestures is therefore a factor that can influence a user's interaction experience with a system.
[00010] In an embodiment, the predetermined input gesture may comprise at least one of: a drawn shape; a drag; a pinch; a swipe; and a stroke.
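Purely as an illustration of how a drawn-shape gesture might be recognised (the embodiments do not mandate any recognition method), one common approach is to resample the user's stroke and a stored template to the same number of points and compare them point by point, in the style of the well-known "$1" family of recognisers. All class and method names below are assumptions:

```java
import java.util.List;

// Illustrative shape matcher: a small mean point-to-point distance between
// a resampled stroke and a resampled template indicates a similar shape.
// Assumes non-empty strokes of (x, y) points and n >= 2.
class StrokeMatcher {
    // Pick n points spaced evenly along the stroke by index (a crude
    // approximation; production recognisers space points by path length
    // and also normalise scale, rotation and position).
    static double[][] resample(List<double[]> stroke, int n) {
        double[][] out = new double[n][];
        for (int i = 0; i < n; i++) {
            out[i] = stroke.get(i * (stroke.size() - 1) / (n - 1));
        }
        return out;
    }

    static double meanDistance(List<double[]> stroke, List<double[]> template, int n) {
        double[][] a = resample(stroke, n), b = resample(template, n);
        double sum = 0;
        for (int i = 0; i < n; i++) {
            sum += Math.hypot(a[i][0] - b[i][0], a[i][1] - b[i][1]);
        }
        return sum / n;  // below some chosen threshold => treat as a match
    }
}
```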
[00011] The nature and/or parameters of a detected input gesture may be used to define the nature of the action(s) or operation(s) performed by the mobile computing device. In other words, in some examples, the mobile computing device may change state according to the nature and/or parameters of an input gesture detected by the touch-sensitive surface of the device. For example, if a user were to gesture a particular shape, the mobile computing device may change its operating mode in a way that is associated with the shape of the gesture.
[00012] Thus, in some embodiments, the mobile computing device may have a change in state that reflects the nature of the detected input gesture. The change in state of the mobile computing device may be in a manner that is representative of the input gesture provided by a user of the device or may be as a result of performing one or more operations which are defined and associated with the input gesture by a user of the device.
[00013] Reference to a mobile computing device having a touch-sensitive display should be understood to refer to any suitable portable computing device having a touch screen or touch-sensitive surface at least partially overlaying or underlying the display of the device. Current examples of such mobile computing devices may include: a mobile phone; a laptop computer; a portable tablet computer; a smartphone; a smart TV; and a smart watch.
[00014] A change of state may comprise at least one of: executing a sequence of operations associated with the predetermined input gesture; executing a software application on the mobile computing device; changing an operating mode of the mobile computing device and/or a software application; unlocking the mobile computing device; locking the mobile computing device; switching off the mobile computing device; waking the mobile computing device; modifying content displayed by the touch-sensitive display; and enabling or disabling interaction with at least a portion of content displayed by the touch-sensitive display.
[00015] Embodiments may introduce simplified concepts of an invisible control interface that is usable to alter an operating mode of a client device.
[00016] In one embodiment, a mobile computing device may provide a HUI that is invisible to a user and positioned to overlap at least a portion of a touch-sensitive display of the mobile computing device. A user may perform a predetermined gesture on the touch-sensitive display which is detected by the HUI and then used to alter an operating mode of the mobile computing device or an application of the mobile computing device.
[00017] For example, in response to receiving a gesture in the form of a drawn shape, the mobile computing device may change a current mode of operation of the mobile computing device to a new mode of operation (e.g., from an unlocked mode to a locked mode, or from a powered on mode to a powered off or sleep mode). When switching from the current mode to the new mode of operation, the mobile computing device may disable at least some interaction with content or an object that is displayed by the display of the mobile computing device.
[00018] Upon receipt of a subsequent gesture applied on the disabled object and/or data associated with the disabled object, the client device may apply a predetermined action according to the new operating mode. For example, a gesture that in the browsing mode would have panned or zoomed, in the search mode may be used to identify subject matter to be searched.
[00019] In some embodiments, the client device may activate different modes of operation depending on the nature and/or parameters of the detected gesture. Additionally or alternatively, different gestures may be used to activate different modes of operation or perform different actions/operations.
[00020] The touch-sensitive display may be adapted to receive an input gesture, and the mobile computing device may be adapted to associate the received input gesture with a user-defined action so as to define a gesture that may be detected by the HUI. In other words, embodiments may enable a user to define their own gestures and the operations or actions associated with such gestures.
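A short sketch of such user-defined gesture registration, reusing the hypothetical HiddenUserInterface class from the earlier sketch. The gesture identifier and the lock action are illustrative placeholders, not behaviour specified by the embodiments:

```java
// Illustrative only: the user records a gesture and binds it to an action
// of their choosing, thereby defining a new predetermined gesture.
class GestureSetup {
    static void defineUserGesture(HiddenUserInterface hui) {
        // In a real device this identifier would come from a recogniser run
        // over the stroke the user has just drawn on the display.
        String recordedGestureId = "user-drawn-triangle";

        // Associate the recorded gesture with a user-selected state change.
        hui.register(recordedGestureId,
                () -> System.out.println("locking device"));
    }
}
```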
[00021] According to another aspect of the invention, there is provided a method for controlling a mobile computing device having a touch-sensitive display adapted to display a graphic user interface and to control the mobile computing device based on touches sensed by the touch-sensitive display, the method comprising: providing a hidden supplementary user interface for detecting a predetermined input gesture applied to the touch-sensitive display, wherein, in use, no visual prompt as to the hidden supplementary user interface is displayed by the touch-sensitive display at least prior to detecting a predetermined input gesture applied to the touch-sensitive display; detecting the predetermined gesture applied to the touch-sensitive display; and changing a state of the mobile computing device based on the detected predetermined input gesture.
[00022] The change of state may comprise at least one of executing a sequence of operations associated with the predetermined input gesture; executing a software application on the mobile computing device; changing an operating mode of the mobile computing device and/or a software application; locking the mobile computing device; switching off the mobile computing device; turning on the mobile computing device; modifying content displayed by the touch-sensitive display; and enabling or disabling interaction with at least a portion of content displayed by the touch-sensitive display.
[00023] Embodiments may further comprise the steps of receiving an input gesture applied to the touch-sensitive display; and associating the received input gesture with a user-defined action so as to define the predetermined gesture.
[00024] According to another aspect of the invention, there is provided a computer program product for controlling a mobile computing device having a touch-sensitive display adapted to display a graphic user interface and to control the mobile computing device based on touches sensed by the touch-sensitive display, wherein the computer program product comprises a computer-readable storage medium having computer-readable program code embodied therewith, the computer-readable program code configured to perform all of the steps of a method according to an embodiment.
[00025] According to another aspect of the invention, there is provided a computer system comprising: a touch-sensitive display adapted to display a graphic user interface and to control the computer system based on touches sensed by the touch-sensitive display; a computer program product according to an embodiment; and one or more processors adapted to perform all of the steps of a method according to an embodiment.
[00026] In other illustrative embodiments, a computer program product comprising a computer useable or readable medium having a computer readable program may be provided. The computer readable program, when executed on a mobile computing device, may cause the mobile computing device to perform various ones of, and combinations of, the operations outlined above with regard to the method illustrative embodiment.
[00027] The invention may allow a user to control a mobile computing device so as to execute a desired action or operation, wherein the user only needs to apply a gesture to a touch-sensitive display of the device irrespective of what the device may be displaying or doing. For example, by applying a gesture to the touch-sensitive display in the form of a shape, the gesture may be intercepted by the HUI and cause the device to enter a locked mode. Similarly, by applying a gesture to the touch-sensitive display in the form of a different shape, the gesture may be intercepted by the HUI and cause the device to switch from a first, currently running operating system to a second, different operating system.
Different operations or actions may be associated with different gestures so as to enable numerous operations or actions to be undertaken without the mobile computing device displaying controls or content related to the different gestures.
[00028] These and other features and advantages of the present invention will be described in, or will become apparent to those of ordinary skill in the art in view of, the following detailed description of the example embodiments of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[00029] Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:
[00030] Figure 1 depicts a pictorial representation of an example distributed system in which aspects of the illustrative embodiments may be implemented;
[00031] Figure 2 is a block diagram of an example computing device in which aspects of the illustrative embodiments may be implemented;
[00032] Figure 3 depicts a mobile computing device according to an embodiment of the present subject matter; and
[00033] Figure 4 is a flow chart of an example of an implementation of a method according to an embodiment of the present subject matter.
DETAILED DESCRIPTION
[00034] The illustrative embodiments provide concepts of an invisible or hidden user interface to supplement a GUI of a mobile computing device. The concepts may enable a user to change the state or operating mode of a mobile computing device by performing a gesture on a touch-sensitive display of the device, wherein the gesture is detected and intercepted by the HUI. The HUI may then control the mobile computing device to change the state or operating mode of the mobile computing device based on the detected gesture.
[00035] Embodiments of the HUI herein may be used irrespective of any application of a mobile computing device. By way of example and not limitation, the application may include, but is not limited to, an operating system (e.g., Windows Mobile ®, Android ®, iOS®, etc.) of the client device, a software program (such as a web browser application, a search application, a video player application, a music player application, an email client, a calendar application, a word processing application, a spreadsheet application, a photo viewing and/or editing application, a game, etc.), etc.
[00036] In some embodiments, the user may want to manipulate or interact with an application or data (for example, content displayed in the application and/or metadata such as historical user data in one or more past sessions, etc.) associated with the application using the HUI. In one embodiment, the user may do so by applying a gesture to a predetermined region of the touch-sensitive display of the device. By way of example and not limitation, the predetermined region may include all or part of a touch-sensitive display of the device.
[00037] In one embodiment, the selection gesture may include, for example, using a pointing device, such as a mouse, a stylus or a finger, etc., to press and hold the predetermined region of the touch-sensitive display, tap the predetermined region of the touch-sensitive display a predetermined number of times within a predetermined time period (e.g., two times within one second), swipe up or down, swipe up and down in quick succession along the predetermined region of the touch-sensitive display, or move along the predetermined region of the touch-sensitive display in a clockwise or anticlockwise direction. However, these gestures are merely illustrative, and any other desired gesture may be adapted to be detected by the HUI. For example, in some embodiments, a "call person X" gesture may include a motion of a finger drawing a particular alphanumeric character on the touch-sensitive screen. Moreover, the gestures may include single touch gestures (using a single pointing device) or multi-touch gestures (using multiple pointing devices or points of contact).
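As a concrete instance of one of the selection gestures listed above, a double tap within a fixed time window might be detected as in the following sketch. The one-second window echoes the example above; the class and method names are assumptions:

```java
// Illustrative detector for "tap twice within one second".
class DoubleTapDetector {
    private static final long WINDOW_MS = 1000;  // predetermined time period
    private long firstTapAtMs = -1;              // timestamp of the first tap, or -1

    // Called once per tap; returns true when a double tap completes.
    boolean onTap(long nowMs) {
        if (firstTapAtMs >= 0 && nowMs - firstTapAtMs <= WINDOW_MS) {
            firstTapAtMs = -1;  // reset so a third tap starts a new sequence
            return true;
        }
        firstTapAtMs = nowMs;   // first tap, or the previous tap timed out
        return false;
    }
}
```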
[00038] In response to receiving or detecting a predetermined gesture, the HUI may activate a predetermined action associated with the gesture. The predetermined action may include, but is not limited to, an operation that is not recognized or executable by the application(s) currently running or being used on the mobile computing device. For example, the predetermined action may include disabling interaction with the application or the content of the application, changing a current mode of operation of the application to a new mode of operation, performing one or more operations on the application and/or the content of the application, performing one or more operations on a different application from a currently running application, etc.
[00039] The predetermined action associated with a gesture may be predefined or preprogrammed by a developer of the application, a content provider that serves content of the application, and/or the user of the mobile computing device. Additionally or alternatively, embodiments may provide a user interface for the user to define a gesture and select an action to associate with the user-defined gesture.
[00040] Embodiments described herein may enable a mobile computing device to provide a user interface that does not occupy display space and/or may only be known to exist by a user due to their prior knowledge of its existence or activation on the mobile computing device. Embodiments may therefore provide a concept for enabling an owner of a mobile computing device to change an operating mode of the device using a predetermined gesture, and this ability to do so may not be known by another person (due to the HUI not providing any visual prompt or indication of its existence on the device). Such a concept may provide for improved security.
[00041] Illustrative embodiments may be utilized in many different types of processing environments. In order to provide a context for the description of elements and functionality of the illustrative embodiments, Figures 1 and 2 are provided hereafter as example environments in which aspects of the illustrative embodiments may be implemented. It should be appreciated that Figures 1 and 2 are only examples and are not intended to assert or imply any limitation with regard to the environments in which aspects or embodiments of the present invention may be implemented. Many modifications to the depicted environments may be made without departing from the scope of the present invention.
[00042] Figure 1 depicts a pictorial representation of an example distributed data processing system in which aspects of the illustrative embodiments may be implemented. Distributed data processing system 100 may include a network of computing devices in which aspects of the illustrative embodiments may be implemented. The distributed data processing system 100 contains at least one network 102, which is the medium used to provide communication links between various devices and computers connected together within distributed data processing system 100. The network 102 may include connections, such as wire, wireless communication links, or fiber optic cables.
[00043] In the depicted example, first 104 and second 106 servers are connected to the network 102 along with a storage unit 108. In addition, mobile computing devices 110, 112, and 114 are also connected to the network 102. The mobile computing devices 110, 112, and 114 may be, for example, laptop personal computers, tablet computers, smartphones, or the like. In the depicted example, the first server 104 provides data and applications to the mobile computing devices 110, 112, and 114. Mobile computing devices 110, 112, and 114 are clients to the first server 104 in the depicted example. The distributed data processing system 100 may include additional servers, clients, and other devices not shown.
[00044] Mobile computing devices 110, 112, and 114 are provided with a HUI 116 usable to implement an invisible control concept. Each HUI 116 is shown as a dashed oval shape in the center of the display screen of a mobile computing device for illustration purposes only.
In practice, the HUI 116 would not be visible to a user and may be disposed over the entire display, a portion of the display, or at another location on the display screen.
[00045] In the depicted example, the distributed data processing system 100 is the Internet with the network 102 representing a worldwide collection of networks and gateways that use the Transmission Control Protocol/Internet Protocol (TCP/IP) suite of protocols to communicate with one another. At the heart of the Internet is a backbone of high-speed data communication lines between major nodes or host computers, consisting of thousands of commercial, governmental, educational and other computer systems that route data and messages. Of course, the distributed data processing system 100 may also be implemented to include a number of different types of networks, such as, for example, an intranet, a local area network (LAN), a wide area network (WAN), or the like. As stated above, Figure 1 is intended as an example, not as an architectural limitation for different embodiments of the present invention, and therefore, the particular elements shown in Figure 1 should not be considered limiting with regard to the environments in which the illustrative embodiments of the present invention may be implemented.
[00046] Figure 2 is a block diagram of an example mobile computing device 200 in which aspects of the illustrative embodiments may be implemented. Here, mobile computing device 200 is a laptop computer and thus may be considered to be an example of a mobile computing device in Figure 1, in which computer usable code or instructions implementing the processes for illustrative embodiments of the present invention may be located.
[00047] In the depicted example, the data processing system 200 employs a hub architecture including a north bridge and memory controller hub (NB/MCH) 202 and a south bridge and input/output (I/O) controller hub (SB/ICH) 204. A processing unit 206, a main memory 208, and a graphics processor 210 are connected to NB/MCH 202. The graphics processor 210 may be connected to the NB/MCH 202 through an accelerated graphics port (AGP).
[00048] In the depicted example, a local area network (LAN) adapter 212 connects to SB/ICH 204. An audio adapter 216, a touch-sensitive display 220, a modem 222, a read only memory (ROM) 224, a hard disk drive (HDD) 226, a CD-ROM drive 230, universal serial bus (USB) ports and other communication ports 232, and PCI/PCIe devices 234 connect to the SB/ICH 204 through first bus 238 and second bus 240. PCI/PCIe devices may include, for example, Ethernet adapters, add-in cards, and PC cards for notebook computers.
PCI uses a card bus controller, while PCIe does not. ROM 224 may be, for example, a flash basic input/output system (BIOS).
[00049] The HDD 226 and CD-ROM drive 230 connect to the SB/ICH 204 through second bus 240. The HDD 226 and CD-ROM drive 230 may use, for example, an integrated drive electronics (IDE) or serial advanced technology attachment (SATA) interface. Super I/O (SIO) device 236 may be connected to SB/ICH 204.
[00050] An operating system runs on the processing unit 206. The operating system coordinates and provides control of various components within the mobile computing device in Figure 2. The operating system may be a commercially available operating system.
An object-oriented programming system, such as the JavaTM programming system, may run in conjunction with the operating system and provides calls to the operating system from JavaTM programs or applications executing on mobile computing device 200.
[00051] The mobile computing device 200 may be a symmetric multiprocessor (SMP) system including a plurality of processors in processing unit 206. Alternatively, a single processor system may be employed.
[00052] Instructions for the operating system, the object-oriented programming system, and applications or programs are located on storage devices, such as HDD 226, and may be loaded into main memory 208 for execution by processing unit 206. The processes for illustrative embodiments of the present invention may be performed by processing unit 206 using computer usable program code, which may be located in a memory such as, for example, main memory 208, ROM 224, or in one or more peripheral devices 226 and 230,
for example.
[00053] A bus system, such as first bus 238 or second bus 240 as shown in Figure 2, may be comprised of one or more buses. Of course, the bus system may be implemented using any type of communication fabric or architecture that provides for a transfer of data between different components or devices attached to the fabric or architecture. A communication unit, such as the modem 222 or the network adapter 212 of Figure 2, may include one or more devices used to transmit and receive data. A memory may be, for example, main memory 208, ROM 224, or a cache such as found in NB/MCH 202 in Figure 2.
[00054] Those of ordinary skill in the art will appreciate that the hardware in Figures 1 and 2 may vary depending on the implementation. Other internal hardware or peripheral devices, such as flash memory, equivalent non-volatile memory, or optical disk drives and the like, may be used in addition to or in place of the hardware depicted in Figures 1 and 2.
Also, the processes of the illustrative embodiments may be applied to a multiprocessor data processing system, other than the SMP system mentioned previously, without departing from the scope of the present invention.
[00055] Moreover, the mobile computing device 200 may take the form of any of a number of different mobile computing devices, including a tablet computer, a mobile phone, a smart watch, a personal digital assistant (PDA), or the like. In some illustrative examples, the mobile computing device 200 may be configured with flash memory to provide non-volatile memory for storing operating system files and/or user-generated data, for example. Thus, the mobile computing device 200 may essentially be any known or later-developed data processing system without architectural limitation.
[00056] The proposed invention enhances a mobile computing device (such as that depicted in Figure 2) by providing for a HUI which is supplementary to a GUI of the mobile computing device. Embodiments may intercept steps undertaken in a mobile computing device which may result in changing a state or mode of operation of the mobile computing device that would not otherwise happen under normal control of the operating system via the GUI.
[00057] In one embodiment, the HUI 116 may be integrated with the mobile computing device 200. By way of example and not limitation, some or all of the HUI may be included in the mobile computing device 200, for example, as software and/or hardware installed in the mobile computing device 200. In other embodiments, the mobile computing device 200 and the invisible control system 116 may be separate software and/or hardware systems.
[00058] A description of a preferred implementation of an embodiment now follows. A simplified diagram of such an embodiment is shown in Figure 3.
[00059] The mobile computing device 300 comprises a touch-sensitive display 302 adapted to display content of a GUI 304. Based on touches sensed by the touch-sensitive display 302, the mobile computing device 300 may be controlled.
[00060] A HUI 306 is provided in a manner which is invisible to a user of the device 300 (at least prior to detecting a predetermined input gesture applied to the touch-sensitive display 302). For the purpose of illustration, however, the HUI 306 is depicted in Figure 3 using a dashed (transparent) rectangular shape that overlays the touch-sensitive display. It should be understood that such depiction of the HUI 306 is purely for illustrative purposes to aid understanding of how the HUI 306 may be visualized in a conceptual sense.
[00061] The mobile computing device 300 also comprises a microphone 308 and a processing unit 310. The processing unit 310 is adapted to execute instructions that control and/or operate the mobile computing device 300.
[00062] The HUI 306 is adapted to detect a predetermined input gesture applied to the touch-sensitive display 302 and to change a state of the mobile computing device 300 based on the detected predetermined input gesture. By way of example, the predetermined input gesture may comprise at least one of: a drawn shape; a drag; a pinch; a swipe; and a stroke.
[00063] No visual prompt as to the existence of the HUI is displayed by the touch-sensitive display, at least prior to detecting the predetermined input gesture. Thus, the HUI 306 may be considered to implement a supplementary interface layer above the GUI 304 which is adapted to detect one or more predetermined input gestures applied to the touch-sensitive display 302. When such a predetermined input gesture is detected by the HUI 306, the HUI 306 intercepts the gesture (e.g. prevents the gesture from being interpreted in relation to, or in combination with, the GUI 304) and determines an action to be performed based on the detected/intercepted gesture. The HUI 306 then performs the determined action (e.g. by providing appropriate instructions to the processing unit 310) so as to change a state of the mobile computing device 300.
[00064] Here, by way of example, a change of state may comprise at least one of: executing a sequence of operations associated with the predetermined input gesture; executing a software application on the mobile computing device 300; changing an operating mode of the mobile computing device 300 and/or a software application; locking the mobile computing device 300; switching off the mobile computing device 300; turning on the mobile computing device 300; modifying content displayed by the touch-sensitive display 302; and enabling or disabling interaction with at least a portion of content 304 displayed by the touch-sensitive display 302.
[00065] Generally, a user may use the mobile computing device 300 or an application of the mobile computing device 300 to consume content. The content may include text, images, video, and/or audio. The user may apply a gesture on a predetermined region of the touch-sensitive display 302 to implement invisible control. In the illustrated embodiment of Figure 3, the predetermined region extends over the entire area of the touch-sensitive display 302.
However, in other embodiments, the predetermined region may include, but is not limited to, a portion of the touch-sensitive display 302, or all or part of a border or an edge of the touch-sensitive display 302.
[00066] Prior to detecting or receiving a gesture, the HUI 306 may not provide any indication or visual prompt to the user that the HUI 306 is present. However, it will be appreciated that the user may be required to have knowledge about the presence of the HUI 306 and so the user may be educated as to the presence of the HUI 306, for example when the device is used for the first time, by periodically providing hints or suggestions, by briefly showing a visual representation of the HUI 306 (e.g., at startup of the device 300 and/or periodically thereafter), and/or by description in documentation associated with the device, etc. Such temporary or prior indications will be understood as not being made 'in use' and so should not be considered to be visual prompts in the general sense. Instead, such temporary or prior indications may only be displayed for a very small (almost zero) fraction of time relative to the total operating time of the device 300.
[00067] Additionally or alternatively, the mobile computing device 300 may provide an indication to the user in response to the HUI 306 detecting a predetermined gesture. That is, once a user applies a predetermined gesture to the touch-sensitive display 302, the mobile computing device 300 may illuminate a light or otherwise indicate to a user that the HUI 306 has detected, or is in the process of detecting, the predetermined gesture. The mobile computing device 300 may keep the indication hidden or invisible to the user if no predetermined selection gesture is detected and/or after the selection gesture is removed from the HUI 306, for example.
[00068] In one embodiment, in response to detecting the selection gesture on the predetermined region of the HUI 306, the touch-sensitive display 302 may provide information about the HUI 306 to the user. For example, in response to detecting or receiving a predetermined gesture, the touch-sensitive display may provide an acknowledgement to the user that the user has activated an invisible control. The acknowledgement may include, for example, displaying a visible indicator (such as a visible line, border, etc.), changing a color of a displayed graphic (such as an icon, a button, etc.) associated with the object, illuminating a graphic associated with the object, changing a color of a frame associated with the object, and/or playing a predetermined audio signal, etc.
[00069] In some embodiments, different locations or areas of the HUI 306 may be associated with different predetermined actions. In other embodiments, some locations or sides of the HUI 306 may be associated with a same predetermined action. In other embodiments, some locations or sides of the HUI 306 may be associated with a same predetermined action but with different magnitudes (such as fast forwarding, slow forwarding, or normal playing of a video, for example).
[00070] FIG. 4 is a flow chart of an example of an implementation of a method according to an embodiment of the present subject matter.
[00071] FIG. 4 shows the steps of a method 400 carried out by a mobile computing device, such as the mobile computing device in FIG. 2 or 3, according to one example of the present subject matter.
[00072] The method begins in step 402 and proceeds to step 404 wherein a HUI is provided (for example, by execution and/or arrangement of appropriate software and/or hardware). The HUI is for detecting a predetermined input gesture applied to the touch-sensitive display of the mobile computing device. For example, such a predetermined input gesture may comprise at least one of: a drawn shape; a drag; a pinch; a swipe; and a stroke.
[00073] In use (for example, when the mobile computing device is being used by a user), no visual prompt of the HUI is displayed by the touch-sensitive display of the mobile computing device prior to detecting the predetermined input gesture applied to the touch-sensitive display. Thus, in essence, the HUI is arranged to be invisible to a user of the device (at least until detecting the predetermined input gesture applied to the touch-sensitive display) such that it relies on the user's prior knowledge of its existence or activation.
[00074] Next, in step 406, the HUI detects if and when the predetermined gesture has been applied to the touch-sensitive display. In response to detecting the predetermined gesture, the method proceeds to step 408 wherein a state of the mobile computing device is changed based on the detected predetermined input gesture. By way of example, a change of state may comprise at least one of: executing a sequence of operations associated with the predetermined input gesture; executing a software application on the mobile computing device; changing an operating mode of the mobile computing device and/or a software application; locking the mobile computing device; switching off the mobile computing device; turning on the mobile computing device; modifying content displayed by the touch-sensitive display; and enabling or disabling interaction with at least a portion of content displayed by the touch-sensitive display.
[00075] The method then proceeds to step 410, wherein it is determined whether or not provision of the HUI is to be continued. If, in step 410, it is determined that provision of the HUI is to be stopped, the method proceeds to step 412 wherein it ends and the HUI is stopped.
If, in step 410, it is determined that the HUI is to be maintained, the method returns to wait for the detection of the predetermined gesture in step 406.
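The control flow of steps 402 to 412 might be summarised, using the hypothetical HiddenUserInterface from the earlier sketches, by the loop below. The waitForGesture and shouldContinueProvidingHui helpers are placeholders for whatever a particular device implements at those steps, not a definitive implementation:

```java
// Illustrative sketch of the control flow of method 400 of Figure 4.
class Method400 {
    static void run(HiddenUserInterface hui) {            // step 404: HUI provided
        boolean providing = true;
        while (providing) {
            String gesture = waitForGesture();            // step 406: detect gesture
            hui.onGesture(gesture, () -> {});             // step 408: change state if predetermined
            providing = shouldContinueProvidingHui();     // step 410: continue providing HUI?
        }
        // step 412: provision of the HUI stops and the method ends
    }

    static String waitForGesture() { return "swipe"; }             // placeholder
    static boolean shouldContinueProvidingHui() { return false; }  // placeholder
}
```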
[00076] As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in any one or more computer readable medium(s) having computer usable program code embodied thereon.
[00077] Any combination of one or more computer readable medium(s) may be utilized.
The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CDROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
[00078] A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in a baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
[00079] Computer code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, radio frequency (RF), etc., or any suitable combination thereof.
[00080] Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as JavaTM, SmalltalkTM, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's mobile computing device, partly on the user's mobile computing device, as a stand-alone software package, partly on the user's mobile computing device and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
[00081] Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the illustrative embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[00082] These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions that implement the function/act specified in the flowchart and/or block diagram block or blocks.
[00083] The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus, or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[00084] It will be clear to one of ordinary skill in the art that all or part of the method of one embodiment of the present invention may suitably and usefully be embodied in a logic apparatus, or a plurality of logic apparatus, comprising logic elements arranged to perform the steps of the method and that such logic elements may comprise hardware components, firmware components or a combination thereof.
[00085] It will be equally clear to one of ordinary skill in the art that all or part of a logic arrangement according to one embodiment of the present invention may suitably be embodied in a logic apparatus comprising logic elements to perform the steps of the method, and that such logic elements may comprise components such as logic gates in, for example, a programmable logic array or application-specific integrated circuit. Such a logic arrangement may further be embodied in enabling elements for temporarily or permanently establishing logic structures in such an array or circuit using, for example, a virtual hardware descriptor language, which may be stored and transmitted using fixed or transmittable carrier media.
[00086] It will be appreciated that the method and arrangement described above may also suitably be carried out fully or partially in software running on one or more processors (not shown in the figures), and that the software may be provided in the form of one or more computer program elements carried on any suitable data-carrier (also not shown in the figures) such as a magnetic or optical disk or the like. Channels for the transmission of data may likewise comprise storage media of all descriptions as well as signal-carrying media, such as wired or wireless signal-carrying media.
[00087] A method is generally conceived to be a self-consistent sequence of steps leading to a desired result. These steps require physical manipulations of physical quantities.
Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It is convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, parameters, items, elements, objects, symbols, characters, terms, numbers, or the like. It should be noted, however, that all of these terms and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities.
[00088] The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
[00089] A data processing system suitable for storing and/or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
[00090] The description of the present invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. The embodiment was chosen and described in order to best explain the principles of the invention, the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.

Claims (12)

  1. A mobile computing device comprising: a touch-sensitive display adapted to display a graphic user interface and to control the mobile computing device based on touches sensed by the touch-sensitive display; and a hidden supplementary user interface adapted to detect a predetermined input gesture applied to the touch-sensitive display and to change a state of the mobile computing device based on the detected predetermined input gesture, wherein, in use, no visual prompt as to the hidden supplementary user interface is displayed by the touch-sensitive display at least prior to detecting the predetermined input gesture.
  2. The mobile computing device of claim 1, wherein the change of state comprises at least one of: executing a sequence of operations associated with the predetermined input gesture; executing a software application on the mobile computing device; changing an operating mode of the mobile computing device and/or a software application; locking the mobile computing device; switching off the mobile computing device; turning on the mobile computing device; modifying content displayed by the touch-sensitive display; and enabling or disabling interaction with at least a portion of content displayed by the touch-sensitive display.
  3. The mobile computing device of any preceding claim, wherein the predetermined input gesture comprises at least one of: a drawn shape; a drag; a pinch; a swipe; and a stroke.
  4. The mobile computing device of any preceding claim, wherein the touch-sensitive display is adapted to receive an input gesture and wherein the mobile computing device is adapted to associate the received input gesture with a user-defined action so as to define the predetermined gesture.
  5. The mobile computing device of any preceding claim, wherein the mobile computing device comprises: a mobile phone; a laptop computer; a tablet computer; a personal digital assistant; or a smartphone.
  6. A method for controlling a mobile computing device having a touch-sensitive display adapted to display a graphic user interface and to control the mobile computing device based on touches sensed by the touch-sensitive display, the method comprising: providing a hidden supplementary user interface for detecting a predetermined input gesture applied to the touch-sensitive display, wherein, in use, no visual prompt as to the hidden supplementary user interface is displayed by the touch-sensitive display at least prior to detecting a predetermined input gesture applied to the touch-sensitive display; detecting the predetermined gesture applied to the touch-sensitive display; and changing a state of the mobile computing device based on the detected predetermined input gesture.
  7. The method of claim 6, wherein the change of state comprises at least one of: executing a sequence of operations associated with the predetermined input gesture; executing a software application on the mobile computing device; changing an operating mode of the mobile computing device and/or a software application; locking the mobile computing device; switching off the mobile computing device; turning on the mobile computing device; modifying content displayed by the touch-sensitive display; and enabling or disabling interaction with at least a portion of content displayed by the touch-sensitive display.
  8. The method of claim 6 or 7, wherein the predetermined input gesture comprises at least one of: a drawn shape; a drag; a pinch; a swipe; and a stroke.
  9. The method of any of claims 6 to 8, further comprising the steps of: receiving an input gesture applied to the touch-sensitive display; and associating the received input gesture with a user-defined action so as to define the predetermined gesture.
  10. A computer program product for controlling a mobile computing device having a touch-sensitive display adapted to display a graphic user interface and to control the mobile computing device based on touches sensed by the touch-sensitive display, wherein the computer program product comprises a computer-readable storage medium having computer-readable program code embodied therewith, the computer-readable program code configured to perform all of the steps of any of claims 6 to 9.

  11. A computer system comprising: a computer program product according to claim 10; and one or more processors adapted to perform all of the steps of any of claims 6 to 9.

  12. A mobile computing device substantially as herein described above with reference to the accompanying figures.

AMENDMENTS TO THE CLAIMS HAVE BEEN FILED AS FOLLOWS

CLAIMS

  1. A mobile computing device comprising: a touch-sensitive display adapted to display a graphic user interface and to control the mobile computing device based on touches sensed by the touch-sensitive display; and a hidden supplementary user interface adapted to: detect an input gesture applied to the touch-sensitive display; determine whether the input gesture is one of a set of predetermined input gestures; and either: change a state of the mobile computing device based on the detected input gesture when the said input gesture is one of the set of predetermined input gestures, or pass the input gesture to the graphic user interface when the said input gesture is not one of the set of predetermined input gestures, wherein, in use, no visual prompt as to the hidden supplementary user interface is displayed by the touch-sensitive display at least prior to detecting the input gesture.

  2. The mobile computing device of claim 1, wherein the change of state comprises at least one of: executing a sequence of operations associated with the predetermined input gesture; executing a software application on the mobile computing device; changing an operating mode of the mobile computing device and/or a software application; locking the mobile computing device; switching off the mobile computing device; turning on the mobile computing device; modifying content displayed by the touch-sensitive display; and enabling or disabling interaction with at least a portion of content displayed by the touch-sensitive display.

  3. The mobile computing device of any preceding claim, wherein the set of predetermined input gestures comprises at least one of: a drawn shape; a drag; a pinch; a swipe; and a stroke.

  4. The mobile computing device of any preceding claim, wherein the touch-sensitive display is adapted to receive an input gesture and wherein the mobile computing device is adapted to associate the received input gesture with a user-defined action so as to define the predetermined gesture.

  5. The mobile computing device of any preceding claim, wherein the mobile computing device comprises: a mobile phone; a laptop computer; a tablet computer; a personal digital assistant; or a smartphone.
  6. A method for controlling a mobile computing device having a touch-sensitive display adapted to display a graphic user interface and to control the mobile computing device based on touches sensed by the touch-sensitive display, the method comprising: providing a hidden supplementary user interface for detecting an input gesture applied to the touch-sensitive display, wherein, in use, no visual prompt as to the hidden supplementary user interface is displayed by the touch-sensitive display at least prior to detecting an input gesture applied to the touch-sensitive display; detecting the input gesture applied to the touch-sensitive display; determining whether the input gesture is one of a set of predetermined input gestures; changing a state of the mobile computing device when the said input gesture is determined to be one of the set of predetermined input gestures; and passing the input gesture to the graphic user interface when the said input gesture is determined to not be one of the set of predetermined input gestures.

  7. The method of claim 6, wherein the change of state comprises at least one of: executing a sequence of operations associated with the predetermined input gesture; executing a software application on the mobile computing device; changing an operating mode of the mobile computing device and/or a software application; locking the mobile computing device; switching off the mobile computing device; turning on the mobile computing device; modifying content displayed by the touch-sensitive display; and enabling or disabling interaction with at least a portion of content displayed by the touch-sensitive display.

  8. The method of claim 6 or 7, wherein the predetermined input gesture comprises at least one of: a drawn shape; a drag; a pinch; a swipe; and a stroke.

  9. The method of any of claims 6 to 8, further comprising the steps of: receiving an input gesture applied to the touch-sensitive display; and associating the received input gesture with a user-defined action so as to define the predetermined gesture.

  10. A computer program product for controlling a mobile computing device having a touch-sensitive display adapted to display a graphic user interface and to control the mobile computing device based on touches sensed by the touch-sensitive display, wherein the computer program product comprises a computer-readable storage medium having computer-readable program code embodied therewith, the computer-readable program code configured to perform all of the steps of any of claims 6 to 9.
  11. A computer system comprising: a computer program product according to claim 10; and one or more processors adapted to perform all of the steps of any of claims 6 to 9.
  12. A mobile computing device substantially as herein described above with reference to the accompanying figures.
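Although the application contains no source code, the dispatch recited in amended claims 1 and 6 (classify an input gesture against a set of predetermined gestures, change device state on a match, otherwise pass the gesture through to the ordinary graphic user interface) can be illustrated with a short sketch. The Kotlin below is a minimal, hypothetical illustration only: every name (HiddenGestureInterceptor, TouchPoint, DeviceAction, gestureTable) is invented for this example, and the toy direction-based classifier stands in for whatever gesture recognizer an actual implementation would use.

```kotlin
import kotlin.math.abs

// All names in this sketch are hypothetical; nothing here is from the application.
data class TouchPoint(val x: Float, val y: Float)

// A state change the hidden interface may trigger (cf. amended claim 2).
enum class DeviceAction { LOCK_DEVICE, LAUNCH_APP, TOGGLE_MODE }

// Classify a completed stroke against a set of predetermined gestures; on a
// match, change device state; otherwise pass the stroke to the ordinary GUI.
class HiddenGestureInterceptor(
    private val gestureTable: Map<String, DeviceAction>,
    private val changeState: (DeviceAction) -> Unit,
    private val passToGui: (List<TouchPoint>) -> Unit,
) {
    fun onStrokeComplete(stroke: List<TouchPoint>) {
        val action = gestureTable[classify(stroke)]
        if (action != null) changeState(action) else passToGui(stroke)
    }

    // Toy classifier: label a stroke by its dominant direction. A real
    // implementation would use a proper recognizer instead.
    private fun classify(stroke: List<TouchPoint>): String {
        if (stroke.size < 2) return "tap"
        val dx = stroke.last().x - stroke.first().x
        val dy = stroke.last().y - stroke.first().y
        return when {
            abs(dx) >= abs(dy) && dx > 0 -> "swipe-right"
            abs(dx) >= abs(dy)           -> "swipe-left"
            dy > 0                       -> "swipe-down"
            else                         -> "swipe-up"
        }
    }
}

fun main() {
    val interceptor = HiddenGestureInterceptor(
        gestureTable = mapOf("swipe-left" to DeviceAction.LOCK_DEVICE),
        changeState = { println("hidden UI: state change -> $it") },
        passToGui = { println("not a hidden gesture: ${it.size} points passed to GUI") },
    )
    // A left swipe matches the hidden table and locks the device...
    interceptor.onStrokeComplete(listOf(TouchPoint(300f, 100f), TouchPoint(50f, 110f)))
    // ...while a downward swipe falls through to the normal interface.
    interceptor.onStrokeComplete(listOf(TouchPoint(100f, 50f), TouchPoint(105f, 400f)))
}
```

Associating a received input gesture with a user-defined action, as in claims 4 and 9, would correspond in this sketch to inserting a new entry into gestureTable at run time.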
GB1405942.2A 2014-04-02 2014-04-02 Hidden user interface for a mobile computing device Withdrawn GB2524781A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1405942.2A GB2524781A (en) 2014-04-02 2014-04-02 Hidden user interface for a mobile computing device

Publications (2)

Publication Number Publication Date
GB201405942D0 GB201405942D0 (en) 2014-05-14
GB2524781A true GB2524781A (en) 2015-10-07

Family

ID=50737888

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1405942.2A Withdrawn GB2524781A (en) 2014-04-02 2014-04-02 Hidden user interface for a mobile computing device

Country Status (1)

Country Link
GB (1) GB2524781A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160004393A1 (en) * 2014-07-01 2016-01-07 Google Inc. Wearable device user interface control

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2486707A (en) * 2010-12-21 2012-06-27 Sharp Kk A touch screen handset with a combined application launcher and touch screen unlock mechanism.
US20120182226A1 (en) * 2011-01-18 2012-07-19 Nokia Corporation Method and apparatus for providing a multi-stage device transition mechanism initiated based on a touch gesture
US20120304111A1 (en) * 2011-03-11 2012-11-29 Google Inc. Automatically hiding controls
US20140033136A1 (en) * 2012-07-25 2014-01-30 Luke St. Clair Custom Gestures

Also Published As

Publication number Publication date
GB201405942D0 (en) 2014-05-14

Similar Documents

Publication Publication Date Title
US8413075B2 (en) Gesture movies
KR102027612B1 (en) Thumbnail-image selection of applications
US9658766B2 (en) Edge gesture
EP2715491B1 (en) Edge gesture
US9152529B2 (en) Systems and methods for dynamically altering a user interface based on user interface actions
US9013366B2 (en) Display environment for a plurality of display devices
US10528252B2 (en) Key combinations toolbar
US20150378600A1 (en) Context menu utilizing a context indicator and floating menu bar
US20120304131A1 (en) Edge gesture
US10182141B2 (en) Apparatus and method for providing transitions between screens
US20140143688A1 (en) Enhanced navigation for touch-surface device
EP3161598A1 (en) Light dismiss manager
US10019148B2 (en) Method and apparatus for controlling virtual screen
US20160124618A1 (en) Managing content displayed on a touch screen enabled device
RU2600544C2 (en) Navigation user interface in support of page-focused, touch- or gesture-based browsing experience
JP2014106625A (en) Portable terminal, control method of portable terminal, program and recording medium
US9823890B1 (en) Modifiable bezel for media device
US20150205513A1 (en) Using a scroll bar in a multiple panel user interface
US9588661B1 (en) Graphical user interface widget to select multiple items from a fixed domain
GB2524781A (en) Hidden user interface for a mobile computing device
EP3210101B1 (en) Hit-test to determine enablement of direct manipulations in response to user actions
US9916064B2 (en) System and method for toggle interface
CN107180039A (en) A kind of text information recognition methods and device based on picture
US20190129576A1 (en) Processing of corresponding menu items in response to receiving selection of an item from the respective menu
US20180329610A1 (en) Object Selection Mode

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)