US20160378279A1 - System and methods for device control - Google Patents
- Publication number
- US20160378279A1 (application US15/169,642)
- Authority
- US
- United States
- Prior art keywords
- display
- command
- touch command
- notification window
- control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/31—Indexing; Data structures therefor; Storage structures
- G06F16/316—Indexing structures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/35—Clustering; Classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B1/00—Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
- H04B1/38—Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
- H04B1/3827—Portable transceivers
- H04B1/3833—Hand-held transceivers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/14—Session management
- H04L67/142—Managing session states for stateless protocols; Signalling session states; State transitions; Keeping-state mechanisms
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72469—User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Definitions
- the present disclosure relates to electronic devices, and more particularly to device control.
- Mobile devices and personal communication devices are generally used for multiple purposes. These devices are often configured with particular forms of control, such as the inclusion of hard and soft buttons. With development of applications and device capabilities, there exists a need for device configurations that improve performance and resolve drawbacks of the conventional configurations. One area where improvements are needed is for device control configurations.
- devices may require selection of a particular element.
- the display element is difficult to select based on the size of a device, the display location of the element, a requirement for scrolling to the element, or even a requirement to select a button of the device.
- elements may be hard to select due to configuration of content not intended for display on a device, such as non-mobile network sites designed for computer viewing.
- Selection prompts are common. However, conventional methods of device control relative to the selection prompt are limited. With some configurations, it may be difficult to close or dismiss the selection prompt. There is a desire to provide additional control functionality for devices, and for controllability with respect to presented selection prompts.
- One embodiment is directed to a method for device control that includes displaying, by a device, a user interface including one or more graphical elements on a display of the device, presenting, by the device, a notification window in the user interface as an overlay to the one or more graphical elements, wherein the notification window is displayed to include one or more options to control operation of the device, and detecting, by the device, a touch command relative to the display, wherein the touch command is a directional command, and wherein a direction of the touch command is correlated by the device to one of a plurality of options to control device operation in response to the notification window.
- the method also includes controlling, by the device, operation of the device based on the direction of the touch command.
- the user interface includes presentation of at least one of an application and device control screen.
- the notification window includes a confirm selection and a cancel selection.
- the touch command is associated with position of display for the notification window.
- the touch command is non-overlapping to the position of display for the notification window.
- the touch command is a first of two opposing directions for device control.
- controlling operation of the device includes closing the display window from presentation when a swipe command is detected in a first direction.
- controlling operation of the device includes selecting an option to proceed with an operation indicated in the display window for a swipe command detected in a direction assigned to confirm.
- the method also includes presenting a follow-up notification window based on the direction of the touch command.
- the method includes updating the display to transition the notification window out from the user interface and presenting a graphical representation of the selected command during the updating.
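The claimed method flow can be sketched in Python; the `NotificationWindow` class, the direction-to-option table, and all identifiers below are illustrative assumptions for exposition, not part of the disclosure.

```python
# Hypothetical sketch of the claimed control flow. The NotificationWindow
# class and the direction-to-option mapping are illustrative assumptions.

DIRECTION_TO_OPTION = {
    "left": "cancel",    # e.g., dismiss without proceeding
    "right": "confirm",  # e.g., proceed with the indicated operation
}

class NotificationWindow:
    def __init__(self, options=("confirm", "cancel")):
        self.options = options
        self.visible = True   # presented as an overlay to the user interface

def handle_touch_command(window, direction):
    """Correlate a directional touch command to one of the window's options."""
    option = DIRECTION_TO_OPTION.get(direction)
    if option is None or option not in window.options:
        return None           # direction not pre-assigned: no action taken
    window.visible = False    # transition the window out of the user interface
    return option

window = NotificationWindow()
selected = handle_touch_command(window, "left")
```

A swipe in an unassigned direction leaves the window in place, matching the claim's requirement that only pre-assigned directions are correlated to options.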
- Another embodiment is directed to a device including a display and a controller coupled to the display.
- the controller is configured to control display of a user interface including one or more graphical elements on the display, present a notification window in the user interface as an overlay to the one or more graphical elements, wherein the notification window is displayed to include one or more options to control operation of the device, and detect a touch command relative to the display, wherein the touch command is a directional command, and wherein a direction of the touch command is correlated by the device to one of a plurality of options to control device operation in response to the notification window.
- the controller is also configured to control operation of the device based on the direction of the touch command.
- FIG. 1A depicts a graphical representation of device control according to one or more embodiments
- FIG. 1B depicts a graphical representation of device control according to one or more other embodiments
- FIG. 2 depicts a process for device control according to one or more embodiments
- FIG. 3 depicts a simplified diagram of a device according to one or more embodiments
- FIGS. 4A-4C depict graphical representations of device control according to one or more embodiments
- FIG. 5 depicts a graphical representation of device control according to one or more embodiments.
- FIG. 6 depicts a graphical representation of device operation according to one or more embodiments.
- One aspect of the disclosure is directed to improving control of a device.
- a process is provided for presenting a user interface and providing a command configuration for interaction with displayed elements based on directional touch commands.
- a control mechanism is provided for interaction across an operating system platform and applications of the device. In that fashion, a global control input is provided for a device to allow for a command to be executed across applications and the device operating system.
- touch commands or swipe dialogs are provided for interaction with displayed notifications, such as notification windows, messages, alerts, etc. Control features discussed herein allow for commands directed to notifications to be “swipeable” to allow for single hand use.
- swipe dialogs relate to touch commands with one or more pre-assigned directions.
- the touch command directions are pre-assigned to options of a notification message.
- touch commands are configured to provide swipe dialogs for Yes/No instances.
- touch commands are configured to provide interaction with a series of notification messages and/or multiple options.
- Swipe dialog features can also provide a swipe system that does not require selection of the desired action displayed and can provide an additional and/or alternative configuration for closing notifications.
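As a sketch of the Yes/No instances and series-of-messages behavior described above, each queued notification message could be answered by the next directional swipe; the queue shape and the direction assignments here are assumptions for illustration.

```python
# Illustrative sketch of swipe dialogs for Yes/No instances over a series of
# notification messages. The pre-assigned directions are an assumption.

def run_swipe_dialogs(messages, swipes):
    """Answer each notification message with the next directional swipe."""
    mapping = {"right": "yes", "left": "no"}  # pre-assigned directions
    return [(message, mapping[direction])
            for message, direction in zip(messages, swipes)]

answers = run_swipe_dialogs(["Delete file?", "Empty trash?"],
                            ["right", "left"])
```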
- the terms “a” or “an” shall mean one or more than one.
- the term “plurality” shall mean two or more than two.
- the term “another” is defined as a second or more.
- the terms “including” and/or “having” are open ended (e.g., comprising).
- the term “or” as used herein is to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” means “any of the following: A; B; C; A and B; A and C; B and C; A, B and C”. An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.
- FIGS. 1A-1B depict graphical representations of device control according to one or more embodiments.
- a device is configured to detect a touch command and control operation of the device associated with a displayed notification window.
- the device can provide functionality that allows for handling of the notification in one or more ways based on interactions with a display of the device.
- touch commands may be associated with the displayed notification.
- touch commands are relative to a display area different from the display position of the notification.
- FIG. 1A depicts a graphical representation of device control for device 100 .
- Device 100 includes display 105 , which is configured for touch operation.
- device 100 presents notification 110 .
- Notification 110 may be a pop up window, notification window, or display message presented by an operating system and/or application of device 100 .
- notification 110 can include one or more options or selections to direct operation.
- notification 110 can include selectable elements to approve an operation and/or canceling an operation.
- FIG. 1A also depicts a touch command 115 input by user 120 .
- Touch command 115 is shown for illustration; the position of the directional arrow for touch command 115 represents an exemplary direction of a command, which can be located at one or more positions on display 105 .
- device 100 may be configured to detect touch command 115 when the command is overlapping the display area of notification 110 . In other embodiments, device 100 may be configured to detect touch command 115 when the command is non-overlapping the display area of notification 110 . Based on detection of touch command 115 and the direction of touch command 115 , device 100 controls operation. As such, device 100 provides swipe dialogs for device control.
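The overlapping/non-overlapping distinction reduces to a point-in-rectangle test; the coordinate and rectangle representation below is an assumption, not the disclosed implementation.

```python
# Sketch of testing whether a touch command overlaps the notification's
# display area. The (x, y) / rectangle representation is an assumption.

def swipe_overlaps_window(swipe_start, window_rect):
    """Return True when the swipe begins inside the notification rectangle.

    swipe_start: (x, y) of the initial touch point.
    window_rect: (left, top, right, bottom) of the notification window.
    """
    x, y = swipe_start
    left, top, right, bottom = window_rect
    return left <= x <= right and top <= y <= bottom
```

A device configured for non-overlapping commands would simply skip this check and accept the swipe anywhere on the display.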
- FIG. 1B depicts a graphical representation of device 100 following the detection of touch command 115 according to one or more other embodiments.
- device 100 updates display 105 based on correlation of the touch command with options available in notification 110 .
- a left swipe such as touch command 115 can be correlated to canceling (e.g., not proceeding) the function presented by notification 110 .
- updating the presentation of display 105 following touch command 115 may include presenting a graphical representation 125 of the correlated command.
- display 105 may be controlled to transition the presentation of notification 110 off of the screen, such as a slide off screen in the direction of touch command 115 and/or a fade out from display.
- swipe dialogs 130 relate to pre-assigned directions such that a touch command associated with a pre-assigned direction is associated with a control command.
- swipe dialogs 130 relate to horizontal swipe commands, including a left swipe 135 to indicate cancelation of the notification action and a right swipe 140 to proceed with the operation of a notification, or vice versa.
- touch command 115 may be associated with single actions or multi-stage actions.
- device 100 may be configured to employ touch command 115 as a single action, such that a left swipe (e.g., cancel) relates to one action and a right swipe relates to another action (e.g., proceed/approve).
- device 100 may be configured to employ touch command 115 as a multi-stage action, such that a left swipe relates to a secondary action and a right swipe relates to a primary action.
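The single-action and multi-stage configurations just described can be sketched as one resolver; the option labels are assumptions drawn from the description above.

```python
# Hypothetical resolver for single-action versus multi-stage swipe commands.

def resolve_action(direction, multi_stage=False):
    """Map a swipe direction to an action under either configuration."""
    single = {"left": "cancel", "right": "proceed"}
    staged = {"left": "secondary", "right": "primary"}
    table = staged if multi_stage else single
    return table.get(direction)  # None for unassigned directions
```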
- FIG. 2 depicts a process for device control according to one or more embodiments.
- process 200 is executed by device (e.g., device 100 , etc.) to control operation based on the direction of a touch command.
- Process 200 includes displaying a user interface including one or more graphical elements on a display of the device at block 205 .
- the user interface includes presentation of at least one of an application and device control screen.
- the user interface can relate to a device operating system interface such as a mobile operating system user interface.
- the user interface at block 205 may relate to a graphical interface for one or more applications.
- process 200 includes presenting a notification window in the user interface by the device as an overlay to the one or more graphical elements.
- the notification window is displayed to include one or more options to control operation of the device, such as a confirm selection and a cancel selection.
- a touch command is detected.
- the touch command may be relative to the display, such as a directional command.
- the direction of the touch command is correlated by the device to one of a plurality of options to control device operation in response to the notification window.
- the touch command is associated with position of display for the notification window. For example, a user may contact the display in the display area associated with the notification window, and perform a swipe in one or more pre-assigned directions (e.g., left, right, up, diagonal, etc.).
- the touch command is non-overlapping to the position of display for the notification window.
- the input command may be detected in a display area within reach of the user to allow for single hand use.
- the touch command can be associated with two opposing directions, wherein the detected touch command relates to a first of the two opposing directions for device control.
- controlling operation of the device includes closing the display window from presentation when a swipe command is detected in a first direction. In another embodiment, controlling operation of the device includes selecting an option to proceed with an operation indicated in the display window for a swipe command detected in a direction assigned to confirm.
- Control of operation at block 220 can also include updating the display to transition the notification window out from the user interface and presenting a graphical representation of the selected command during the updating.
- Process 200 may optionally include presenting a second level message at block 225 .
- some notifications can be associated with a follow-up notification window based on the direction of the touch command to allow for the swipe dialogs to provide a second layer of control following the initial presentation of the notification message.
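Process 200, including the optional second-level message of block 225, might be sketched as follows; the dictionary-based notification model and the follow_up field are illustrative assumptions.

```python
# Sketch of process 200 with an optional follow-up notification window.
# The dictionary-based notification model is an assumption for illustration.

def process_touch(notification, direction):
    """Return (selected_option, follow_up_message_or_None) for a swipe."""
    option = notification["directions"].get(direction)
    if option is None:
        return None, None  # unassigned direction: leave the window in place
    follow_up = notification.get("follow_up", {}).get(option)
    return option, follow_up

notification = {
    "directions": {"left": "cancel", "right": "confirm"},
    "follow_up": {"confirm": "Are you sure?"},  # second-level message
}
result = process_touch(notification, "right")
```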
- FIG. 3 depicts a simplified diagram of a device according to one or more embodiments.
- Device 300 may relate to one or more of a media player, personal communication device, tablet, and electronic device having a touch screen in general.
- device 300 is a standalone device.
- device 300 is a computing device (e.g., computer, media player, etc.) configured to interoperate with another device.
- device 300 includes controller 305 , memory 310 , optional communications unit 315 and user interface 320 .
- Controller 305 may be configured to execute code stored in memory 310 for operation of device 300 including providing directional touch control and/or swipe dialog control.
- controller 305 is configured to control display of a user interface, present a notification window in the user interface, detect a touch command relative to a display of user interface 320 and control operation based on the direction of the touch command.
- controller 305 includes a processor and/or one or more processing elements.
- controller 305 includes one or more of hardware, software, firmware and/or processing components in general.
- controller 305 is configured to perform one or more processes described herein.
- Optional communications unit 315 is configured for wired and/or wireless communication with one or more network elements, such as servers.
- Memory 310 can include non-transitory RAM and/or ROM memory for storing executable instructions, operating instructions and content for display.
- User interface 320 can include one or more input/output interfaces for control and/or communication.
- device 300 relates to a device including a display as part of user interface 320 .
- FIGS. 4A-4C depict graphical representations of device control according to one or more embodiments.
- device 400 is depicted including display 405 and notification 410 .
- notification 410 includes one or more options 415 for proceeding with respect to the message of the notification (e.g., proceed or cancel).
- FIGS. 4B and 4C depict exemplary non-overlapping touch commands which are detected by device 400 .
- touch command 420 relates to a left swipe to display 405 .
- device 400 detects touch command 420 and correlates the command with the “No” option 425 of options 415 .
- device 400 may highlight and/or update the presentation of option 425 to indicate the selection made.
- Device 400 may additionally transition the notification display to fade out, slide out of view and/or close out from display 405 .
- touch command 430 relates to a right swipe to display 405 .
- device 400 detects touch command 430 and correlates the command with the “YES” option 435 of options 415 .
- device 400 may highlight and/or update the presentation of option 435 to indicate the selection made.
- Device 400 may additionally transition the notification display to fade out, slide out of view and/or close out from display 405 .
- Touch commands such as touch command 420 and touch command 430 provide a universal method of closing a dialog notification presented by a device.
- touch command as used herein is not limited to the notification display area, and the touch command can be applied anywhere on display 405 . uses the full screen
- FIG. 5 depicts a graphical representation of device control according to one or more embodiments.
- detection of a touch command during presentation of a notification can prompt a device to present a secondary message.
- Device 500 includes display 505 configured to present notification 510 .
- notification 505 may transition off screen of display in direction 515 and device may present secondary message 520 .
- a secondary message 520 relates to a follow up notification. Secondary message may transition onto display 505 as shown by direction 525 .
- FIG. 5 also depicts transition of notification 510 away from the display window on release of touch command.
- FIG. 6 depicts a graphical representation of device operation according to one or more embodiments.
- notifications may be generated on a device by one or more of a core operating system and applications of the device.
- Device 600 is depicted including core operating system 605 , application 610 and application 615 .
- one more of core operating system 605 , application 610 and application 615 may output a notification for display by device 600 .
- touch commands may be detected and utilized by device 600 for control regardless of the user interface presented.
- application 610 may output a notification while the user interface for core operating system 605 is presented by the device.
- core operating system 605 may output a message for display while the user interface for application 610 is presented.
- device 600 can detect touch commands for control and correlate the touch command to operation. For example, a swipe in a direction pre-assigned to cancel a notification will result in device closing the notification as output 620 of device 600 . In certain embodiments, a swipe in a direction pre-assigned to approve or proceed with a notification will result in device approving the notification as output 620 of device 600 .
Abstract
Description
- This application claims priority to U.S. Provisional Application No. 62/183,613 titled SYSTEM AND METHODS FOR A USER INTERFACE AND DEVICE OPERATION filed on Jun. 23, 2015, and U.S. Provisional Application No. 62/184,476 titled SYSTEM AND METHODS FOR A USER INTERFACE AND DEVICE OPERATION filed on Jun. 25, 2015, the contents of which are expressly incorporated by reference in their entirety.
- The present disclosure relates to electronic devices, and more particularly to device control.
- Mobile devices and personal communication devices are generally used for multiple purposes. These devices are often configured with particular forms of control, such as the inclusion of hard and soft buttons. With development of applications and device capabilities, there exists a need for device configurations that improve performance and resolve drawbacks of the conventional configurations. One area where improvements are needed is for device control configurations.
- Regarding conventional methods, devices may require selection of a particular element. In some instances, the display element is difficult to select based on the size of a device, the display location of the element, a requirement for scrolling to the element, or even a requirement to select a button of the device. In other instances, elements may be hard to select due to configuration of content not intended for display on a device, such as non-mobile network sites designed for computer viewing.
- Presentation of selection prompts is common. However, conventional methods of device control relative to the selection prompt are limited. With some configurations, it may be difficult to close or dismiss the selection prompt. There is a desire to provide additional control functionality for devices, and for controllability with respect to presented selection prompts.
- Disclosed and claimed herein are methods for device control and device configurations. One embodiment is directed to a method for device control that includes displaying, by a device, a user interface including one or more graphical elements on a display of the device, presenting, by the device, a notification window in the user interface as an overlay to the one or more graphical elements, wherein the notification window is displayed to include one or more options to control operation of the device, and detecting, by the device, a touch command relative to the display, wherein the touch command is a directional command, and wherein a direction of the touch command is correlated by the device to one of a plurality of options to control device operation in response to the notification window. The method also includes controlling, by the device, operation of the device based on the direction of the touch command.
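The claimed method steps can be sketched as a small correlation routine. This is an illustrative sketch only: the `NotificationWindow` shape, the option names, and the `controlDevice` function are assumptions made for this example, not part of the disclosure.

```typescript
// Illustrative sketch of the claimed control flow: a notification window is
// overlaid on the user interface with options pre-assigned to swipe
// directions, and the direction of a detected touch command selects the
// corresponding option. All names here are hypothetical.
type Direction = "left" | "right";

interface NotificationWindow {
  message: string;
  // Options pre-assigned to directions, e.g. left => cancel, right => proceed.
  options: Record<Direction, string>;
}

// Correlates the direction of a detected touch command to one of the
// notification's options and returns the selected option.
function controlDevice(win: NotificationWindow, dir: Direction): string {
  return win.options[dir];
}
```

For example, a right swipe on a "Proceed?" notification configured with `{ left: "cancel", right: "proceed" }` would select the "proceed" option.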
- In one embodiment, the user interface includes presentation of at least one of an application and device control screen.
- In one embodiment, the notification window includes a confirm selection and a cancel selection.
- In one embodiment, the touch command is associated with position of display for the notification window.
- In one embodiment, the touch command is non-overlapping to the position of display for the notification window.
- In one embodiment, the touch command is a first of two opposing directions for device control.
- In one embodiment, controlling operation of the device includes closing the display window from presentation when a swipe command is detected in a first direction.
- In one embodiment, controlling operation of the device includes selecting an option to proceed with an operation indicated in the display window for a swipe command detected in a direction assigned to confirm.
- In one embodiment, the method also includes presenting a follow-up notification window based on the direction of the touch command.
- In one embodiment, the method includes updating the display to transition the notification window out from the user interface and presenting a graphical representation of the selected command during the updating.
- Another embodiment is directed to a device including a display and a controller coupled to the display. The controller is configured to control display of a user interface including one or more graphical elements on the display, present a notification window in the user interface as an overlay to the one or more graphical elements, wherein the notification window is displayed to include one or more options to control operation of the device, and detect a touch command relative to the display, wherein the touch command is a directional command, and wherein a direction of the touch command is correlated by the device to one of a plurality of options to control device operation in response to the notification window. The controller is also configured to control operation of the device based on the direction of the touch command.
- Other aspects, features, and techniques will be apparent to one skilled in the relevant art in view of the following detailed description of the embodiments.
- The features, objects, and advantages of the present disclosure will become more apparent from the detailed description set forth below when taken in conjunction with the drawings in which like reference characters identify correspondingly throughout and wherein:
-
FIG. 1A depicts a graphical representation of device control according to one or more embodiments; -
FIG. 1B depicts a graphical representation of device control according to one or more other embodiments; -
FIG. 2 depicts a process for device control according to one or more embodiments; -
FIG. 3 depicts a simplified diagram of a device according to one or more embodiments; -
FIGS. 4A-4C depict graphical representations of device control according to one or more embodiments; -
FIG. 5 depicts a graphical representation of device control according to one or more embodiments; and -
FIG. 6 depicts a graphical representation of device operation according to one or more embodiments. - One aspect of the disclosure is directed to improving control of a device. In one embodiment, a process is provided for presenting a user interface and providing a command configuration for interaction with displayed elements based on directional touch commands. In addition, a control mechanism is provided for interaction across an operating system platform and applications of the device. In that fashion, a global control input is provided for a device to allow for a command to be executed across applications and the device operating system. In contrast to limiting display element interaction to require selection of a particular item, touch commands or swipe dialogs are provided for interaction with displayed notifications, such as notification windows, messages, alerts, etc. Control features discussed herein allow for commands directed to notifications to be “swipeable” to allow for single hand use.
- Methods and device configurations are provided for touch command device control. In one embodiment, swipe dialogs relate to touch commands with one or more pre-assigned directions. The touch command directions are pre-assigned to options of a notification message. In one embodiment, touch commands are configured to provide swipe dialogs for Yes/No instances. According to another embodiment, touch commands are configured to provide interaction with a series of notification messages and/or multiple options. Swipe dialog features can also provide a swipe system that does not require selection of the desired action displayed and can provide an additional and/or alternative configuration for closing notifications.
- As used herein, the terms “a” or “an” shall mean one or more than one. The term “plurality” shall mean two or more than two. The term “another” is defined as a second or more. The terms “including” and/or “having” are open ended (e.g., comprising). The term “or” as used herein is to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” means “any of the following: A; B; C; A and B; A and C; B and C; A, B and C”. An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.
- Reference throughout this document to “one embodiment,” “certain embodiments,” “an embodiment,” or similar term means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner on one or more embodiments without limitation.
- Referring now to the figures,
FIGS. 1A-1B depict graphical representations of device control according to one or more embodiments. According to one embodiment, a device is configured to detect a touch command and control operation of the device associated with a displayed notification window. Thus, in contrast to requiring a selection within the display window, the device can provide functionality that allows for handling of the notification in one or more ways based on interactions with a display of the device. As will be discussed herein, touch commands may be associated with the displayed notification. In other embodiments, touch commands are relative to a display area different from the display position of the notification. - In
FIG. 1A, a graphical representation of device control for device 100 is depicted. Device 100 includes display 105, which is configured for touch operation. According to one embodiment, device 100 presents notification 110. Notification 110 may be a pop-up window, notification window, or display message presented by an operating system and/or application of device 100. In certain embodiments, notification 110 can include one or more options or selections to direct operation. By way of example, notification 110 can include selectable elements to approve an operation and/or cancel an operation. FIG. 1A also depicts a touch command 115 input by user 120. Touch command 115 is shown for illustration; the directional arrow for touch command 115 represents an exemplary direction of a command, which can be located at one or more positions on display 105. In that fashion, touch command 115 does not need to overlay notification 110 or be non-overlapping to notification 110. In certain embodiments, device 100 may be configured to detect touch command 115 when the command overlaps the display area of notification 110. In other embodiments, device 100 may be configured to detect touch command 115 when the command is non-overlapping the display area of notification 110. Based on detection of touch command 115 and the direction of touch command 115, device 100 controls operation. As such, device 100 provides swipe dialogs for device control. -
FIG. 1B depicts a graphical representation of device 100 following the detection of touch command 115 according to one or more other embodiments. According to one embodiment, following touch command 115, device 100 updates display 105 based on correlation of the touch command with options available in notification 110. By way of example, and in one embodiment, a left swipe, such as touch command 115, can be correlated to canceling (e.g., not proceeding with) the function presented by notification 110. According to one embodiment, updating the presentation of display 105 following touch command 115 may include presenting a graphical representation 125 of the correlated command. According to another embodiment, display 105 may be controlled to transition the presentation of notification 110 off of the screen, such as a slide off screen associated with the direction of touch command 115 and/or a fade out from display. - According to one embodiment, each notification presented by
device 100 may be interacted with by swipe dialogs 130. According to one embodiment, swipe dialogs relate to pre-assigned directions such that a touch command associated with a pre-assigned direction is associated with a control command. In one embodiment, swipe dialogs 130 relate to horizontal swipe commands including a left swipe 135 to indicate cancelation of the notification action and a right swipe 140 to proceed with the operation of a notification, or vice versa. - According to another embodiment,
touch command 115 may be associated with single actions or multi-stage actions. In one embodiment, device 100 may be configured to employ touch command 115 as a single action, such that a left swipe (e.g., cancel) relates to one action and a right swipe relates to another action (e.g., proceed/approve). In another embodiment, device 100 may be configured to employ touch command 115 as a multi-stage action, such that a left swipe relates to a secondary action and a right swipe relates to a primary action. -
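The single-action and multi-stage behaviors described above can be sketched as a small dispatch function. The `mode` flag and the action names are assumptions made for illustration; the disclosure does not prescribe this model.

```typescript
// Hypothetical sketch of single-action versus multi-stage swipe handling.
// In single mode a left swipe cancels and a right swipe proceeds; in
// multi-stage mode a right swipe triggers the primary action and a left
// swipe a secondary action.
type Direction = "left" | "right";

interface SwipeConfig {
  mode: "single" | "multi-stage";
}

function resolveSwipe(cfg: SwipeConfig, dir: Direction): string {
  if (cfg.mode === "single") {
    return dir === "left" ? "cancel" : "proceed";
  }
  return dir === "right" ? "primary-action" : "secondary-action";
}
```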
FIG. 2 depicts a process for device control according to one or more embodiments. According to one embodiment, process 200 is executed by a device (e.g., device 100, etc.) to control operation based on the direction of a touch command. Process 200 includes displaying a user interface including one or more graphical elements on a display of the device at block 205. In one embodiment, the user interface includes presentation of at least one of an application and device control screen. The user interface can relate to a device operating system interface, such as a mobile operating system user interface. The user interface at block 205 may relate to a graphical interface for one or more applications. - At
block 210, process 200 includes presenting a notification window in the user interface by the device as an overlay to the one or more graphical elements. The notification window is displayed to include one or more options to control operation of the device, such as a confirm selection and a cancel selection. - At
block 215, a touch command is detected. The touch command may be relative to the display, such as a directional command. The direction of the touch command is correlated by the device to one of a plurality of options to control device operation in response to the notification window. In one embodiment, the touch command is associated with the position of display for the notification window. For example, a user may contact the display in the display area associated with the notification window and perform a swipe in one or more pre-assigned directions (e.g., left, right, up, diagonal, etc.). In other embodiments, the touch command is non-overlapping to the position of display for the notification window. Thus, when a device displays the notification area in an area that requires two-hand use, the input command may be detected in a display area within reach of the user to allow for single-hand use. The touch command can be associated with two opposing directions, wherein the detected touch command relates to a first of the two opposing directions for device control. - At
block 220 the device controls operation based on the direction of the touch command. In one embodiment, controlling operation of the device includes closing the display window from presentation when a swipe command is detected in a first direction. In another embodiment, controlling operation of the device includes selecting an option to proceed with an operation indicated in the display window for a swipe command detected in a direction assigned to confirm. - Control of operation at
block 220 can also include updating the display to transition the notification window out from the user interface and presenting a graphical representation of the selected command during the updating. -
Process 200 may optionally include presenting a second-level message at block 225. For example, some notifications can be associated with a follow-up notification window based on the direction of the touch command, allowing the swipe dialogs to provide a second layer of control following the initial presentation of the notification message. -
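One way to model the optional second-level message at block 225 is to let each dialog declare a follow-up message keyed by the selected option. This sketch and all of its names are illustrative assumptions, not the disclosed implementation.

```typescript
// Hypothetical model of the follow-up notification at block 225: a dialog
// may carry follow-up text for some actions; choosing such an action yields
// the second-level message, while other actions simply close the dialog.
type Action = "cancel" | "proceed";

interface Dialog {
  message: string;
  followUp?: Partial<Record<Action, string>>;
}

// Returns the next message to present, or null when the dialog just closes.
function resolveDialog(dialog: Dialog, action: Action): string | null {
  return dialog.followUp?.[action] ?? null;
}
```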
FIG. 3 depicts a simplified diagram of a device according to one or more embodiments. Device 300 may relate to one or more of a media player, personal communication device, tablet, and electronic device having a touch screen in general. In certain embodiments, device 300 is a standalone device. In other embodiments, device 300 is a computing device (e.g., computer, media player, etc.) configured to interoperate with another device. - As shown in
FIG. 3, device 300 includes controller 305, memory 310, optional communications unit 315 and user interface 320. Controller 305 may be configured to execute code stored in memory 310 for operation of device 300, including providing directional touch control and/or swipe dialog control. In an exemplary embodiment, controller 305 is configured to control display of a user interface, present a notification window in the user interface, detect a touch command relative to a display of user interface 320, and control operation based on the direction of the touch command. - According to one embodiment,
controller 305 includes a processor and/or one or more processing elements. In one embodiment, controller 305 includes one or more of hardware, software, firmware and/or processing components in general. According to one embodiment, controller 305 is configured to perform one or more processes described herein. Optional communications unit 315 is configured for wired and/or wireless communication with one or more network elements, such as servers. Memory 310 can include non-transitory RAM and/or ROM memory for storing executable instructions, operating instructions and content for display. User interface 320 can include one or more input/output interfaces for control and/or communication. In certain embodiments, device 300 relates to a device including a display as part of user interface 320. -
FIGS. 4A-4C depict graphical representations of device control according to one or more embodiments. In FIG. 4A, device 400 is depicted including display 405 and notification 410. According to one embodiment, notification 410 includes one or more options 415 for proceeding with respect to the message of the notification (e.g., proceed or cancel). FIGS. 4B and 4C depict exemplary non-overlapping touch commands which are detected by device 400. - In
FIG. 4B, touch command 420 relates to a left swipe to display 405. According to one embodiment, device 400 detects touch command 420 and correlates the command with the "No" option 425 of options 415. In certain embodiments, device 400 may highlight and/or update the presentation of option 425 to indicate the selection made. Device 400 may additionally transition the notification display to fade out, slide out of view and/or close out from display 405. - In
FIG. 4C, touch command 430 relates to a right swipe to display 405. According to one embodiment, device 400 detects touch command 430 and correlates the command with the "YES" option 435 of options 415. In certain embodiments, device 400 may highlight and/or update the presentation of option 435 to indicate the selection made. Device 400 may additionally transition the notification display to fade out, slide out of view and/or close out from display 405. -
touch command 420 andtouch command 430 provide a universal method of closing a dialog notification presented by a device. As can be seen inFIGS. 4A-4C , the touch command as used herein is not limited to the notification display area, and the touch command can be applied anywhere ondisplay 405. uses the full screen -
FIG. 5 depicts a graphical representation of device control according to one or more embodiments. According to one embodiment, detection of a touch command during presentation of a notification can prompt a device to present a secondary message. Device 500 includes display 505 configured to present notification 510. Based on a touch command, notification 510 may transition off the screen of display 505 in direction 515, and the device may present secondary message 520. According to one embodiment, secondary message 520 relates to a follow-up notification. Secondary message 520 may transition onto display 505 as shown by direction 525. FIG. 5 also depicts transition of notification 510 away from the display window on release of the touch command. -
FIG. 6 depicts a graphical representation of device operation according to one or more embodiments. According to one embodiment, notifications may be generated on a device by one or more of a core operating system and applications of the device. Device 600 is depicted including core operating system 605, application 610 and application 615. According to one embodiment, one or more of core operating system 605, application 610 and application 615 may output a notification for display by device 600. According to one embodiment, touch commands may be detected and utilized by device 600 for control regardless of the user interface presented. For example, application 610 may output a notification while the user interface for core operating system 605 is presented by the device. Alternatively, core operating system 605 may output a message for display while the user interface for application 610 is presented. During presentation of the notification, device 600 can detect touch commands for control and correlate the touch command to operation. For example, a swipe in a direction pre-assigned to cancel a notification will result in the device closing the notification as output 620 of device 600. In certain embodiments, a swipe in a direction pre-assigned to approve or proceed with a notification will result in the device approving the notification as output 620 of device 600. - While this disclosure has been particularly shown and described with references to exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the claimed embodiments.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/169,642 US20160378279A1 (en) | 2015-06-23 | 2016-05-31 | System and methods for device control |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562183613P | 2015-06-23 | 2015-06-23 | |
US201562184476P | 2015-06-25 | 2015-06-25 | |
US15/169,642 US20160378279A1 (en) | 2015-06-23 | 2016-05-31 | System and methods for device control |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160378279A1 true US20160378279A1 (en) | 2016-12-29 |
Family
ID=57600988
Family Applications (8)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/053,501 Abandoned US20170024086A1 (en) | 2015-06-23 | 2016-02-25 | System and methods for detection and handling of focus elements |
US15/133,859 Active 2036-12-30 US10310706B2 (en) | 2015-06-23 | 2016-04-20 | System and methods for touch target presentation |
US15/133,870 Active 2036-12-06 US10241649B2 (en) | 2015-06-23 | 2016-04-20 | System and methods for application discovery and trial |
US15/133,846 Abandoned US20160381287A1 (en) | 2015-06-23 | 2016-04-20 | System and methods for controlling device operation and image capture |
US15/169,642 Abandoned US20160378279A1 (en) | 2015-06-23 | 2016-05-31 | System and methods for device control |
US15/169,634 Active 2037-01-10 US10331300B2 (en) | 2015-06-23 | 2016-05-31 | Device and methods for control including presentation of a list of selectable display elements |
US15/190,145 Abandoned US20160378281A1 (en) | 2015-06-23 | 2016-06-22 | System and methods for navigation bar presentation and device control |
US15/190,144 Active 2037-05-03 US10222947B2 (en) | 2015-06-23 | 2016-06-22 | Methods and devices for presenting dynamic information graphics |
Family Applications Before (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/053,501 Abandoned US20170024086A1 (en) | 2015-06-23 | 2016-02-25 | System and methods for detection and handling of focus elements |
US15/133,859 Active 2036-12-30 US10310706B2 (en) | 2015-06-23 | 2016-04-20 | System and methods for touch target presentation |
US15/133,870 Active 2036-12-06 US10241649B2 (en) | 2015-06-23 | 2016-04-20 | System and methods for application discovery and trial |
US15/133,846 Abandoned US20160381287A1 (en) | 2015-06-23 | 2016-04-20 | System and methods for controlling device operation and image capture |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/169,634 Active 2037-01-10 US10331300B2 (en) | 2015-06-23 | 2016-05-31 | Device and methods for control including presentation of a list of selectable display elements |
US15/190,145 Abandoned US20160378281A1 (en) | 2015-06-23 | 2016-06-22 | System and methods for navigation bar presentation and device control |
US15/190,144 Active 2037-05-03 US10222947B2 (en) | 2015-06-23 | 2016-06-22 | Methods and devices for presenting dynamic information graphics |
Country Status (1)
Country | Link |
---|---|
US (8) | US20170024086A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109254719A (en) * | 2018-08-24 | 2019-01-22 | 维沃移动通信有限公司 | A kind of processing method and mobile terminal of display interface |
US11043206B2 (en) | 2017-05-18 | 2021-06-22 | Aiqudo, Inc. | Systems and methods for crowdsourced actions and commands |
US11056105B2 (en) | 2017-05-18 | 2021-07-06 | Aiqudo, Inc | Talk back from actions in applications |
US11340925B2 (en) | 2017-05-18 | 2022-05-24 | Peloton Interactive Inc. | Action recipes for a crowdsourced digital assistant system |
US11520610B2 (en) | 2017-05-18 | 2022-12-06 | Peloton Interactive Inc. | Crowdsourced on-boarding of digital assistant operations |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10155168B2 (en) | 2012-05-08 | 2018-12-18 | Snap Inc. | System and method for adaptable avatars |
USD738889S1 (en) * | 2013-06-09 | 2015-09-15 | Apple Inc. | Display screen or portion thereof with animated graphical user interface |
USD771112S1 (en) * | 2014-06-01 | 2016-11-08 | Apple Inc. | Display screen or portion thereof with graphical user interface |
US10339365B2 (en) | 2016-03-31 | 2019-07-02 | Snap Inc. | Automated avatar generation |
US10261666B2 (en) * | 2016-05-31 | 2019-04-16 | Microsoft Technology Licensing, Llc | Context-independent navigation of electronic content |
US10432559B2 (en) | 2016-10-24 | 2019-10-01 | Snap Inc. | Generating and displaying customized avatars in electronic messages |
CN106354418B (en) * | 2016-11-16 | 2019-07-09 | 腾讯科技(深圳)有限公司 | A kind of control method and device based on touch screen |
US10484675B2 (en) * | 2017-04-16 | 2019-11-19 | Facebook, Inc. | Systems and methods for presenting content |
US10212541B1 (en) | 2017-04-27 | 2019-02-19 | Snap Inc. | Selective location-based identity communication |
CN110945555A (en) | 2017-04-27 | 2020-03-31 | 斯纳普公司 | Region-level representation of user locations on a social media platform |
US11893647B2 (en) | 2017-04-27 | 2024-02-06 | Snap Inc. | Location-based virtual avatars |
US11397558B2 (en) * | 2017-05-18 | 2022-07-26 | Peloton Interactive, Inc. | Optimizing display engagement in action automation |
US10572107B1 (en) * | 2017-06-23 | 2020-02-25 | Amazon Technologies, Inc. | Voice communication targeting user interface |
US11379550B2 (en) * | 2017-08-29 | 2022-07-05 | Paypal, Inc. | Seamless service on third-party sites |
WO2019047189A1 (en) * | 2017-09-08 | 2019-03-14 | 广东欧珀移动通信有限公司 | Message display method and device and terminal |
WO2019047184A1 (en) * | 2017-09-08 | 2019-03-14 | 广东欧珀移动通信有限公司 | Information display method, apparatus, and terminal |
CN107547750B (en) * | 2017-09-11 | 2019-01-25 | Oppo广东移动通信有限公司 | Control method, device and the storage medium of terminal |
US11307760B2 (en) * | 2017-09-25 | 2022-04-19 | Huawei Technologies Co., Ltd. | Terminal interface display method and terminal |
US11416126B2 (en) * | 2017-12-20 | 2022-08-16 | Huawei Technologies Co., Ltd. | Control method and apparatus |
CN110442407B (en) * | 2018-05-03 | 2021-11-26 | 腾讯科技(深圳)有限公司 | Application program processing method and device |
US10936163B2 (en) * | 2018-07-17 | 2021-03-02 | Methodical Mind, Llc. | Graphical user interface system |
US11423073B2 (en) * | 2018-11-16 | 2022-08-23 | Microsoft Technology Licensing, Llc | System and management of semantic indicators during document presentations |
KR102657519B1 (en) * | 2019-02-08 | 2024-04-15 | 삼성전자주식회사 | Electronic device for providing graphic data based on voice and operating method thereof |
CN110213729B (en) * | 2019-05-30 | 2022-06-24 | 维沃移动通信有限公司 | Message sending method and terminal |
US11537269B2 (en) * | 2019-12-27 | 2022-12-27 | Methodical Mind, Llc. | Graphical user interface system |
CN112433661B (en) * | 2020-11-18 | 2022-02-11 | 上海幻电信息科技有限公司 | Interactive object selection method and device |
Family Cites Families (97)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5708709A (en) * | 1995-12-08 | 1998-01-13 | Sun Microsystems, Inc. | System and method for managing try-and-buy usage of application programs |
US7146381B1 (en) * | 1997-02-10 | 2006-12-05 | Actioneer, Inc. | Information organization and collaboration tool for processing notes and action requests in computer systems |
US5886698A (en) * | 1997-04-21 | 1999-03-23 | Sony Corporation | Method for filtering search results with a graphical squeegee |
US6839669B1 (en) * | 1998-11-05 | 2005-01-04 | Scansoft, Inc. | Performing actions identified in recognized speech |
US7434177B1 (en) * | 1999-12-20 | 2008-10-07 | Apple Inc. | User interface for providing consolidation and access |
US7730401B2 (en) * | 2001-05-16 | 2010-06-01 | Synaptics Incorporated | Touch screen with user interface enhancement |
AU2002359001A1 (en) * | 2001-12-28 | 2003-07-24 | Access Co., Ltd. | Usage period management system for applications |
US7787908B2 (en) * | 2002-03-19 | 2010-08-31 | Qualcomm Incorporated | Multi-call display management for wireless communication devices |
WO2004042515A2 (en) * | 2002-11-01 | 2004-05-21 | Pocketpurchase, Inc. | Method and system for online software purchases |
JP4215549B2 (en) * | 2003-04-02 | 2009-01-28 | 富士通株式会社 | Information processing device that operates in touch panel mode and pointing device mode |
KR101157016B1 (en) * | 2003-08-06 | 2012-06-21 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | A method of presenting a plurality of items |
US20050055309A1 (en) * | 2003-09-04 | 2005-03-10 | Dwango North America | Method and apparatus for a one click upgrade for mobile applications |
US8271495B1 (en) * | 2003-12-17 | 2012-09-18 | Topix Llc | System and method for automating categorization and aggregation of content from network sites |
US20060063590A1 (en) * | 2004-09-21 | 2006-03-23 | Paul Abassi | Mechanism to control game usage on user devices |
US8102973B2 (en) * | 2005-02-22 | 2012-01-24 | Raytheon Bbn Technologies Corp. | Systems and methods for presenting end to end calls and associated information |
US9727082B2 (en) * | 2005-04-26 | 2017-08-08 | Apple Inc. | Back-side interface for hand-held devices |
US7605804B2 (en) * | 2005-04-29 | 2009-10-20 | Microsoft Corporation | System and method for fine cursor positioning using a low resolution imaging touch screen |
US8818331B2 (en) * | 2005-04-29 | 2014-08-26 | Jasper Technologies, Inc. | Method for enabling a wireless device for geographically preferential services |
GB0522079D0 (en) * | 2005-10-29 | 2005-12-07 | Griffin Ian | Mobile game or program distribution |
EP1796000A1 (en) * | 2005-12-06 | 2007-06-13 | International Business Machines Corporation | Method, system and computer program for distributing software products in trial mode |
US7958456B2 (en) * | 2005-12-23 | 2011-06-07 | Apple Inc. | Scrolling list with floating adjacent index symbols |
US20070233782A1 (en) * | 2006-03-28 | 2007-10-04 | Silentclick, Inc. | Method & system for acquiring, storing, & managing software applications via a communications network |
US9395905B2 (en) * | 2006-04-05 | 2016-07-19 | Synaptics Incorporated | Graphical scroll wheel |
WO2007138423A2 (en) * | 2006-05-25 | 2007-12-06 | Shuki Binyamin | Method and system for providing remote access to applications |
US8611521B2 (en) * | 2006-07-07 | 2013-12-17 | Verizon Services Corp. | Systems and methods for multi-media control of audio conferencing |
CN101568894B (en) * | 2006-10-23 | 2012-07-18 | 吴谊镇 | Input device |
US7961860B1 (en) * | 2006-11-22 | 2011-06-14 | Securus Technologies, Inc. | Systems and methods for graphically displaying and analyzing call treatment operations |
US20090037287A1 (en) * | 2007-07-31 | 2009-02-05 | Ahmad Baitalmal | Software Marketplace and Distribution System |
US11126321B2 (en) * | 2007-09-04 | 2021-09-21 | Apple Inc. | Application menu user interface |
KR20100133945A (en) * | 2007-11-05 | 2010-12-22 | 비스토 코포레이션 | Service management system for providing service related message prioritization in a mobile client |
WO2009075602A1 (en) * | 2007-12-13 | 2009-06-18 | Motorola, Inc. | Scenarios creation system for a mobile device |
JPWO2010032354A1 (en) * | 2008-09-22 | 2012-02-02 | NEC Corporation | Image object control system, image object control method and program |
US8650290B2 (en) * | 2008-12-19 | 2014-02-11 | Openpeak Inc. | Portable computing device and method of operation of same |
US8370762B2 (en) * | 2009-04-10 | 2013-02-05 | Cellco Partnership | Mobile functional icon use in operational area in touch panel devices |
US20100280892A1 (en) * | 2009-04-30 | 2010-11-04 | Alcatel-Lucent Usa Inc. | Method and system for targeted offers to mobile users |
US20100277422A1 (en) * | 2009-04-30 | 2010-11-04 | Microsoft Corporation | Touchpad display |
US8346847B2 (en) * | 2009-06-03 | 2013-01-01 | Apple Inc. | Installing applications based on a seed application from a separate device |
US8448136B2 (en) * | 2009-06-25 | 2013-05-21 | Intuit Inc. | Creating a composite program module in a computing ecosystem |
US20110087975A1 (en) * | 2009-10-13 | 2011-04-14 | Sony Ericsson Mobile Communications Ab | Method and arrangement in a data |
US8370142B2 (en) * | 2009-10-30 | 2013-02-05 | Zipdx, Llc | Real-time transcription of conference calls |
US20110202864A1 (en) * | 2010-02-15 | 2011-08-18 | Hirsch Michael B | Apparatus and methods of receiving and acting on user-entered information |
US9912721B2 (en) * | 2010-05-14 | 2018-03-06 | Highlight Broadcast Network, Llc | Systems and methods for providing event-related video sharing services |
US20110295708A1 (en) * | 2010-05-25 | 2011-12-01 | beonSoft Inc. | Systems and methods for providing software rental services to devices connected to a network |
US8650558B2 (en) * | 2010-05-27 | 2014-02-11 | Rightware, Inc. | Online marketplace for pre-installed software and online services |
US20110307354A1 (en) * | 2010-06-09 | 2011-12-15 | Bilgehan Erman | Method and apparatus for recommending applications to mobile users |
US9864501B2 (en) * | 2010-07-30 | 2018-01-09 | Apaar Tuli | Displaying information |
US9936333B2 (en) * | 2010-08-10 | 2018-04-03 | Microsoft Technology Licensing, Llc | Location and contextual-based mobile application promotion and delivery |
US8615772B2 (en) * | 2010-09-28 | 2013-12-24 | Qualcomm Incorporated | Apparatus and methods of extending application services |
CN103348307B (en) * | 2010-12-08 | 2018-09-28 | 诺基亚技术有限公司 | User interface |
EP2469425A3 (en) * | 2010-12-21 | 2012-11-14 | Research In Motion Limited | Contextual customization of content display on a communication device |
US8612874B2 (en) * | 2010-12-23 | 2013-12-17 | Microsoft Corporation | Presenting an application change through a tile |
US20120209586A1 (en) * | 2011-02-16 | 2012-08-16 | Salesforce.Com, Inc. | Contextual Demonstration of Applications Hosted on Multi-Tenant Database Systems |
US20120246588A1 (en) * | 2011-03-21 | 2012-09-27 | Viacom International, Inc. | Cross marketing tool |
JP2012212230A (en) * | 2011-03-30 | 2012-11-01 | Toshiba Corp | Electronic apparatus |
US8656315B2 (en) * | 2011-05-27 | 2014-02-18 | Google Inc. | Moving a graphical selector |
US8826190B2 (en) * | 2011-05-27 | 2014-09-02 | Google Inc. | Moving a graphical selector |
GB201109339D0 (en) * | 2011-06-03 | 2011-07-20 | Firestorm Lab Ltd | Computing device interface |
US9053750B2 (en) * | 2011-06-17 | 2015-06-09 | At&T Intellectual Property I, L.P. | Speaker association with a visual representation of spoken content |
US8577737B1 (en) * | 2011-06-20 | 2013-11-05 | A9.Com, Inc. | Method, medium, and system for application lending |
US20130016129A1 (en) * | 2011-07-14 | 2013-01-17 | Google Inc. | Region-Specific User Input |
JP5295328B2 (en) * | 2011-07-29 | 2013-09-18 | Kddi株式会社 | User interface device capable of input by screen pad, input processing method and program |
DE102011118367B4 (en) * | 2011-08-24 | 2017-02-09 | Deutsche Telekom Ag | Method for authenticating a telecommunication terminal comprising an identity module at a server device of a telecommunication network, use of an identity module, identity module and computer program |
JP2013073330A (en) * | 2011-09-27 | 2013-04-22 | Nec Casio Mobile Communications Ltd | Portable electronic apparatus, touch area setting method and program |
US8713560B2 (en) * | 2011-12-22 | 2014-04-29 | Sap Ag | Compatibility check |
TWI470475B (en) * | 2012-04-17 | 2015-01-21 | Pixart Imaging Inc | Electronic system |
CN102707882A (en) * | 2012-04-27 | 2012-10-03 | 深圳瑞高信息技术有限公司 | Method for converting control modes of application program of touch screen with virtual icons and touch screen terminal |
US20130326499A1 (en) * | 2012-05-31 | 2013-12-05 | Microsoft Corporation | Automatically installing and removing recommended applications |
JP6071107B2 (en) * | 2012-06-14 | 2017-02-01 | 裕行 池田 | Mobile device |
KR20140016454A (en) * | 2012-07-30 | 2014-02-10 | 삼성전자주식회사 | Method and apparatus for controlling drag for moving object of mobile terminal comprising touch screen |
US9280789B2 (en) * | 2012-08-17 | 2016-03-08 | Google Inc. | Recommending native applications |
JP2014048936A (en) * | 2012-08-31 | 2014-03-17 | Omron Corp | Gesture recognition device, control method thereof, display equipment, and control program |
KR20140033839A (en) * | 2012-09-11 | 2014-03-19 | 삼성전자주식회사 | Method for user's interface using one hand in terminal having touchscreen and device thereof |
WO2014041732A1 (en) * | 2012-09-13 | 2014-03-20 | Panasonic Corporation | Portable electronic device |
US20140109016A1 (en) * | 2012-10-16 | 2014-04-17 | Yu Ouyang | Gesture-based cursor control |
US20140184503A1 (en) * | 2013-01-02 | 2014-07-03 | Samsung Display Co., Ltd. | Terminal and method for operating the same |
US20140278860A1 (en) * | 2013-03-15 | 2014-09-18 | Samsung Electronics Co., Ltd. | Content delivery system with content sharing mechanism and method of operation thereof |
US9477404B2 (en) * | 2013-03-15 | 2016-10-25 | Apple Inc. | Device, method, and graphical user interface for managing concurrently open software applications |
US20140330647A1 (en) * | 2013-05-03 | 2014-11-06 | International Business Machines Corporation | Application and service selection for optimized promotion |
US20140344041A1 (en) * | 2013-05-20 | 2014-11-20 | Cellco Partnership D/B/A Verizon Wireless | Triggered mobile checkout application |
US8786569B1 (en) * | 2013-06-04 | 2014-07-22 | Morton Silverberg | Intermediate cursor touchscreen protocols |
KR102136602B1 (en) * | 2013-07-10 | 2020-07-22 | 삼성전자 주식회사 | Apparatus and method for processing a content in mobile device |
US9098366B1 (en) * | 2013-07-11 | 2015-08-04 | Sprint Communications Company L.P. | Virtual pre-installation of applications |
US10698930B2 (en) * | 2013-08-22 | 2020-06-30 | Sensoriant, Inc. | Assignment of application (apps) and relevant services to specific locations, dates and times |
KR102009279B1 (en) * | 2013-09-13 | 2019-08-09 | 엘지전자 주식회사 | Mobile terminal |
CN114895839A (en) * | 2014-01-06 | 2022-08-12 | 华为终端有限公司 | Application program display method and terminal |
CN104793774A (en) * | 2014-01-20 | 2015-07-22 | 联发科技(新加坡)私人有限公司 | Electronic device control method |
KR102105961B1 (en) * | 2014-05-13 | 2020-05-28 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US20150331589A1 (en) * | 2014-05-15 | 2015-11-19 | Todd KAWAKITA | Circular interface for navigating applications and an authentication mechanism on a wearable device |
JP6328797B2 (en) * | 2014-05-30 | 2018-05-23 | アップル インコーポレイテッド | Transition from using one device to using another device |
US9479412B2 (en) * | 2014-06-27 | 2016-10-25 | Agora Lab, Inc. | Systems and methods for improved quality of a visualized call over network through pathway testing |
KR20160026141A (en) * | 2014-08-29 | 2016-03-09 | 삼성전자주식회사 | Controlling Method based on a communication status and Electronic device supporting the same |
US20170220782A1 (en) * | 2014-09-08 | 2017-08-03 | Ali ALSANOUSI | Mobile interface platform systems and methods |
US10176306B2 (en) * | 2014-12-16 | 2019-01-08 | JVC Kenwood Corporation | Information processing apparatus, evaluation method, and storage medium for evaluating application program |
US10169474B2 (en) * | 2015-06-11 | 2019-01-01 | International Business Machines Corporation | Mobile application discovery using an electronic map |
US10628559B2 (en) * | 2015-06-23 | 2020-04-21 | Microsoft Technology Licensing, Llc | Application management |
KR20170029329A (en) * | 2015-09-07 | 2017-03-15 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US20170337214A1 (en) * | 2016-05-18 | 2017-11-23 | Linkedin Corporation | Synchronizing nearline metrics with sources of truth |
2016
- 2016-02-25 US US15/053,501 patent/US20170024086A1/en not_active Abandoned
- 2016-04-20 US US15/133,859 patent/US10310706B2/en active Active
- 2016-04-20 US US15/133,870 patent/US10241649B2/en active Active
- 2016-04-20 US US15/133,846 patent/US20160381287A1/en not_active Abandoned
- 2016-05-31 US US15/169,642 patent/US20160378279A1/en not_active Abandoned
- 2016-05-31 US US15/169,634 patent/US10331300B2/en active Active
- 2016-06-22 US US15/190,145 patent/US20160378281A1/en not_active Abandoned
- 2016-06-22 US US15/190,144 patent/US10222947B2/en active Active
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11043206B2 (en) | 2017-05-18 | 2021-06-22 | Aiqudo, Inc. | Systems and methods for crowdsourced actions and commands |
US11056105B2 (en) | 2017-05-18 | 2021-07-06 | Aiqudo, Inc | Talk back from actions in applications |
US11340925B2 (en) | 2017-05-18 | 2022-05-24 | Peloton Interactive Inc. | Action recipes for a crowdsourced digital assistant system |
US11520610B2 (en) | 2017-05-18 | 2022-12-06 | Peloton Interactive Inc. | Crowdsourced on-boarding of digital assistant operations |
US11682380B2 (en) | 2017-05-18 | 2023-06-20 | Peloton Interactive Inc. | Systems and methods for crowdsourced actions and commands |
US11862156B2 (en) | 2017-05-18 | 2024-01-02 | Peloton Interactive, Inc. | Talk back from actions in applications |
CN109254719A (en) * | 2018-08-24 | 2019-01-22 | Vivo Mobile Communication Co., Ltd. | Processing method for a display interface and mobile terminal |
Also Published As
Publication number | Publication date |
---|---|
US20160381287A1 (en) | 2016-12-29 |
US20170024086A1 (en) | 2017-01-26 |
US10310706B2 (en) | 2019-06-04 |
US20160378321A1 (en) | 2016-12-29 |
US20160378281A1 (en) | 2016-12-29 |
US20160379395A1 (en) | 2016-12-29 |
US10331300B2 (en) | 2019-06-25 |
US20160378293A1 (en) | 2016-12-29 |
US20160378278A1 (en) | 2016-12-29 |
US10241649B2 (en) | 2019-03-26 |
US10222947B2 (en) | 2019-03-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20160378279A1 (en) | System and methods for device control | |
US9261989B2 (en) | Interacting with radial menus for touchscreens | |
US8497842B2 (en) | System having user interface using motion based object selection and mouse movement | |
AU2014358019B2 (en) | Method of displaying pointing information and device for performing the method | |
US20130275901A1 (en) | Drag and drop operation in a graphical user interface with size alteration of the dragged object | |
EP3098191B1 (en) | System and method for initiating elevator service by entering an elevator call | |
US20140075388A1 (en) | Providing radial menus with touchscreens | |
KR102078753B1 (en) | Method for controlling layout and an electronic device thereof | |
USRE49272E1 (en) | Adaptive determination of information display | |
CN108604172A (en) | Multi-screen mobile device and operation | |
US20120060129A1 (en) | Mobile terminal having touch screen and method for displaying contents therein | |
EP3343340A1 (en) | Display device, television receiver, program, and recording medium | |
JP2015127870A (en) | Controller, control method, program, and electronic apparatus | |
EP2693324B9 (en) | Method and apparatus for controlling drag for a moving object of a mobile terminal having a touch screen | |
US9619120B1 (en) | Picture-in-picture for operating systems | |
JP6480888B2 (en) | File batch processing method and digital device for executing the program | |
WO2014034369A1 (en) | Display control device, thin-client system, display control method, and recording medium | |
US11910130B2 (en) | Media control device and system | |
JP6977710B2 (en) | Information processing equipment, information processing methods, and programs | |
JP6624767B1 (en) | Information processing system and information processing method | |
KR20110100121A (en) | Method and apparatus for inputting character in mobile terminal | |
US20170351416A1 (en) | Method for controlling an operating parameter of an acoustic apparatus | |
JP2016103171A (en) | Operation receiving system, method, and program | |
US20170131824A1 (en) | Information processing apparatus, information processing method, and information processing program | |
US9529498B2 (en) | Input processing apparatus and method using a user interface |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: JAMDEO CANADA LTD., CANADA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SIRPAL, SANJIV;DOURADO, SAULO;REEL/FRAME:038860/0834
Effective date: 20160531
Owner name: HISENSE ELECTRIC CO., LTD., CHINA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SIRPAL, SANJIV;DOURADO, SAULO;REEL/FRAME:038860/0834
Effective date: 20160531
Owner name: HISENSE INTERNATIONAL CO., LTD., CHINA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SIRPAL, SANJIV;DOURADO, SAULO;REEL/FRAME:038860/0834
Effective date: 20160531
Owner name: HISENSE USA CORP., GEORGIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SIRPAL, SANJIV;DOURADO, SAULO;REEL/FRAME:038860/0834
Effective date: 20160531 |
|
AS | Assignment |
Owner name: QINGDAO HISENSE ELECTRONICS CO., LTD., CHINA
Free format text: CHANGE OF NAME;ASSIGNOR:HISENSE ELECTRIC CO., LTD.;REEL/FRAME:045546/0277
Effective date: 20170822 |
|
AS | Assignment |
Owner name: QINGDAO HISENSE ELECTRONICS CO., LTD., CHINA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JAMDEO CANADA LTD.;HISENSE USA CORP.;HISENSE INTERNATIONAL CO., LTD.;SIGNING DATES FROM 20181114 TO 20181220;REEL/FRAME:047923/0254 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |