US20160378281A1 - System and methods for navigation bar presentation and device control - Google Patents

System and methods for navigation bar presentation and device control

Info

Publication number
US20160378281A1
Authority
US
United States
Prior art keywords
navigation bar
swipe
command
display
input command
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/190,145
Inventor
Sanjiv Sirpal
Alexander de Paz
Mohammed Selim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Hisense Electronics Co Ltd
Original Assignee
Jamdeo Canada Ltd
Hisense Electric Co Ltd
Hisense International Co Ltd
Hisense USA Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jamdeo Canada Ltd, Hisense Electric Co Ltd, Hisense International Co Ltd, Hisense USA Corp
Priority to US15/190,145
Publication of US20160378281A1
Assigned to HISENSE USA CORP., Jamdeo Canada Ltd., HISENSE ELECTRIC CO., LTD., Hisense International Co., Ltd. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DE PAZ, ALEXANDER; SIRPAL, SANJIV; SELIM, MOHAMMED
Assigned to Qingdao Hisense Electronics Co., Ltd. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: HISENSE ELECTRIC CO., LTD.
Assigned to Qingdao Hisense Electronics Co., Ltd. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Hisense International Co., Ltd.; HISENSE USA CORP.; Jamdeo Canada Ltd.

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/31Indexing; Data structures therefor; Storage structures
    • G06F16/316Indexing structures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/35Clustering; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/60Editing figures and text; Combining figures or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B1/00Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/3827Portable transceivers
    • H04B1/3833Hand-held transceivers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/04Real-time or near real-time messaging, e.g. instant messaging [IM]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/14Session management
    • H04L67/142Managing session states for stateless protocols; Signalling session states; State transitions; Keeping-state mechanisms
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • the present disclosure relates to electronic devices, and more particularly to device control.
  • Mobile devices and personal communication devices are generally used for multiple purposes. These devices are often configured with particular forms of control, such as the inclusion of hard and soft buttons. With development of applications and device capabilities, there exists a need for device configurations that improve performance and resolve drawbacks of the conventional configurations. One area where improvements are needed is for device control configurations.
  • devices may require selection of a particular element to control a device.
  • the display element is difficult to select based on the size of a device, the display location of the element, a requirement for scrolling to the element, or even a requirement to select a button of the device.
  • elements may be hard to select due to configuration of content not intended for display on a device, such as non-mobile network sites designed for computer viewing.
  • One embodiment is directed to a method for device control that includes displaying, by a device, a user interface including a navigation bar, wherein the navigation bar is presented to include one or more selectable elements and wherein the navigation bar includes a plurality of swipe zones.
  • the method also includes detecting, by the device, an input command relative to a display of the device, wherein the input command is a swipe command relative to one of the plurality of swipe areas of the navigation bar.
  • the method also includes determining, by the device, a navigation bar command based on identification of a swipe area of the input command and a swipe length of the input command and updating presentation of the navigation bar based on the navigation bar command.
  • the navigation bar includes a graphical element associated with each of the plurality of swipe zones.
  • the navigation bar includes three swipe zones.
  • the input command is a touch command to the display, the touch command associated with a position of display for the navigation bar.
  • the navigation bar command expands presentation of the navigation bar from a bottom bar to an expanded view including additional graphical elements.
  • the navigation bar command launches an application associated with and preassigned to a swipe zone associated with the input command.
  • updating presentation of the navigation bar includes expanding the display area of the navigation bar on the display and increasing the number of graphical elements displayed with the navigation bar.
  • updating presentation of the navigation bar includes modifying the graphical elements displayed in the navigation bar based on the navigation bar command.
  • the method includes controlling operation of the device based on the swipe command to launch an application preassigned to a swipe zone and ending display of the navigation bar based on the navigation bar command.
  • the method includes updating presentation of the navigation bar based on the input command to include a first graphical element to identify a particular swipe zone of the navigation bar and a second graphical element to identify an application associated with the particular swipe zone.
  • Another embodiment is directed to a device including a display and a controller coupled to the display.
  • the controller is configured to control display of a user interface including a navigation bar, wherein the navigation bar is presented to include one or more selectable elements and wherein the navigation bar includes a plurality of swipe zones.
  • the controller is also configured to detect an input command relative to a display of the device, wherein the input command is a swipe command relative to one of the plurality of swipe areas of the navigation bar.
  • the controller is also configured to determine a navigation bar command based on identification of a swipe area of the input command and a swipe length of the input command, and control an updated presentation of the navigation bar based on the navigation bar command.
  • FIG. 1A depicts a graphical representation of device control according to one or more embodiments
  • FIG. 1B depicts a graphical representation of device control according to one or more other embodiments
  • FIG. 2 depicts a process for device control according to one or more embodiments
  • FIG. 3 depicts a simplified diagram of a device according to one or more embodiments
  • FIGS. 4A-4D depict graphical representations of device control according to one or more embodiments.
  • FIGS. 5A-5C depict graphical representations of device control according to one or more embodiments.
  • a process for presenting a user interface and providing a navigation bar.
  • a navigation bar relates to one or more graphical elements presented on a display of a device, wherein elements of the navigation bar may be selected for control of the device and commands may be input to the device using the navigation bar.
  • the navigation bar may be presented with a plurality of presentation formats based on input commands, navigation commands and/or one or more customized settings for the navigation bar.
  • a navigation bar and processes for interaction are directed to navigation bars including a plurality of swipe areas/zones.
  • functionality of the navigation bar is improved.
  • providing a user interface with additional controls while minimizing control steps or screen clutter improves device control.
  • input commands may allow for updating presentation of a navigation bar to one or more presentation formats.
  • the terms “a” or “an” shall mean one or more than one.
  • the term “plurality” shall mean two or more than two.
  • the term “another” is defined as a second or more.
  • the terms “including” and/or “having” are open ended (e.g., comprising).
  • the term “or” as used herein is to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” means “any of the following: A; B; C; A and B; A and C; B and C; A, B and C”. An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.
  • FIGS. 1A-1B depict graphical representations of device control according to one or more embodiments.
  • a device is configured to display a user interface including a navigation bar and allow for interaction with the navigation bar to control device operation.
  • a device and methods described herein provide a plurality of control inputs that enrich control features without cluttering a user interface or requiring menu-driven operation.
  • navigation bars and input commands (e.g., touch commands)
  • FIG. 1A depicts a graphical representation of device control for device 100 .
  • Device 100 includes display 105 , which is configured for touch operation.
  • device 100 presents navigation bar 110 .
  • navigation bar 110 is presented on display 105 as part of a user interface of device 100 .
  • Navigation bar 110 may be configured for display during a home screen/control panel view of the user interface.
  • navigation bar 110 is presented following user interaction, such as an on-screen or off-screen swipe relative to display 105.
  • navigation bar 110 is configured as a navigation drawer that allows for expansion of the display area.
  • navigation bar 110 is displayed including one or more selectable elements and includes a plurality of swipe zones.
  • navigation bar 110 includes element 120 which may be selected and toggled to allow navigation bar 110 to function as a navigation drawer.
  • navigation bar 110 includes graphical elements 115 and 125 to indicate one or more of a swipe zone and pre-assigned application accessible in navigation bar 110 .
  • navigation bar 110 includes a plurality of swipe areas. Swipe areas are zones or portions of the navigation bar that allow for interaction and added control. An exemplary representation of swipe zones is shown in FIG. 1A, wherein navigation bar 110 is divided into swipe zones 116, 121 and 126.
  • an input command may be detected based on the location of the swipe command and a swipe zone.
  • swipe zone 116 is associated with graphical element 115
  • swipe zone 121 is associated with graphical element 120
  • swipe zone 126 is associated with graphical element 125 .
  • Navigation bar 110 is configured to include swipe zones 116, 121 and 126 to allow for quick access to a pre-assigned application and added control for navigation bar 110.
  • swipe zones 116 , 121 and 126 may be customized areas to reveal content or advanced features and increased functionality of device 100 .
  • FIG. 1B depicts a graphical representation of device 100 and input commands relative to display 105 and navigation bar 110 .
  • device 100 is configured to detect input command 130 relative to display 105 .
  • input command 130 is a swipe command relative to one of the plurality of swipe zones/areas 116 , 121 and 126 of navigation bar 110 .
  • device 100 determines a navigation bar command based on identification of a swipe zone/area of the input command 130 and a swipe length 135 of the input command 130 .
  • Exemplary swipe commands relative to swipe zones/areas 116 , 121 and 126 of navigation bar 110 are shown as 131 , 132 and 133 , respectively.
  • Swipe commands 131 , 132 and 133 are associated with customized areas to reveal content or advanced features and increased functionality.
  • Swipe command 131 provides the capability of swiping up from the center (R0).
  • Swipe commands 132 and 133 provide the capability of swiping up from the sides or corners (R1, R2).
  • navigation bar 110 defaults to a customized left zone 116 and a customized right zone 126.
  • Zone 121 may be associated with a drawer function of device 100 .
  • FIG. 2 depicts a process for device control according to one or more embodiments.
  • process 200 is executed by a device (e.g., device 100) to control operation of the device with a navigation bar.
  • Process 200 includes displaying a user interface including a navigation bar at block 205 .
  • Display of the user interface can relate to display of the home or control screen of the device.
  • display of the user interface includes display of an application.
  • the navigation bar is presented to include one or more selectable elements at block 205 .
  • Display of the navigation bar may also relate to a navigation bar including a plurality of swipe areas/zones.
  • the navigation bar includes a graphical element associated with each of the plurality of swipe zones.
  • the navigation bar includes three swipe zones. It should be appreciated that the navigation bar can include configurations with two swipe areas and configurations with greater than three swipe areas.
  • process 200 includes detecting an input command relative to a display of the device.
  • Input commands detected at block 210 can be one or more of a swipe, tap, contact, drag, etc.
  • input commands detected at block 210 may be input relative to the display position of the navigation bar.
  • the input command is a swipe command relative to one of the plurality of swipe areas of the navigation bar.
  • the input command at block 210 may be a touch command to the display, such that the touch command is input to a display area associated with a position of display for the navigation bar.
  • the device determines a navigation bar command based on identification of a swipe area of the input command and a swipe length of the input command.
  • navigation bar commands relate to presentation of the navigation bar, items presented in association with the navigation bar, and launching one or more items from the navigation bar.
  • the navigation bar command at block 215 expands presentation of the navigation bar from a bottom bar to an expanded view including additional graphical elements.
  • the navigation bar command launches an application associated with and preassigned to a swipe zone associated with the input command.
  • the device updates presentation of the navigation bar based on the navigation bar command. Updating presentation of the navigation bar at block 220 can include expanding the display area of the navigation bar on the display and increasing the number of graphical elements displayed with the navigation bar. According to another embodiment, updating presentation of the navigation bar at block 220 can include modifying the graphical elements displayed in the navigation bar based on the navigation bar command.
  • the navigation bar is configured to allow for quick access to several functions and provide access to one or more elements or features of a user interface.
  • navigation bar commands can relate to launching applications from the navigation bar, updating presentation of the navigation bar and/or modifying selection options within the navigation bar.
  • Navigation bar commands can also allow for enlarging the display area of the navigation bar and/or closing the navigation bar from display. Updating presentation of the navigation bar at block 220 can be based on the input command to include a first graphical element to identify a particular swipe zone of the navigation bar and a second graphical element to identify an application associated with the particular swipe zone.
  • Process 200 may optionally include launching an application and/or device functionality at block 225 based on the navigation bar command.
  • block 225 includes controlling operation of the device based on the swipe command to launch an application preassigned to a swipe zone and ending display of the navigation bar based on the navigation bar command.
  • FIG. 3 depicts a simplified diagram of a device according to one or more embodiments.
  • Device 300 may relate to one or more of a media player, personal communication device, tablet, and electronic device having a touch screen in general.
  • device 300 is a standalone device.
  • device 300 is a computing device (e.g., computer, media player, etc.) configured to interoperate with another device.
  • device 300 includes controller 305 , memory 310 , optional communications unit 315 and user interface 320 .
  • Controller 305 may be configured to execute code stored in memory 310 for operation of device 300 including providing a navigation bar for device control.
  • controller 305 is configured to control display of a user interface including a navigation bar, wherein the navigation bar is presented to include one or more selectable elements and wherein the navigation bar includes a plurality of swipe zones.
  • controller 305 detects input commands relative to a display of user interface 320 .
  • Input commands detected by controller 305 can include swipe commands relative to one of the plurality of swipe areas of the navigation bar. Controller 305 can then determine a navigation bar command based on identification of a swipe area of the input command and a swipe length of the input command, and control an updated presentation of the navigation bar based on the navigation bar command.
  • controller 305 includes a processor and/or one or more processing elements.
  • controller 305 includes one or more of hardware, software, firmware and/or processing components in general.
  • controller 305 is configured to perform one or more processes described herein.
  • Optional communications unit 315 is configured for wired and/or wireless communication with one or more network elements, such as servers.
  • Memory 310 can include non-transitory RAM and/or ROM memory for storing executable instructions, operating instructions and content for display.
  • User interface 320 can include one or more input/output interfaces for control and/or communication.
  • device 300 relates to a device including a display as part of user interface 320 .
  • FIGS. 4A-4D depict graphical representations of device control according to one or more embodiments.
  • device 400 is depicted including display 405 and navigation bar 410 .
  • an input command 415 relative to display 405 can be characterized by device 400 to update presentation of navigation bar 410 .
  • input command 415 may be based on a touch command/swipe by a user 420 .
  • Input command 415 is shown as a vertical swipe associated with the middle of the navigation bar; however, it should be appreciated that the position and length of input command 415 may be characterized to determine how navigation bar 410 is updated.
  • Navigation bar 410 is presented in a configuration with a straight edge to reveal a first set of graphical elements.
  • FIGS. 4B-4D depict exemplary representations of updating navigation bar 410 displayed by display 405 of device 400.
  • the navigation bar 410 is updated as presentation format 425 in response to initiation of a swipe and to provide a hint of the drawer type.
  • Presentation format 425 extends the navigation bar in the direction of the input command.
  • presentation format 425 may relate to an input command that includes a short swipe or contact with, and/or in the vicinity of, navigation bar 410 .
  • the navigation bar 410 is updated as presentation format 430 in response to an input command.
  • Presentation format 430 extends navigation bar 410 into a slide out format to reveal an additional system drawer of graphical elements 431 associated with applications or selection elements.
  • the navigation bar 410 is updated as presentation format 440 in response to an input command.
  • Presentation format 440 extends navigation bar 410 into a fully extended slide out format to reveal a scrollable list of graphical elements 441 1-n, which may be associated with applications of device 400.
  • Presentation format 440 also includes a system drawer of elements 445 and an additional system drawer of graphical elements 431 associated with applications or selection elements. Updating presentation of the navigation bar 410 to presentation format 440 can be based on the input command to include a first graphical element to identify a particular swipe zone of the navigation bar and a second graphical element to identify an application associated with the particular swipe zone.
  • presentation format 430 extends navigation bar 410 to reveal applications and/or advanced features that increase functionality and extend productivity of device 400 .
  • one or more of the graphical elements 431 , graphical elements 441 1-n and elements 445 can contain a set of personalized/recent applications. Similarly, frequently used applications or recommended applications may be included in navigation bar 410 and the presentation formats 425 , 430 and 440 of FIGS. 4B-4D .
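  • For illustration only (this sketch is not taken from the patent): the swipe length could be what selects among the presentation formats of FIGS. 4B-4D, as in the plain-Kotlin sketch below. The pixel thresholds and names are assumed values, since the patent gives none.
```kotlin
// Presentation formats loosely following FIGS. 4B-4D; the length thresholds are assumptions.
enum class Format { HINT_425, SLIDE_OUT_430, FULLY_EXTENDED_440 }

fun formatFor(swipeLengthPx: Float): Format = when {
    swipeLengthPx < 100f -> Format.HINT_425           // short swipe or brief contact: hint only
    swipeLengthPx < 400f -> Format.SLIDE_OUT_430      // reveals an additional system drawer
    else                 -> Format.FULLY_EXTENDED_440 // scrollable list of application elements
}

fun main() {
    listOf(40f, 250f, 600f).forEach { println("$it px -> ${formatFor(it)}") }
    // 40.0 px -> HINT_425, 250.0 px -> SLIDE_OUT_430, 600.0 px -> FULLY_EXTENDED_440
}
```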
  • FIGS. 5A-5C depict graphical representations of device control according to one or more embodiments.
  • a navigation bar is configured to provide an indication (e.g., hint, preview, etc.) of features that may be accessed from the navigation bar. These indications may also serve as indications of the swipe areas/zones.
  • device 500 is depicted including display 505 and navigation bar 510 .
  • an input command relative to display 505 includes contact 511 (e.g., touch, tap, press and hold, etc.) to display 505 in the area of navigation bar 510 .
  • device 500 is configured to update the presentation of navigation bar 510 in response to contact 511 .
  • FIG. 5A depicts contact 511 on the left side of navigation bar 510, which may be assigned to a first swipe area. Accordingly, device 500 updates navigation bar 510 to include raised feature 515. Raised feature 515 may indicate the ability to quickly launch an application, or function, of device 500. According to another embodiment, the name of the application, or device function, may be displayed in text 516 by device 500.
  • FIG. 5B depicts navigation bar 510 including raised feature 520 associated with a middle section of navigation bar 510. Raised feature 520 may be in response to contact 512 associated with the middle of navigation bar 510. The name of the application, or device function, associated with the swipe zone may be displayed in text 521 by device 500.
  • FIG. 5C depicts navigation bar 510 including raised feature 525 associated with a right section of navigation bar 510 .
  • Raised feature 525 may be in response to contact 512 associated with the right side of navigation bar 510 .
  • the name of the application, or device function, associated with the swipe zone may be displayed in text 526 by device 500.
  • device 500 may present raised features 515 , 520 and 525 and/or description text 516 , 521 and 526 for a predetermined period of time (e.g., 2-5 seconds) following contact.
  • the device may present the updated presentation of the navigation bar 510 until a command is detected to remove the navigation bar from view or another navigation bar command is detected. For example, device 500 may transition raised features of navigation bar 510 away on release of a touch command.
  • navigation bar 510 allows for launching a drawer or an application with a single input command.
  • navigation bar 510 is not limited to only launching a drawer, and instead allows for multiple types of applications and functions to be accessed.
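  • For illustration only (not taken from the patent): the raised-feature hints of FIGS. 5A-5C might behave roughly like the plain-Kotlin sketch below, where a hint appears on contact with a zone and is dismissed on release or after a timeout. The 3-second timeout sits inside the 2-5 second range described above; all other names and values are invented.
```kotlin
// Assumed model of the raised-feature hints of FIGS. 5A-5C.
data class Hint(val zoneId: String, val label: String, val shownAtMs: Long)

const val HINT_TIMEOUT_MS = 3_000L   // within the 2-5 second range described above

// Contact on a swipe zone raises a hint labeled with the pre-assigned application or function.
fun onContact(zoneId: String, label: String, nowMs: Long): Hint =
    Hint(zoneId, label, shownAtMs = nowMs)

// The hint is dismissed when the touch is released or the timeout elapses.
fun shouldDismiss(hint: Hint, nowMs: Long, touchReleased: Boolean): Boolean =
    touchReleased || nowMs - hint.shownAtMs >= HINT_TIMEOUT_MS

fun main() {
    val hint = onContact(zoneId = "left", label = "Preassigned app", nowMs = 0L)
    println(shouldDismiss(hint, nowMs = 1_000L, touchReleased = false))  // false: still showing
    println(shouldDismiss(hint, nowMs = 1_000L, touchReleased = true))   // true: released
    println(shouldDismiss(hint, nowMs = 4_000L, touchReleased = false))  // true: timed out
}
```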

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to device control. In one embodiment, a method for device control includes displaying a user interface including a navigation bar. The navigation bar may be presented to include one or more selectable elements and a plurality of swipe zones. The method also includes detecting an input command relative to a display of the device, wherein the input command is a swipe command relative to one of the plurality of swipe areas of the navigation bar, and determining a navigation bar command based on identification of a swipe area of the input command and a swipe length of the input command. Presentation of the navigation bar may be updated based on the navigation bar command. Another embodiment is directed to a device configured to control operation based on navigation bar commands.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to U.S. Provisional Application No. 62/183,613 titled SYSTEM AND METHODS FOR A USER INTERFACE AND DEVICE OPERATION filed on Jun. 23, 2015, and U.S. Provisional Application No. 62/184,476 titled SYSTEM AND METHODS FOR A USER INTERFACE AND DEVICE OPERATION filed on Jun. 25, 2015, the contents of which are expressly incorporated by reference in their entirety.
  • FIELD
  • The present disclosure relates to electronic devices, and more particularly to device control.
  • BACKGROUND
  • Mobile devices and personal communication devices are generally used for multiple purposes. These devices are often configured with particular forms of control, such as the inclusion of hard and soft buttons. With development of applications and device capabilities, there exists a need for device configurations that improve performance and resolve drawbacks of the conventional configurations. One area where improvements are needed is for device control configurations.
  • Regarding conventional methods, devices may require selection of a particular element to control a device. In some instances, the display element is difficult to select based on the size of a device, the display location of the element, a requirement for scrolling to the element, or even a requirement to select a button of the device. In other instances, elements may be hard to select due to configuration of content not intended for display on a device, such as non-mobile network sites designed for computer viewing.
  • The development of devices has led to increasing capability and features of a device. However, conventional methods of device control are limited. There is a desire to provide additional control functionality for devices.
  • BRIEF SUMMARY OF THE EMBODIMENTS
  • Disclosed and claimed herein are methods for device control and device configurations. One embodiment is directed to a method for device control that includes displaying, by a device, a user interface including a navigation bar, wherein the navigation bar is presented to include one or more selectable elements and wherein the navigation bar includes a plurality of swipe zones. The method also includes detecting, by the device, an input command relative to a display of the device, wherein the input command is a swipe command relative to one of the plurality of swipe areas of the navigation bar. The method also includes determining, by the device, a navigation bar command based on identification of a swipe area of the input command and a swipe length of the input command and updating presentation of the navigation bar based on the navigation bar command.
  • In one embodiment, the navigation bar includes a graphical element associated with each of the plurality of swipe zones.
  • In one embodiment, the navigation bar includes three swipe zones.
  • In one embodiment, the input command is a touch command to the display, the touch command associated with a position of display for the navigation bar.
  • In one embodiment, the navigation bar command expands presentation of the navigation bar from a bottom bar to an expanded view including additional graphical elements.
  • In one embodiment, the navigation bar command launches an application associated with and preassigned to a swipe zone associated with the input command.
  • In one embodiment, updating presentation of the navigation bar includes expanding the display area of the navigation bar on the display and increasing the number of graphical elements displayed with the navigation bar.
  • In one embodiment, updating presentation of the navigation bar includes modifying the graphical elements displayed in the navigation bar based on the navigation bar command.
  • In one embodiment, the method includes controlling operation of the device based on the swipe command to launch an application preassigned to a swipe zone and ending display of the navigation bar based on the navigation bar command.
  • In one embodiment, the method includes updating presentation of the navigation bar based on the input command to include a first graphical element to identify a particular swipe zone of the navigation bar and a second graphical element to identify an application associated with the particular swipe zone.
  • Another embodiment is directed to a device including a display and a controller coupled to the display. The controller is configured to control display of a user interface including a navigation bar, wherein the navigation bar is presented to include one or more selectable elements and wherein the navigation bar includes a plurality of swipe zones. The controller is also configured to detect an input command relative to a display of the device, wherein the input command is a swipe command relative to one of the plurality of swipe areas of the navigation bar. The controller is also configured to determine a navigation bar command based on identification of a swipe area of the input command and a swipe length of the input command, and control an updated presentation of the navigation bar based on the navigation bar command.
  • Other aspects, features, and techniques will be apparent to one skilled in the relevant art in view of the following detailed description of the embodiments.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The features, objects, and advantages of the present disclosure will become more apparent from the detailed description set forth below when taken in conjunction with the drawings in which like reference characters identify correspondingly throughout and wherein:
  • FIG. 1A depicts a graphical representation of device control according to one or more embodiments;
  • FIG. 1B depicts a graphical representation of device control according to one or more other embodiments;
  • FIG. 2 depicts a process for device control according to one or more embodiments;
  • FIG. 3 depicts a simplified diagram of a device according to one or more embodiments;
  • FIGS. 4A-4D depict graphical representations of device control according to one or more embodiments; and
  • FIGS. 5A-5C depict graphical representations of device control according to one or more embodiments.
  • DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS Overview and Terminology
  • One aspect of the disclosure is directed to improving control of a device. In one embodiment, a process is provided for presenting a user interface and providing a navigation bar. As used herein, a navigation bar relates to one or more graphical elements presented on a display of a device, wherein elements of the navigation bar may be selected for control of the device and commands may be input to the device using the navigation bar. In addition, the navigation bar may be presented with a plurality of presentation formats based on input commands, navigation commands and/or one or more customized settings for the navigation bar. By configuring a navigation bar with features as described herein, device control is improved. In addition, navigation bar control allows for direct access of one or more applications and functions provided by a device.
  • According to another embodiment, a navigation bar and processes for interaction are directed to navigation bars including a plurality of swipe areas/zones. By providing multiple swipe areas, wherein each swipe area is associated with a particular or different function, functionality of the navigation bar is improved. In addition, providing a user interface with additional controls while minimizing control steps or screen clutter improves device control.
  • Methods and device configurations are provided for device control based on touch commands. In one embodiment, input commands may allow for updating presentation of a navigation bar to one or more presentation formats.
  • As used herein, the terms “a” or “an” shall mean one or more than one. The term “plurality” shall mean two or more than two. The term “another” is defined as a second or more. The terms “including” and/or “having” are open ended (e.g., comprising). The term “or” as used herein is to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” means “any of the following: A; B; C; A and B; A and C; B and C; A, B and C”. An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.
  • Reference throughout this document to “one embodiment,” “certain embodiments,” “an embodiment,” or similar term means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner on one or more embodiments without limitation.
  • Exemplary Embodiments
  • Referring now to the figures, FIGS. 1A-1B depict graphical representations of device control according to one or more embodiments. According to one embodiment, a device is configured to display a user interface including a navigation bar and allow for interaction with the navigation bar to control device operation. A device and methods described herein provide a plurality of control inputs that enrich control features without cluttering a user interface or requiring menu-driven operation. As will be discussed herein, navigation bars and input commands (e.g., touch commands) are described with respect to a touch screen/touch-enabled display of a device.
  • FIG. 1A depicts a graphical representation of device control for device 100. Device 100 includes display 105, which is configured for touch operation. According to one embodiment, device 100 presents navigation bar 110. In one embodiment, navigation bar 110 is presented on display 105 as part of a user interface of device 100. Navigation bar 110 may be configured for display during a home screen/control panel view of the user interface. In certain embodiments, navigation bar 110 is presented following user interaction, such as an on-screen or off-screen swipe relative to display 105. According to another embodiment, navigation bar 110 is configured as a navigation drawer that allows for expansion of the display area.
  • According to one embodiment, navigation bar 110 is displayed including one or more selectable elements and includes a plurality of swipe zones. According to one embodiment, navigation bar 110 includes element 120 which may be selected and toggled to allow navigation bar 110 to function as a navigation drawer. In certain embodiments, navigation bar 110 includes graphical elements 115 and 125 to indicate one or more of a swipe zone and pre-assigned application accessible in navigation bar 110.
  • According to one embodiment, navigation bar 110 includes a plurality of swipe areas. Swipe areas are zones or portions of the navigation bar that allow for interaction and added control. An exemplary representation of swipe zones is shown in FIG. 1A, wherein navigation bar 110 is divided into swipe zones 116, 121 and 126. According to one embodiment, an input command may be detected based on the location of the swipe command and a swipe zone. In one embodiment, swipe zone 116 is associated with graphical element 115, swipe zone 121 is associated with graphical element 120, and swipe zone 126 is associated with graphical element 125. Navigation bar 110 is configured to include swipe zones 116, 121 and 126 to allow for quick access to a pre-assigned application and added control for navigation bar 110.
  • According to one embodiment, swipe zones 116, 121 and 126 may be customized areas to reveal content or advanced features and increased functionality of device 100.
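  • For illustration only (not part of the patent text): the zone layout of navigation bar 110 could be modeled roughly as in the plain-Kotlin sketch below, which splits the bar's width into three equal zones and resolves a touch x-coordinate to a zone. The equal-width split, the type names and the pre-assigned application strings are assumptions.
```kotlin
// Hypothetical data model for a navigation bar divided into swipe zones.
data class SwipeZone(
    val id: String,            // e.g. "116", "121", "126"
    val startX: Float,         // left edge of the zone, in pixels
    val endX: Float,           // right edge of the zone, in pixels
    val assignedApp: String?   // pre-assigned application, if any
)

data class NavigationBar(val widthPx: Float, val zones: List<SwipeZone>)

// Build a three-zone bar: customized left and right zones plus a center drawer zone,
// assuming an equal split of the bar's width (the patent does not specify zone sizes).
fun threeZoneBar(widthPx: Float): NavigationBar {
    val third = widthPx / 3f
    return NavigationBar(
        widthPx,
        listOf(
            SwipeZone("116", 0f, third, assignedApp = "left-app"),
            SwipeZone("121", third, 2 * third, assignedApp = null),        // drawer zone
            SwipeZone("126", 2 * third, widthPx, assignedApp = "right-app")
        )
    )
}

// Resolve a touch x-coordinate to the swipe zone it falls in.
fun zoneAt(bar: NavigationBar, x: Float): SwipeZone? =
    bar.zones.firstOrNull { x >= it.startX && x < it.endX }

fun main() {
    val bar = threeZoneBar(1080f)
    println(zoneAt(bar, 100f)?.id)   // 116 (left zone)
    println(zoneAt(bar, 540f)?.id)   // 121 (center drawer zone)
}
```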
  • FIG. 1B depicts a graphical representation of device 100 and input commands relative to display 105 and navigation bar 110. According to one embodiment, device 100 is configured to detect input command 130 relative to display 105. According to one embodiment, input command 130 is a swipe command relative to one of the plurality of swipe zones/areas 116, 121 and 126 of navigation bar 110. According to another embodiment, device 100 determines a navigation bar command based on identification of a swipe zone/area of the input command 130 and a swipe length 135 of the input command 130. Exemplary swipe commands relative to swipe zones/areas 116, 121 and 126 of navigation bar 110 are shown as 131, 132 and 133, respectively. Device 100 then updates presentation of navigation bar 110 based on the navigation bar command determined from input command 130. Swipe commands 131, 132 and 133 are associated with customized areas to reveal content or advanced features and increased functionality. Swipe command 131 provides the capability of swiping up from the center (R0). Swipe commands 132 and 133 provide the capability of swiping up from the sides or corners (R1, R2). According to one embodiment, navigation bar 110 defaults to a customized left zone 116 and a customized right zone 126. Zone 121 may be associated with a drawer function of device 100.
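  • For illustration only (not part of the patent text): one way the swipe zone and swipe length 135 might combine into a navigation bar command is sketched below in plain Kotlin. The 80-pixel threshold, the command set, and the rule that the center zone opens a drawer while the side zones launch a pre-assigned application are assumptions; the patent only requires that both the zone and the length be considered.
```kotlin
// Illustrative navigation bar commands; the patent leaves the concrete set open.
sealed class NavBarCommand {
    data object ShowHint : NavBarCommand()                    // short swipe: preview/hint only
    data object ExpandDrawer : NavBarCommand()                // center zone (R0): open the drawer
    data class LaunchApp(val app: String) : NavBarCommand()   // side zones (R1, R2): quick launch
}

// Map the swipe zone and swipe length to a navigation bar command (threshold assumed).
fun determineCommand(zoneId: String, assignedApp: String?, swipeLengthPx: Float): NavBarCommand =
    when {
        swipeLengthPx < 80f -> NavBarCommand.ShowHint
        zoneId == "121" -> NavBarCommand.ExpandDrawer
        assignedApp != null -> NavBarCommand.LaunchApp(assignedApp)
        else -> NavBarCommand.ExpandDrawer
    }

fun main() {
    println(determineCommand("116", "left-app", swipeLengthPx = 40f))    // ShowHint
    println(determineCommand("121", null, swipeLengthPx = 200f))         // ExpandDrawer
    println(determineCommand("126", "right-app", swipeLengthPx = 200f))  // LaunchApp(app=right-app)
}
```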
  • FIG. 2 depicts a process for device control according to one or more embodiments. According to one embodiment, process 200 is executed by a device (e.g., device 100) to control operation of the device with a navigation bar. Process 200 includes displaying a user interface including a navigation bar at block 205. Display of the user interface can relate to display of the home or control screen of the device. In certain embodiments, display of the user interface includes display of an application. The navigation bar is presented to include one or more selectable elements at block 205. Display of the navigation bar may also relate to a navigation bar including a plurality of swipe areas/zones. According to another embodiment, the navigation bar includes a graphical element associated with each of the plurality of swipe zones. In one exemplary embodiment, the navigation bar includes three swipe zones. It should be appreciated that the navigation bar can include configurations with two swipe areas and configurations with greater than three swipe areas.
  • At block 210, process 200 includes detecting an input command relative to a display of the device. Input commands detected at block 210 can be one or more of a swipe, tap, contact, drag, etc. According to one embodiment, input commands detected at block 210 may be input relative to the display position of the navigation bar. In one embodiment, the input command is a swipe command relative to one of the plurality of swipe areas of the navigation bar. The input command at block 210 may be a touch command to the display, such that the touch command is input to a display area associated with a position of display for the navigation bar.
  • At block 215, the device determines a navigation bar command based on identification of a swipe area of the input command and a swipe length of the input command. According to one embodiment, navigation bar commands relate to presentation of the navigation bar, items presented in association with the navigation bar, and launching one or more items from the navigation bar. In one embodiment, the navigation bar command at block 215 expands presentation of the navigation bar from a bottom bar to an expanded view including additional graphical elements. According to another embodiment, the navigation bar command launches an application associated with and preassigned to a swipe zone associated with the input command.
  • At block 220 the device updates presentation of the navigation bar based on the navigation bar command. Updating presentation of the navigation bar at block 220 can include expanding the display area of the navigation bar on the display and increasing the number of graphical elements displayed with the navigation bar. According to another embodiment, updating presentation of the navigation bar at block 220 can include modifying the graphical elements displayed in the navigation bar based on the navigation bar command.
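As a rough, assumption-based sketch of block 220 (field names and sizes are invented for illustration), updating the presentation can be viewed as producing a new presentation record with a larger display area and additional graphical elements:

```kotlin
// Hypothetical sketch: expand the navigation bar's display area and add graphical elements.
data class NavBarPresentation(val heightPx: Int, val elements: List<String>)

fun updatePresentation(current: NavBarPresentation, expand: Boolean): NavBarPresentation =
    if (expand) {
        current.copy(
            heightPx = current.heightPx * 4,                        // enlarge the display area
            elements = current.elements + listOf("Recent", "Apps")  // reveal additional elements
        )
    } else {
        current  // commands that do not change the presentation leave it untouched
    }

fun main() {
    val collapsed = NavBarPresentation(heightPx = 96, elements = listOf("Back", "Home", "Tasks"))
    println(updatePresentation(collapsed, expand = true))
}
```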
  • In one embodiment, the navigation bar is configured to allow for quick access to several functions and provide access to one or more elements or features of a user interface. Thus, navigation bar commands can relate to launching applications from the navigation bar, updating presentation of the navigation bar and/or modifying selection options within the navigation bar. Navigation bar commands can also allow for enlarging the display area of the navigation bar and/or closing the navigation bar from display. Updating presentation of the navigation bar at block 220 can be based on the input command to include a first graphical element to identify a particular swipe zone of the navigation bar and a second graphical element to identify an application associated with the particular swipe zone.
  • Process 200 may optionally include launching an application and/or device functionality at block 225 based on the navigation bar command. In one embodiment, block 225 includes controlling operation of the device based on the swipe command to launch an application preassigned to a swipe zone and ending display of the navigation bar based on the navigation bar command.
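A hedged sketch of block 225 follows: looking up the application preassigned to a swipe zone, launching it, and then ending display of the navigation bar. The registry, callback types, and package names are assumptions made for the example.

```kotlin
// Hypothetical sketch: launch the app preassigned to a swipe zone, then dismiss the bar.
class NavBarLauncher(
    private val zoneAppRegistry: Map<String, String>,   // swipe zone -> preassigned app id
    private val launchApp: (String) -> Unit,
    private val hideNavigationBar: () -> Unit
) {
    fun onLaunchCommand(zone: String) {
        zoneAppRegistry[zone]?.let { appId ->
            launchApp(appId)       // control device operation based on the swipe command
            hideNavigationBar()    // end display of the navigation bar
        }
    }
}

fun main() {
    val launcher = NavBarLauncher(
        zoneAppRegistry = mapOf("left" to "com.example.camera", "right" to "com.example.mail"),
        launchApp = { println("Launching $it") },
        hideNavigationBar = { println("Navigation bar dismissed") }
    )
    launcher.onLaunchCommand("left")
}
```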
  • FIG. 3 depicts a simplified diagram of a device according to one or more embodiments. Device 300 may relate to one or more of a media player, personal communication device, tablet, and electronic device having a touch screen in general. In certain embodiments, device 300 is a standalone device. In other embodiments, device 300 is a computing device (e.g., computer, media player, etc.) configured to interoperate with another device.
  • As shown in FIG. 3, device 300 includes controller 305, memory 310, optional communications unit 315 and user interface 320. Controller 305 may be configured to execute code stored in memory 310 for operation of device 300 including providing a navigation bar for device control. In an exemplary embodiment, controller 305 is configured to control display of a user interface including a navigation bar, wherein the navigation bar is presented to include one or more selectable elements and wherein the navigation bar includes a plurality of swipe zones.
  • According to one embodiment, controller 305 detects input commands relative to a display of user interface 320. Input commands detected by controller 305 can include swipe commands relative to one of the plurality of swipe areas of the navigation bar. Controller 305 can then determine a navigation bar command based on identification of a swipe area of the input command and a swipe length of the input command, and control an updated presentation of the navigation bar based on the navigation bar command.
  • According to one embodiment, controller 305 includes a processor and/or one or more processing elements. In one embodiment, controller 305 includes one or more of hardware, software, firmware and/or processing components in general. According to one embodiment, controller 305 is configured to perform one or more processes described herein. Optional communications unit 315 is configured for wired and/or wireless communication with one or more network elements, such as servers. Memory 310 can include non-transitory RAM and/or ROM memory for storing executable instructions, operating instructions and content for display. User interface 320 can include one or more input/output interfaces for control and/or communication. In certain embodiments, device 300 relates to a device including a display as part of user interface 320.
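Purely as an illustrative structural sketch, the arrangement of controller 305, memory 310, optional communications unit 315, and user interface 320 might be wired together as shown below; the interface names and stored key are assumptions, not the disclosed implementation.

```kotlin
// Hypothetical sketch: a controller coupled to memory, an optional communications unit,
// and a user interface that includes a display. All names are assumptions.
interface CommunicationsUnit { fun send(payload: ByteArray) }
interface UserInterface { fun render(frame: String) }

class Controller(
    private val memory: MutableMap<String, String>,   // stands in for memory 310
    private val comms: CommunicationsUnit?,           // optional communications unit 315
    private val ui: UserInterface                     // user interface 320
) {
    fun showNavigationBar() {
        val layout = memory["navbar.layout"] ?: "three-zone bottom bar"
        ui.render("Navigation bar: $layout")
    }
}

fun main() {
    val device = Controller(
        memory = mutableMapOf("navbar.layout" to "three-zone bottom bar"),
        comms = null,
        ui = object : UserInterface { override fun render(frame: String) = println(frame) }
    )
    device.showNavigationBar()
}
```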
  • FIGS. 4A-4D depict graphical representations of device control according to one or more embodiments. In FIG. 4A, device 400 is depicted including display 405 and navigation bar 410. According to one embodiment, an input command 415 relative to display 405 can be characterized by device 400 to update presentation of navigation bar 410. According to one embodiment, input command 415 may be based on a touch command/swipe by a user 420. Input command 415 is shown as a vertical swipe associated with the middle of navigation bar 410; however, it should be appreciated that the position and length of input command 415 may be characterized to determine how navigation bar 410 is updated. Navigation bar 410 is presented in a configuration with a straight edge to reveal a first set of graphical elements. FIGS. 4B-4D depict exemplary representations of updating navigation bar 410 displayed by display 405 of device 400.
  • In FIG. 4B, the navigation bar 410 is updated as presentation format 425 in response to initiation of a swipe and to provide a hint of the drawer type. Presentation format 425 extends navigation bar 410 in the direction of the input command. According to one embodiment, presentation format 425 may relate to an input command that includes a short swipe or contact with, and/or in the vicinity of, navigation bar 410.
  • In FIG. 4C, the navigation bar 410 is updated as presentation format 430 in response to an input command. Presentation format 430 extends navigation bar 410 into a slide-out format to reveal an additional system drawer of graphical elements 431 associated with applications or selection elements.
  • In FIG. 4D, the navigation bar 410 is updated as presentation format 440 in response to an input command. Presentation format 440 extends navigation bar 410 into a fully extended slide-out format to reveal a scrollable list of graphical elements 441 1-n, which may be associated with applications of device 400. Presentation format 440 also includes a system drawer of elements 445 and an additional system drawer of graphical elements 431 associated with applications or selection elements. Updating presentation of the navigation bar 410 to presentation format 440 can be based on the input command to include a first graphical element to identify a particular swipe zone of the navigation bar and a second graphical element to identify an application associated with the particular swipe zone. According to one embodiment, presentation format 440 extends navigation bar 410 to reveal applications and/or advanced features that increase functionality and extend productivity of device 400.
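As a hedged illustration of the progression among FIGS. 4A-4D, the presentation formats can be viewed as a small set of states selected by swipe length; the state names and pixel thresholds below are invented for the sketch and are not drawn from the disclosure.

```kotlin
// Hypothetical sketch: choose a presentation format from the length of the input swipe.
enum class PresentationFormat { COLLAPSED, HINT, SLIDE_OUT_DRAWER, FULLY_EXTENDED }

fun formatForSwipe(swipeLengthPx: Float): PresentationFormat = when {
    swipeLengthPx < 40f  -> PresentationFormat.HINT              // short contact: hint of drawer type (425)
    swipeLengthPx < 300f -> PresentationFormat.SLIDE_OUT_DRAWER  // medium swipe: system drawer (430)
    else                 -> PresentationFormat.FULLY_EXTENDED    // long swipe: scrollable app list (440)
}

fun main() {
    println(formatForSwipe(20f))   // HINT
    println(formatForSwipe(150f))  // SLIDE_OUT_DRAWER
    println(formatForSwipe(500f))  // FULLY_EXTENDED
}
```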
  • In FIGS. 4C-4D, one or more of the graphical elements 431, graphical elements 441 1-n and elements 445 can contain a set of personalized/recent applications. Similarly, frequently used applications or recommended applications may be included in navigation bar 410 and the presentation formats 425, 430 and 440 of FIGS. 4B-4D.
  • FIGS. 5A-5C depict graphical representations of device control according to one or more embodiments. In one embodiment, a navigation bar is configured to provide an indication (e.g., hint, preview, etc.) of features that may be accessed from the navigation bar. These indications may also serve as indications of the swipe areas/zones. In FIG. 5A, device 500 is depicted including display 505 and navigation bar 510. According to one embodiment, an input command relative to display 505 includes contact 511 (e.g., touch, tap, press and hold, etc.) to display 505 in the area of navigation bar 510. According to one embodiment, device 500 is configured to update the presentation of navigation bar 510 in response to contact 511. By way of example, FIG. 5A depicts contact 511 on the left side of navigation bar 510, which may be assigned to a first swipe area. Accordingly, device 500 updates navigation bar 510 to include raised feature 515. Raised feature 515 may indicate the ability to quickly launch an application or function of device 500. According to another embodiment, the name of the application, or device function, may be displayed in text 516 by device 500. FIG. 5B depicts navigation bar 510 including raised feature 520 associated with a middle section of navigation bar 510. Raised feature 520 may be in response to contact 512 associated with the middle of navigation bar 510. The name of the application, or device function, associated with the swipe zone may be displayed in text 521 by device 500. FIG. 5C depicts navigation bar 510 including raised feature 525 associated with a right section of navigation bar 510. Raised feature 525 may be in response to contact 512 associated with the right side of navigation bar 510. The name of the application, or device function, associated with the swipe zone may be displayed in text 526 by device 500. According to one embodiment, device 500 may present raised features 515, 520 and 525 and/or description text 516, 521 and 526 for a predetermined period of time (e.g., 2-5 seconds) following contact. In other embodiments, the device may present the updated presentation of navigation bar 510 until a command is detected to remove the navigation bar from view or another navigation bar command is detected. For example, device 500 may transition raised features of navigation bar 510 away on release of a touch command. In contrast to conventional touch commands for interaction with a displayed element, navigation bar 510 allows for launching a drawer or an application with a single input command. In addition, navigation bar 510 is not limited to only launching a drawer, and instead allows multiple types of applications and functions to be accessed.
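A minimal sketch of the hinting behavior of FIGS. 5A-5C follows, under the assumption of a simple synchronous model: a contact within a zone reveals a raised feature with the assigned application's name, which is dismissed after a predetermined period. The zone keys, labels, and blocking delay are illustrative assumptions; a real implementation would dismiss the hint asynchronously.

```kotlin
// Hypothetical sketch: show a raised feature with descriptive text for the contacted zone,
// then dismiss it after a predetermined display period (2-5 seconds in one embodiment).
class NavBarHints(private val zoneLabels: Map<String, String>) {
    fun onContact(zone: String, displayMillis: Long = 3_000) {
        val label = zoneLabels[zone] ?: return                     // no application assigned to zone
        println("Raised feature shown for '$zone' zone: $label")   // e.g., text 516/521/526
        Thread.sleep(displayMillis)                                // stand-in for a display timer
        println("Raised feature for '$zone' dismissed")
    }
}

fun main() {
    val hints = NavBarHints(mapOf("left" to "Camera", "center" to "App drawer", "right" to "Mail"))
    hints.onContact("left", displayMillis = 500)  // shortened period so the example finishes quickly
}
```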
  • While this disclosure has been particularly shown and described with references to exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the claimed embodiments.

Claims (20)

What is claimed is:
1. A method for device control, the method comprising:
displaying, by a device, a user interface including a navigation bar, wherein the navigation bar is presented to include one or more selectable elements and wherein the navigation bar includes a plurality of swipe zones;
detecting, by the device, an input command relative to a display of the device, wherein the input command is a swipe command relative to one of the plurality of swipe areas of the navigation bar;
determining, by the device, a navigation bar command based on identification of a swipe area of the input command and a swipe length of the input command; and
updating presentation of the navigation bar based on the navigation bar command.
2. The method of claim 1, wherein the navigation bar includes a graphical element associated with each of the plurality of swipe zones.
3. The method of claim 1, wherein the navigation bar includes three swipe zones.
4. The method of claim 1, wherein the input command is a touch command to the display, the touch command associated with a position of display for the navigation bar.
5. The method of claim 1, wherein the navigation bar command expands presentation of the navigation bar from a bottom bar to an expanded view including additional graphical elements.
6. The method of claim 1, wherein the navigation bar command launches an application associated with and preassigned to a swipe zone associated with the input command.
7. The method of claim 1, wherein updating presentation of the navigation bar includes expanding the display area of the navigation bar on the display and increasing the number of graphical elements displayed with the navigation bar.
8. The method of claim 1, wherein updating presentation of the navigation bar includes modifying the graphical elements displayed in the navigation bar based on the navigation bar command.
9. The method of claim 1, further comprising controlling operation of the device based on the swipe command to launch an application preassigned to a swipe zone and ending display of the navigation bar based on the navigation bar command.
10. The method of claim 1, further comprising updating presentation of the navigation bar based on the input command to include a first graphical element to identify a particular swipe zone of the navigation bar and a second graphical element to identify an application associated with the particular swipe zone.
11. A device comprising:
a display; and
a controller coupled to the display, the controller configured to
control display of a user interface including a navigation bar, wherein the navigation bar is presented to include one or more selectable elements and wherein the navigation bar includes a plurality of swipe zones;
detect an input command relative to a display of the device, wherein the input command is a swipe command relative to one of the plurality of swipe areas of the navigation bar;
determine a navigation bar command based on identification of a swipe area of the input command and a swipe length of the input command; and
control an updated presentation of the navigation bar based on the navigation bar command.
12. The device of claim 11, wherein the navigation bar includes a graphical element associated with each of the plurality of swipe zones.
13. The device of claim 11, wherein the navigation bar includes three swipe zones.
14. The device of claim 11, wherein the input command is a touch command to the display, the touch command associated with a position of display for the navigation bar.
15. The device of claim 11, wherein the navigation bar command expands presentation of the navigation bar from a bottom bar to an expanded view including additional graphical elements.
16. The device of claim 11, wherein the navigation bar command launches an application associated with and preassigned to a swipe zone associated with the input command.
17. The device of claim 11, wherein updating presentation of the navigation bar includes expanding the display area of the navigation bar on the display and increasing the number of graphical elements displayed with the navigation bar.
18. The device of claim 11, wherein updating presentation of the navigation bar includes modifying the graphical elements displayed in the navigation bar based on the navigation bar command.
19. The device of claim 11, further comprising controlling operation of the device based on the swipe command to launch an application preassigned to a swipe zone and ending display of the navigation bar based on the navigation bar command.
20. The device of claim 11, further comprising updating presentation of the navigation bar based on the input command to include a first graphical element to identify a particular swipe zone of the navigation bar and a second graphical element to identify an application associated with the particular swipe zone.
US15/190,145 2015-06-23 2016-06-22 System and methods for navigation bar presentation and device control Abandoned US20160378281A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/190,145 US20160378281A1 (en) 2015-06-23 2016-06-22 System and methods for navigation bar presentation and device control

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201562183613P 2015-06-23 2015-06-23
US201562184476P 2015-06-25 2015-06-25
US15/190,145 US20160378281A1 (en) 2015-06-23 2016-06-22 System and methods for navigation bar presentation and device control

Publications (1)

Publication Number Publication Date
US20160378281A1 true US20160378281A1 (en) 2016-12-29

Family

ID=57600988

Family Applications (8)

Application Number Title Priority Date Filing Date
US15/053,501 Abandoned US20170024086A1 (en) 2015-06-23 2016-02-25 System and methods for detection and handling of focus elements
US15/133,870 Active 2036-12-06 US10241649B2 (en) 2015-06-23 2016-04-20 System and methods for application discovery and trial
US15/133,846 Abandoned US20160381287A1 (en) 2015-06-23 2016-04-20 System and methods for controlling device operation and image capture
US15/133,859 Active 2036-12-30 US10310706B2 (en) 2015-06-23 2016-04-20 System and methods for touch target presentation
US15/169,642 Abandoned US20160378279A1 (en) 2015-06-23 2016-05-31 System and methods for device control
US15/169,634 Active 2037-01-10 US10331300B2 (en) 2015-06-23 2016-05-31 Device and methods for control including presentation of a list of selectable display elements
US15/190,145 Abandoned US20160378281A1 (en) 2015-06-23 2016-06-22 System and methods for navigation bar presentation and device control
US15/190,144 Active 2037-05-03 US10222947B2 (en) 2015-06-23 2016-06-22 Methods and devices for presenting dynamic information graphics

Family Applications Before (6)

Application Number Title Priority Date Filing Date
US15/053,501 Abandoned US20170024086A1 (en) 2015-06-23 2016-02-25 System and methods for detection and handling of focus elements
US15/133,870 Active 2036-12-06 US10241649B2 (en) 2015-06-23 2016-04-20 System and methods for application discovery and trial
US15/133,846 Abandoned US20160381287A1 (en) 2015-06-23 2016-04-20 System and methods for controlling device operation and image capture
US15/133,859 Active 2036-12-30 US10310706B2 (en) 2015-06-23 2016-04-20 System and methods for touch target presentation
US15/169,642 Abandoned US20160378279A1 (en) 2015-06-23 2016-05-31 System and methods for device control
US15/169,634 Active 2037-01-10 US10331300B2 (en) 2015-06-23 2016-05-31 Device and methods for control including presentation of a list of selectable display elements

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/190,144 Active 2037-05-03 US10222947B2 (en) 2015-06-23 2016-06-22 Methods and devices for presenting dynamic information graphics

Country Status (1)

Country Link
US (8) US20170024086A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019047184A1 (en) * 2017-09-08 2019-03-14 广东欧珀移动通信有限公司 Information display method, apparatus, and terminal
US10261666B2 (en) * 2016-05-31 2019-04-16 Microsoft Technology Licensing, Llc Context-independent navigation of electronic content
US20190138201A1 (en) * 2017-09-11 2019-05-09 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method of terminal device, terminal device, and storage medium
US20200204513A1 (en) * 2017-09-08 2020-06-25 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Message Display Method, Terminal, and Storage Medium
US11372523B2 (en) * 2018-07-17 2022-06-28 Meso Scale Technologies, Llc. Graphical user interface system
US11416126B2 (en) * 2017-12-20 2022-08-16 Huawei Technologies Co., Ltd. Control method and apparatus
US11537269B2 (en) * 2019-12-27 2022-12-27 Methodical Mind, Llc. Graphical user interface system

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013166588A1 (en) 2012-05-08 2013-11-14 Bitstrips Inc. System and method for adaptable avatars
USD738889S1 (en) 2013-06-09 2015-09-15 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD771112S1 (en) * 2014-06-01 2016-11-08 Apple Inc. Display screen or portion thereof with graphical user interface
US10339365B2 (en) 2016-03-31 2019-07-02 Snap Inc. Automated avatar generation
US10432559B2 (en) 2016-10-24 2019-10-01 Snap Inc. Generating and displaying customized avatars in electronic messages
CN106354418B (en) * 2016-11-16 2019-07-09 腾讯科技(深圳)有限公司 A kind of control method and device based on touch screen
US10484675B2 (en) * 2017-04-16 2019-11-19 Facebook, Inc. Systems and methods for presenting content
CN110800018A (en) 2017-04-27 2020-02-14 斯纳普公司 Friend location sharing mechanism for social media platform
US10212541B1 (en) 2017-04-27 2019-02-19 Snap Inc. Selective location-based identity communication
US11893647B2 (en) 2017-04-27 2024-02-06 Snap Inc. Location-based virtual avatars
US11043206B2 (en) 2017-05-18 2021-06-22 Aiqudo, Inc. Systems and methods for crowdsourced actions and commands
US11056105B2 (en) 2017-05-18 2021-07-06 Aiqudo, Inc Talk back from actions in applications
US11340925B2 (en) 2017-05-18 2022-05-24 Peloton Interactive Inc. Action recipes for a crowdsourced digital assistant system
US10466963B2 (en) * 2017-05-18 2019-11-05 Aiqudo, Inc. Connecting multiple mobile devices to a smart home assistant account
US11520610B2 (en) 2017-05-18 2022-12-06 Peloton Interactive Inc. Crowdsourced on-boarding of digital assistant operations
US10572107B1 (en) 2017-06-23 2020-02-25 Amazon Technologies, Inc. Voice communication targeting user interface
US11379550B2 (en) * 2017-08-29 2022-07-05 Paypal, Inc. Seamless service on third-party sites
WO2019056393A1 (en) * 2017-09-25 2019-03-28 华为技术有限公司 Terminal interface display method and terminal
CN110442407B (en) * 2018-05-03 2021-11-26 腾讯科技(深圳)有限公司 Application program processing method and device
CN109254719A (en) * 2018-08-24 2019-01-22 维沃移动通信有限公司 A kind of processing method and mobile terminal of display interface
US11423073B2 (en) * 2018-11-16 2022-08-23 Microsoft Technology Licensing, Llc System and management of semantic indicators during document presentations
KR102657519B1 (en) * 2019-02-08 2024-04-15 삼성전자주식회사 Electronic device for providing graphic data based on voice and operating method thereof
CN110213729B (en) * 2019-05-30 2022-06-24 维沃移动通信有限公司 Message sending method and terminal
CN112433661B (en) * 2020-11-18 2022-02-11 上海幻电信息科技有限公司 Interactive object selection method and device

Family Cites Families (89)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5708709A (en) * 1995-12-08 1998-01-13 Sun Microsystems, Inc. System and method for managing try-and-buy usage of application programs
US7146381B1 (en) * 1997-02-10 2006-12-05 Actioneer, Inc. Information organization and collaboration tool for processing notes and action requests in computer systems
US5886698A (en) * 1997-04-21 1999-03-23 Sony Corporation Method for filtering search results with a graphical squeegee
US6839669B1 (en) * 1998-11-05 2005-01-04 Scansoft, Inc. Performing actions identified in recognized speech
US7730401B2 (en) * 2001-05-16 2010-06-01 Synaptics Incorporated Touch screen with user interface enhancement
US7853495B2 (en) * 2001-12-28 2010-12-14 Access Co., Ltd. Usage period management system for applications
US7787908B2 (en) * 2002-03-19 2010-08-31 Qualcomm Incorporated Multi-call display management for wireless communication devices
AU2003287279A1 (en) * 2002-11-01 2004-06-07 Scott Kevin Maxwell Method and system for online software purchases
JP4215549B2 (en) * 2003-04-02 2009-01-28 富士通株式会社 Information processing device that operates in touch panel mode and pointing device mode
WO2005015379A2 (en) * 2003-08-06 2005-02-17 Koninklijke Philips Electronics N.V. A method of presenting a plurality of items
US20050055309A1 (en) * 2003-09-04 2005-03-10 Dwango North America Method and apparatus for a one click upgrade for mobile applications
US8271495B1 (en) * 2003-12-17 2012-09-18 Topix Llc System and method for automating categorization and aggregation of content from network sites
US20060063590A1 (en) * 2004-09-21 2006-03-23 Paul Abassi Mechanism to control game usage on user devices
US8102973B2 (en) * 2005-02-22 2012-01-24 Raytheon Bbn Technologies Corp. Systems and methods for presenting end to end calls and associated information
US9727082B2 (en) * 2005-04-26 2017-08-08 Apple Inc. Back-side interface for hand-held devices
US8818331B2 (en) * 2005-04-29 2014-08-26 Jasper Technologies, Inc. Method for enabling a wireless device for geographically preferential services
US7605804B2 (en) * 2005-04-29 2009-10-20 Microsoft Corporation System and method for fine cursor positioning using a low resolution imaging touch screen
GB0522079D0 (en) * 2005-10-29 2005-12-07 Griffin Ian Mobile game or program distribution
EP1796000A1 (en) * 2005-12-06 2007-06-13 International Business Machines Corporation Method, system and computer program for distributing software products in trial mode
US7958456B2 (en) * 2005-12-23 2011-06-07 Apple Inc. Scrolling list with floating adjacent index symbols
US20070233782A1 (en) * 2006-03-28 2007-10-04 Silentclick, Inc. Method & system for acquiring, storing, & managing software applications via a communications network
US9395905B2 (en) * 2006-04-05 2016-07-19 Synaptics Incorporated Graphical scroll wheel
US9106649B2 (en) * 2006-05-25 2015-08-11 Apptou Technologies Ltd Method and system for efficient remote application provision
US8611521B2 (en) * 2006-07-07 2013-12-17 Verizon Services Corp. Systems and methods for multi-media control of audio conferencing
CN101568894B (en) * 2006-10-23 2012-07-18 吴谊镇 Input device
US7961860B1 (en) * 2006-11-22 2011-06-14 Securus Technologies, Inc. Systems and methods for graphically displaying and analyzing call treatment operations
US20090037287A1 (en) * 2007-07-31 2009-02-05 Ahmad Baitalmal Software Marketplace and Distribution System
US11126321B2 (en) * 2007-09-04 2021-09-21 Apple Inc. Application menu user interface
WO2009061796A1 (en) * 2007-11-05 2009-05-14 Collins, Tim Service management system for providing service related message prioritization in a mobile client
US8874093B2 (en) * 2007-12-13 2014-10-28 Motorola Mobility Llc Scenarios creation system for a mobile device
JPWO2010032354A1 (en) * 2008-09-22 2012-02-02 日本電気株式会社 Image object control system, image object control method and program
US8650290B2 (en) * 2008-12-19 2014-02-11 Openpeak Inc. Portable computing device and method of operation of same
US8370762B2 (en) * 2009-04-10 2013-02-05 Cellco Partnership Mobile functional icon use in operational area in touch panel devices
US20100280892A1 (en) * 2009-04-30 2010-11-04 Alcatel-Lucent Usa Inc. Method and system for targeted offers to mobile users
US20100277422A1 (en) * 2009-04-30 2010-11-04 Microsoft Corporation Touchpad display
US8346847B2 (en) * 2009-06-03 2013-01-01 Apple Inc. Installing applications based on a seed application from a separate device
US8448136B2 (en) * 2009-06-25 2013-05-21 Intuit Inc. Creating a composite program module in a computing ecosystem
US20110087975A1 (en) * 2009-10-13 2011-04-14 Sony Ericsson Mobile Communications Ab Method and arrangement in a data
US8370142B2 (en) * 2009-10-30 2013-02-05 Zipdx, Llc Real-time transcription of conference calls
US20110202864A1 (en) * 2010-02-15 2011-08-18 Hirsch Michael B Apparatus and methods of receiving and acting on user-entered information
US20110296175A1 (en) * 2010-05-25 2011-12-01 beonSoft Inc. Systems and methods for software license distribution using asymmetric key cryptography
US8650558B2 (en) * 2010-05-27 2014-02-11 Rightware, Inc. Online marketplace for pre-installed software and online services
US20110307354A1 (en) * 2010-06-09 2011-12-15 Bilgehan Erman Method and apparatus for recommending applications to mobile users
US9864501B2 (en) * 2010-07-30 2018-01-09 Apaar Tuli Displaying information
US9936333B2 (en) * 2010-08-10 2018-04-03 Microsoft Technology Licensing, Llc Location and contextual-based mobile application promotion and delivery
US8615772B2 (en) * 2010-09-28 2013-12-24 Qualcomm Incorporated Apparatus and methods of extending application services
WO2012075629A1 (en) * 2010-12-08 2012-06-14 Nokia Corporation User interface
US20120158472A1 (en) * 2010-12-21 2012-06-21 Research In Motion Limited Contextual customization of content display on a communication device
US8612874B2 (en) * 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US20120209586A1 (en) * 2011-02-16 2012-08-16 Salesforce.Com, Inc. Contextual Demonstration of Applications Hosted on Multi-Tenant Database Systems
US20120246588A1 (en) * 2011-03-21 2012-09-27 Viacom International, Inc. Cross marketing tool
JP2012212230A (en) * 2011-03-30 2012-11-01 Toshiba Corp Electronic apparatus
US8656315B2 (en) * 2011-05-27 2014-02-18 Google Inc. Moving a graphical selector
US8826190B2 (en) * 2011-05-27 2014-09-02 Google Inc. Moving a graphical selector
US9053750B2 (en) * 2011-06-17 2015-06-09 At&T Intellectual Property I, L.P. Speaker association with a visual representation of spoken content
US8577737B1 (en) * 2011-06-20 2013-11-05 A9.Com, Inc. Method, medium, and system for application lending
US20130016129A1 (en) * 2011-07-14 2013-01-17 Google Inc. Region-Specific User Input
JP5295328B2 (en) * 2011-07-29 2013-09-18 Kddi株式会社 User interface device capable of input by screen pad, input processing method and program
DE102011118367B4 (en) * 2011-08-24 2017-02-09 Deutsche Telekom Ag Method for authenticating a telecommunication terminal comprising an identity module at a server device of a telecommunication network, use of an identity module, identity module and computer program
JP2013073330A (en) * 2011-09-27 2013-04-22 Nec Casio Mobile Communications Ltd Portable electronic apparatus, touch area setting method and program
US8713560B2 (en) * 2011-12-22 2014-04-29 Sap Ag Compatibility check
TWI470475B (en) * 2012-04-17 2015-01-21 Pixart Imaging Inc Electronic system
CN102707882A (en) * 2012-04-27 2012-10-03 深圳瑞高信息技术有限公司 Method for converting control modes of application program of touch screen with virtual icons and touch screen terminal
US20130326499A1 (en) * 2012-05-31 2013-12-05 Microsoft Corporation Automatically installing and removing recommended applications
JP6071107B2 (en) * 2012-06-14 2017-02-01 裕行 池田 Mobile device
KR20140016454A (en) * 2012-07-30 2014-02-10 삼성전자주식회사 Method and apparatus for controlling drag for moving object of mobile terminal comprising touch screen
US9280789B2 (en) * 2012-08-17 2016-03-08 Google Inc. Recommending native applications
JP2014048936A (en) * 2012-08-31 2014-03-17 Omron Corp Gesture recognition device, control method thereof, display equipment, and control program
KR20140033839A (en) * 2012-09-11 2014-03-19 삼성전자주식회사 Method??for user's??interface using one hand in terminal having touchscreen and device thereof
US9280235B2 (en) * 2012-09-13 2016-03-08 Panasonic Intellectual Property Corporation Of America Portable electronic device
US20140109016A1 (en) * 2012-10-16 2014-04-17 Yu Ouyang Gesture-based cursor control
US20140184503A1 (en) * 2013-01-02 2014-07-03 Samsung Display Co., Ltd. Terminal and method for operating the same
US20140278860A1 (en) * 2013-03-15 2014-09-18 Samsung Electronics Co., Ltd. Content delivery system with content sharing mechanism and method of operation thereof
US20140330647A1 (en) * 2013-05-03 2014-11-06 International Business Machines Corporation Application and service selection for optimized promotion
US20140344041A1 (en) * 2013-05-20 2014-11-20 Cellco Partnership D/B/A Verizon Wireless Triggered mobile checkout application
US8786569B1 (en) * 2013-06-04 2014-07-22 Morton Silverberg Intermediate cursor touchscreen protocols
KR102136602B1 (en) * 2013-07-10 2020-07-22 삼성전자 주식회사 Apparatus and method for processing a content in mobile device
US9098366B1 (en) * 2013-07-11 2015-08-04 Sprint Communications Company L.P. Virtual pre-installation of applications
WO2015027199A2 (en) * 2013-08-22 2015-02-26 Naqvi Shamim A Method and system for addressing the problem of discovering relevant services and applications that are available over the internet or other communcations network
KR102009279B1 (en) * 2013-09-13 2019-08-09 엘지전자 주식회사 Mobile terminal
CN104793774A (en) * 2014-01-20 2015-07-22 联发科技(新加坡)私人有限公司 Electronic device control method
KR102105961B1 (en) * 2014-05-13 2020-05-28 엘지전자 주식회사 Mobile terminal and method for controlling the same
US9871709B2 (en) * 2014-06-27 2018-01-16 Agora Lab, Inc. Systems and methods for improved quality of a visualized call over network through scenario based buffer modulation
US20170220782A1 (en) * 2014-09-08 2017-08-03 Ali ALSANOUSI Mobile interface platform systems and methods
US10176306B2 (en) * 2014-12-16 2019-01-08 JVC Kenwood Corporation Information processing apparatus, evaluation method, and storage medium for evaluating application program
US10169474B2 (en) * 2015-06-11 2019-01-01 International Business Machines Corporation Mobile application discovery using an electronic map
US10628559B2 (en) * 2015-06-23 2020-04-21 Microsoft Technology Licensing, Llc Application management
KR20170029329A (en) * 2015-09-07 2017-03-15 엘지전자 주식회사 Mobile terminal and method for controlling the same
US20170337214A1 (en) * 2016-05-18 2017-11-23 Linkedin Corporation Synchronizing nearline metrics with sources of truth

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070288860A1 (en) * 1999-12-20 2007-12-13 Apple Inc. User interface for providing consolidation and access
US20140313341A1 (en) * 2010-05-14 2014-10-23 Robert Patton Stribling Systems and methods for providing event-related video sharing services
US20140250390A1 (en) * 2011-06-03 2014-09-04 Firestorm Lab Limited Method of configuring icons in a web browser interface, and associated device and computer program product
US20140267103A1 (en) * 2013-03-15 2014-09-18 Apple Inc. Device, Method, and Graphical User Interface for Managing Concurrently Open Software Applications
US20160328143A1 (en) * 2014-01-06 2016-11-10 Huawei Device Co., Ltd. Application display method and terminal
US20150331589A1 (en) * 2014-05-15 2015-11-19 Todd KAWAKITA Circular interface for navigating applications and an authentication mechanism on a wearable device
US20150350297A1 (en) * 2014-05-30 2015-12-03 Apple Inc. Continuity
US20160062552A1 (en) * 2014-08-29 2016-03-03 Samsung Electronics Co., Ltd. Window management method and electronic device supporting the same

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10261666B2 (en) * 2016-05-31 2019-04-16 Microsoft Technology Licensing, Llc Context-independent navigation of electronic content
WO2019047184A1 (en) * 2017-09-08 2019-03-14 广东欧珀移动通信有限公司 Information display method, apparatus, and terminal
US20200204513A1 (en) * 2017-09-08 2020-06-25 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Message Display Method, Terminal, and Storage Medium
EP3680769A4 (en) * 2017-09-08 2020-09-02 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Information display method, apparatus, and terminal
US11194598B2 (en) * 2017-09-08 2021-12-07 Shenzhen Heytap Technology Corp., Ltd. Information display method, terminal and storage medium
US20190138201A1 (en) * 2017-09-11 2019-05-09 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method of terminal device, terminal device, and storage medium
US11416126B2 (en) * 2017-12-20 2022-08-16 Huawei Technologies Co., Ltd. Control method and apparatus
US11372523B2 (en) * 2018-07-17 2022-06-28 Meso Scale Technologies, Llc. Graphical user interface system
US11861145B2 (en) * 2018-07-17 2024-01-02 Methodical Mind, Llc Graphical user interface system
US11537269B2 (en) * 2019-12-27 2022-12-27 Methodical Mind, Llc. Graphical user interface system
US20230214090A1 (en) * 2019-12-27 2023-07-06 Methodical Mind, Llc. Graphical user interface system

Also Published As

Publication number Publication date
US20160378321A1 (en) 2016-12-29
US20160378278A1 (en) 2016-12-29
US10331300B2 (en) 2019-06-25
US10241649B2 (en) 2019-03-26
US10222947B2 (en) 2019-03-05
US20160381287A1 (en) 2016-12-29
US20170024086A1 (en) 2017-01-26
US20160378293A1 (en) 2016-12-29
US10310706B2 (en) 2019-06-04
US20160379395A1 (en) 2016-12-29
US20160378279A1 (en) 2016-12-29

Similar Documents

Publication Publication Date Title
US20160378281A1 (en) System and methods for navigation bar presentation and device control
US20120293406A1 (en) Method and apparatus for processing input in mobile terminal
EP2503439A1 (en) Method for positioning a cursor on a screen
KR102302233B1 (en) Method and apparatus for providing user interface
US20150277673A1 (en) Child container control of parent container of a user interface
US20140089867A1 (en) Mobile terminal having touch screen and method for displaying contents therein
TW201426500A (en) Input device, display apparatus, display system and method of controlling the same
EP2613228A1 (en) Display apparatus and method of editing displayed letters in the display apparatus
JP2007334737A (en) Information processor and information processing method
CN104915135A (en) Display control apparatus and method
KR20150143989A (en) Easy imoticon or sticker input method and apparatus implementing the same method
KR101182577B1 (en) Touch input device and method of executing instrucitn in the same
KR102536267B1 (en) Electronic device and method for displaying slider track and slider
KR102084581B1 (en) Program and mobile terminal
US20150106764A1 (en) Enhanced Input Selection
JPH09128194A (en) Display monitor device
JP5795113B1 (en) GAME CONTROL PROGRAM, GAME CONTROL METHOD, AND GAME CONTROL DEVICE
US10986393B2 (en) Display apparatus, method for UI display thereof and computer-readable recording medium
US10782854B2 (en) Display control device, storage medium, and display control method
JP6518622B2 (en) Display device, information processing device, image processing device, and image forming device
US20170220239A1 (en) Bi-Directional Control for Touch Interfaces
JP2013519249A (en) Method and apparatus for displaying an on-screen display
US20240103630A1 (en) A computer a software module arrangement, a circuitry arrangement, a user equipment and a method for an improved and extended user interface
JP5338437B2 (en) Remote control system
JP2013140492A (en) Display device, information processing apparatus, image processing apparatus, and image forming apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: HISENSE USA CORP., GEORGIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SIRPAL, SANJIV;DE PAZ, ALEXANDER;SELIM, MOHAMMED;SIGNING DATES FROM 20160621 TO 20160624;REEL/FRAME:044115/0453

Owner name: JAMDEO CANADA LTD., CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SIRPAL, SANJIV;DE PAZ, ALEXANDER;SELIM, MOHAMMED;SIGNING DATES FROM 20160621 TO 20160624;REEL/FRAME:044115/0453

Owner name: HISENSE ELECTRIC CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SIRPAL, SANJIV;DE PAZ, ALEXANDER;SELIM, MOHAMMED;SIGNING DATES FROM 20160621 TO 20160624;REEL/FRAME:044115/0453

Owner name: HISENSE INTERNATIONAL CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SIRPAL, SANJIV;DE PAZ, ALEXANDER;SELIM, MOHAMMED;SIGNING DATES FROM 20160621 TO 20160624;REEL/FRAME:044115/0453

AS Assignment

Owner name: QINGDAO HISENSE ELECTRONICS CO., LTD., CHINA

Free format text: CHANGE OF NAME;ASSIGNOR:HISENSE ELECTRIC CO., LTD.;REEL/FRAME:045546/0277

Effective date: 20170822

AS Assignment

Owner name: QINGDAO HISENSE ELECTRONICS CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JAMDEO CANADA LTD.;HISENSE USA CORP.;HISENSE INTERNATIONAL CO., LTD.;SIGNING DATES FROM 20181114 TO 20181220;REEL/FRAME:047923/0254

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION