US20160132301A1 - Programmatic user interface generation based on display size - Google Patents
- Publication number
- US20160132301A1 (application US14/727,226)
- Authority
- US
- United States
- Prior art keywords
- user interface
- display
- processing device
- programmed command
- definition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- G06F8/38—Creation or generation of source code for implementing user interfaces
- G06F9/451—Execution arrangements for user interfaces
- G06T3/4092—Image resolution transcoding, e.g. client/server architecture
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
- G06T2200/16—Indexing scheme for image data processing or generation, in general involving adaptation to the client's capabilities
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
Definitions
- Non-limiting examples of the present disclosure describe programmatic generation of a user interface for display on a processing device.
- A display class is determined from a plurality of display classes based on a detected display size of a processing device on which the user interface is to display.
- Prior to instantiating a user interface window, a stored user interface definition is identified and interpreted.
- The stored user interface definition comprises at least one programmed command object.
- A displayed user interface is instantiated on the processing device, where the displayed user interface comprises at least one user interface element.
- The user interface element is programmatically generated by translating the programmed command object of the user interface definition into the user interface element based on operations set in accordance with the determined display class.
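The display-class determination described above can be sketched in code. This is an illustrative sketch only; the class names and diagonal-size thresholds are assumptions for demonstration, not values specified by this disclosure.

```python
# Illustrative sketch of display-class determination. The class names and
# diagonal-size thresholds are assumptions for demonstration, not values
# specified by this disclosure.
def determine_display_class(diagonal_inches: float) -> str:
    """Map a detected display size to one of a plurality of display classes."""
    if diagonal_inches < 4.5:
        return "phone"
    if diagonal_inches < 7.0:
        return "phablet"
    if diagonal_inches < 13.0:
        return "slate"
    return "large-screen"
```

In this sketch, the returned class name would then select which set of translation operations is applied when the user interface is instantiated.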
- FIG. 1 is a block diagram illustrating an example of a computing device with which aspects of the present disclosure may be practiced.
- FIG. 3 is a simplified block diagram of a distributed computing system in which aspects of the present disclosure may be practiced.
- FIG. 6 is a diagram illustrating an exemplary method for setting a layout of a user interface palette with which aspects of the present disclosure may be practiced.
- FIG. 9 is a diagram illustrating user interface examples with which aspects of the present disclosure may be practiced.
- A user may be viewing an application on a device having a smaller display size (e.g., a mobile phone) and proceed to connect that device to a device having a larger display size (e.g., a PC).
- Attempted resizing of an application across differing display sizes may drastically affect the display and operation of the UI for an application and/or UI control.
- Systems are typically unable to recognize that a UI is to be scaled to a different programmed version to account for display size changes.
- Other instances of building UI packages may incorporate a scaling model for large and small screen devices but are only able to show a single type of UI (e.g., phone version or slate version) once an application is installed. This may limit a user's ability to connect to large display screens and enjoy UI that takes advantage of available display space.
- Systems and methods described herein provide programmatic generation of user interfaces in a form factor manner that accounts for the display size of the processing device upon which an application/UI is displayed.
- Programming operations generate multiple sets of UI controls and layouts and, based on the size of a display, determine which of the UI controls and layout algorithms to apply when instantiating a user interface.
- Examples of the present disclosure comprise evaluation of display class information associated with an application UI at runtime of the application to identify a class of display (e.g., large screen/tablet/slate/phablet/phone, etc.).
- Display class information may be used to determine whether to display a UI optimized for larger screen devices, smaller screen devices or something in-between.
- One or more UI layouts may be generated for each of a plurality of display classes.
- A display class is identified based on the detection of a display size of a processing device upon which an application/UI is running.
- A user interface may be displayed based on a UI layout associated with the determined display class. Examples described enable a UI to be tailored at run-time in a form factor manner without requiring any changes to code or a user interface definition.
- Generated UIs provide visually different experiences tailored to the processing device upon which an application/UI is executing. At the same time, examples preserve the semantic meaning of commands/user interactions and the advantages of a definition-based UI.
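The run-time selection just described — one UI definition paired with different layout algorithms per display class — can be sketched as follows. The class names and layout rules below are hypothetical illustrations, not the disclosure's actual algorithms.

```python
# Hypothetical sketch: a single command list (the UI definition) paired with
# per-class layout algorithms selected at run time. The layout rules are
# illustrative only.
LAYOUTS = {
    "phone": lambda cmds: [[c] for c in cmds],                     # one command per row
    "slate": lambda cmds: [cmds[i:i + 4] for i in range(0, len(cmds), 4)],
    "large-screen": lambda cmds: [list(cmds)],                     # single wide row
}

def layout_for(display_class: str, commands: list) -> list:
    """Apply the layout algorithm registered for the determined display class.

    The command list itself is unchanged; only the run-time layout differs,
    so no changes to code or the UI definition are required.
    """
    return LAYOUTS[display_class](commands)
```

Because the same `commands` list feeds every layout, the semantic meaning of each command is preserved across display classes; only its presentation varies.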
- FIGS. 1-3 and the associated descriptions provide a discussion of a variety of operating environments in which examples of the invention may be practiced.
- the devices and systems illustrated and discussed with respect to FIGS. 1-3 are for purposes of example and illustration and are not limiting of a vast number of computing device configurations that may be utilized for practicing examples of the invention, described herein.
- FIG. 1 is a block diagram illustrating physical components of a computing device 102 , for example a mobile processing device, with which examples of the present disclosure may be practiced.
- the computing device 102 may include at least one processing unit 104 and a system memory 106 .
- the system memory 106 may comprise, but is not limited to, volatile storage (e.g., random access memory), non-volatile storage (e.g., read-only memory), flash memory, or any combination of such memories.
- the system memory 106 may include an operating system 107 and one or more program modules 108 suitable for running software programs/modules 120 such as IO manager 124 , other utility 126 and application 128 .
- system memory 106 may store instructions for execution. Other examples of system memory 106 may store data associated with applications.
- the operating system 107 may be suitable for controlling the operation of the computing device 102 .
- examples of the invention may be practiced in conjunction with a graphics library, other operating systems, or any other application program and is not limited to any particular application or system.
- This basic configuration is illustrated in FIG. 1 by those components within a dashed line 122 .
- the computing device 102 may have additional features or functionality.
- the computing device 102 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 1 by a removable storage device 109 and a non-removable storage device 110 .
- program modules 108 may perform processes including, but not limited to, one or more of the stages of the operations described throughout this disclosure.
- Other program modules may include electronic mail and contacts applications, word processing applications, spreadsheet applications, database applications, slide presentation applications, drawing or computer-aided application programs, photo editing applications, authoring applications, etc.
- examples of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors.
- examples of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in FIG. 1 may be integrated onto a single integrated circuit.
- Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality all of which are integrated (or “burned”) onto the chip substrate as a single integrated circuit.
- the functionality described herein may be operated via application-specific logic integrated with other components of the computing device 102 on the single integrated circuit (chip).
- Examples of the present disclosure may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies.
- examples of the invention may be practiced within a general purpose computer or in any other circuits or systems.
- the computing device 102 may also have one or more input device(s) 112 such as a keyboard, a mouse, a pen, a sound input device, a device for voice input/recognition, a touch input device, etc.
- the output device(s) 114 such as a display, speakers, a printer, etc. may also be included.
- the aforementioned devices are examples and others may be used.
- the computing device 102 may include one or more communication connections 116 allowing communications with other computing devices 118 . Examples of suitable communication connections 116 include, but are not limited to, RF transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, and/or serial ports.
- Computer readable media may include computer storage media.
- Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules.
- the system memory 106 , the removable storage device 109 , and the non-removable storage device 110 are all computer storage media examples (i.e., memory storage.)
- Computer storage media may include RAM, ROM, electrically erasable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the computing device 102 . Any such computer storage media may be part of the computing device 102 .
- Computer storage media does not include a carrier wave or other propagated or modulated data signal.
- Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.
- modulated data signal may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal.
- communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
- FIGS. 2A and 2B illustrate a mobile computing device 200 , for example, a mobile telephone, a smart phone, a personal data assistant, a tablet personal computer, a phablet, a slate, a laptop computer, and the like, with which examples of the invention may be practiced.
- mobile computing device 200 may be implemented to execute applications and/or application command control.
- Application command control relates to presentation and control of commands for use with an application through a user interface (UI) or graphical user interface (GUI).
- application command controls may be programmed specifically to work with a single application. In other examples, application command controls may be programmed to work across more than one application.
- FIG. 2A one example of a mobile computing device 200 for implementing the examples is illustrated.
- the mobile computing device 200 is a handheld computer having both input elements and output elements.
- the mobile computing device 200 typically includes a display 205 and one or more input buttons 210 that allow the user to enter information into the mobile computing device 200 .
- the display 205 of the mobile computing device 200 may also function as an input device (e.g., a touch screen display). If included, an optional side input element 215 allows further user input.
- the side input element 215 may be a rotary switch, a button, or any other type of manual input element.
- mobile computing device 200 may incorporate more or fewer input elements.
- the display 205 may not be a touch screen in some examples.
- the mobile computing device 200 is a portable phone system, such as a cellular phone.
- the mobile computing device 200 may also include an optional keypad 235 .
- Optional keypad 235 may be a physical keypad or a “soft” keypad generated on the touch screen display or any other soft input panel (SIP).
- the output elements include the display 205 for showing a GUI, a visual indicator 220 (e.g., a light emitting diode), and/or an audio transducer 225 (e.g., a speaker).
- the mobile computing device 200 incorporates a vibration transducer for providing the user with tactile feedback.
- the mobile computing device 200 incorporates input and/or output ports, such as an audio input (e.g., a microphone jack), an audio output (e.g., a headphone jack), and a video output (e.g., a HDMI port) for sending signals to or receiving signals from an external device.
- FIG. 2B is a block diagram illustrating the architecture of one example of a mobile computing device. That is, the mobile computing device 200 can incorporate a system (i.e., an architecture) 202 to implement some examples.
- the system 202 is implemented as a “smart phone” capable of running one or more applications (e.g., browser, e-mail, calendaring, contact managers, messaging clients, games, and media clients/players).
- the system 202 is integrated as a computing device, such as an integrated personal digital assistant (PDA), tablet and wireless phone.
- One or more application programs 266 may be loaded into the memory 262 and run on or in association with the operating system 264 .
- Examples of the application programs include phone dialer programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth.
- the system 202 also includes a non-volatile storage area 268 within the memory 262 .
- the non-volatile storage area 268 may be used to store persistent information that should not be lost if the system 202 is powered down.
- the application programs 266 may use and store information in the non-volatile storage area 268 , such as e-mail or other messages used by an e-mail application, and the like.
- a synchronization application (not shown) also resides on the system 202 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage area 268 synchronized with corresponding information stored at the host computer.
- other applications may be loaded into the memory 262 and run on the mobile computing device 200 described herein.
- the system 202 has a power supply 270 , which may be implemented as one or more batteries.
- the power supply 270 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries.
- the system 202 may include peripheral device port 230 that performs the function of facilitating connectivity between system 202 and one or more peripheral devices. Transmissions to and from the peripheral device port 230 are conducted under control of the operating system (OS) 264 . In other words, communications received by the peripheral device port 230 may be disseminated to the application programs 266 via the operating system 264 , and vice versa.
- the system 202 may also include a radio interface layer 272 that performs the function of transmitting and receiving radio frequency communications.
- the radio interface layer 272 facilitates wireless connectivity between the system 202 and the “outside world,” via a communications carrier or service provider. Transmissions to and from the radio interface layer 272 are conducted under control of the operating system 264 . In other words, communications received by the radio interface layer 272 may be disseminated to the application programs 266 via the operating system 264 , and vice versa.
- the visual indicator 220 may be used to provide visual notifications, and/or an audio interface 274 may be used for producing audible notifications via the audio transducer 225 .
- the visual indicator 220 is a light emitting diode (LED) and the audio transducer 225 is a speaker.
- the LED may be programmed to remain on indefinitely until the user takes action to indicate the powered-on status of the device.
- the audio interface 274 is used to provide audible signals to and receive audible signals from the user.
- the audio interface 274 may also be coupled to a microphone to receive audible input, such as to facilitate a telephone conversation.
- the microphone may also serve as an audio sensor to facilitate control of notifications, as will be described below.
- the system 202 may further include a video interface 276 that enables an operation of an on-board camera 230 to record still images, video stream, and the like.
- Data/information generated or captured by the mobile computing device 200 and stored via the system 202 may be stored locally on the mobile computing device 200 , as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio 272 or via a wired connection between the mobile computing device 200 and a separate computing device associated with the mobile computing device 200 , for example, a server computer in a distributed computing network, such as the Internet.
- data/information may be accessed via the mobile computing device 200 via the radio 272 or via a distributed computing network.
- data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.
- FIG. 3 illustrates one example of the architecture of a system for providing an application that reliably accesses target data on a storage system and handles communication failures to one or more client devices, as described above.
- Target data accessed, interacted with, or edited in association with programming modules 108 , applications 120 , and storage/memory may be stored in different communication channels or other storage types.
- various documents may be stored using a directory service 322 , a web portal 324 , a mailbox service 326 , an instant messaging store 328 , or a social networking site 330 . Application 128 , IO manager 124 , other utility 126 , and storage systems may use any of these types of systems or the like for enabling data utilization, as described herein.
- FIG. 4 is an exemplary method 400 for managing user interface definitions with which aspects of the present disclosure may be practiced.
- method 400 may be executed by an exemplary system such as shown in FIGS. 1-3 .
- method 400 may be executed on a device comprising at least one processor configured to store and execute operations, programs or instructions.
- method 400 is not limited to such examples.
- method 400 may be executed (e.g., computer-implemented operations) by one or more components of a processing device or components of a distributed network, for instance, web service/distributed network service (e.g. cloud service).
- System components may be utilized to perform the operations described herein with respect to method 400 .
- method 400 may be performed in association with an application.
- An application is a software component that executes on the processing device, interfacing with hardware and software components of the device.
- An application comprises one or more programs designed to carry out operations, and an application is associated with a UI.
- A user interface (UI) provides interactive or non-interactive dialog windows, task panes, etc., for managing user interaction with an application or service. Examples of tasks performed by a UI comprise presenting information and receiving input from a user.
- An example UI may contain one or more elements/controls, or groups of controls, such as, for example, push buttons, radio buttons, check boxes, edit boxes, text labels, list boxes, etc.
- An example UI includes any number of controls or other elements, such as, for example, text, vector graphics, images, video, animations, and audio.
- A UI may exist separately from an application with which it is to be used. Alternately, a UI may be either integrated with the source code or included with standard resources used by the application, such as dynamic linked library (DLL) files, among other examples.
- a UI may provide application command control though UI controls.
- User interface control is a graphical control element that interfaces with an application that executes on the processing device (e.g., memory, processor and functions of mobile device) and software components such as an operating system (OS), applications executing on a mobile device, programming modules, input methods (e.g., soft input panel (SIP)) and command container such as a pane or contextual menu, among other examples.
- A UI control is used to control execution of actions/commands for the application.
- An SIP is an on-screen input method for devices (e.g., text input or voice input), and a pane is a software component that assists function of other software running on the device such as the OS and other software applications, among other examples.
- A UI control may be integrated within an application. For instance, a UI control may be launched, closed, expanded or minimized when an application is launched, closed, expanded or minimized.
- A UI control may also be executable as its own application that interfaces with another application. For instance, a UI control may be launched, closed or minimized separately from the launching of an application that is controlled by the UI control.
- Method 400 begins at operation 402 where a programmed command object is received or updated.
- a programmed command object is code that represents a command and/or command group that is to be rendered as a UI element.
- a programmed command object may be included in a UI definition that is used to translate the programmed command objects into displayable and usable portions of a UI.
- UI elements are the displayable and usable portions of the UI that act as controls to trigger actions/commands associated with a programmed command object.
- developers may create new programmed command objects or update existing command objects for inclusion in a user interface definition that can be used to define how programmed command objects/groupings of programmed command objects are presented for display.
- Developers that create programmed command objects may focus on the commanding/function of a programmed command object and do not necessarily need to specify how a command object displays as it is scaled across processing devices having varying display sizes.
- Programmed command objects may be included in UI definition data and programmatically generated as UI elements in a form factor manner using other programming operations (e.g., an application programming interface (API)) that evaluate the UI definition data.
- Programmed command objects may be created or updated.
- data associated with programming command objects may be stored in one or more memories, storages, libraries, files, or databases, etc., for use in creation of a UI definition.
- stored program command objects may be stored on a processing device upon which an application/UI is executing.
- stored program command objects may be stored on one or more separate processing devices that may be used to manage applications/UIs and UI definition data, for example.
- a UI definition is a collection of programmed command objects for commanding an application.
- the UI definition is data used to translate the programmed command objects into displayable and usable portions of an application user interface.
- User interface definitions may be data stored in any form, for example any type of file format including linked file libraries.
- a UI definition includes properties for each programmed command object, including, for example, position, dimension, visibility, text, colors, opacity, borders, accessibility information, enabled state, and other states commonly associated with conventional UI controls.
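The property list above can be pictured as a plain data record per programmed command object. A minimal sketch follows; the field names mirror the properties listed, but the structure itself is an assumption for illustration, not the patent's actual format:

```python
# Hypothetical sketch of a UI definition entry for one programmed
# command object; property names mirror those listed above.
bold_command = {
    "id": "format.bold",      # identifier of the programmed command object
    "text": "Bold",           # label rendered with the UI element
    "position": (0, 0),       # row/column hint within its grouping
    "dimension": (32, 32),    # default width/height in pixels
    "visibility": True,
    "opacity": 1.0,
    "enabled": True,          # enabled state, as with conventional UI controls
}

# A UI definition collects such entries for commanding an application.
ui_definition = {"commands": [bold_command]}
```

Any storage form would do (a file, a linked library, a database row), as the text notes; the dictionary here only illustrates the per-object property set.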
- programmed command objects specified by the UI definition files are stored (operation 408 ) in a library or database comprising predefined controls and other elements (e.g., vector graphics, images, video, etc.) as identified in operation 404 .
- a storage, memory, database etc. may contain pointers to some or all of the controls or other programmed command objects. Such an example is useful where command objects having a relatively large file size, such as a video clip, are to be included in a rendered UI window.
- the UI definition may also reference one or more event handlers or “listeners” that are to be associated with particular controls, groups of controls, other elements, or entire UI windows so that the controls, elements, or UI windows are capable of interacting with an associated application.
- these listeners are either stored along with the UI definitions, or in a separate listener memory, storage, file, database, etc.
- the controls or other elements described by the UI definition files are read from a library or database file of controls and elements, associated with the specified listeners, and then used to automatically instantiate extensible user interfaces.
- Flow may proceed to operation 408 where user interface definitions are stored for use in generation of user interfaces for display.
- user interface definitions or descriptions may be stored (operation 408 ) separate from the application code with which the user interfaces are intended to be used.
- User interface definitions or definition files can be changed at any time prior to running an underlying application, or alternately, at any time prior to displaying the user interface to provide uniquely configured user interface windows which are then automatically instantiated when rendering the user interface windows.
- This concept offers several major advantages. For instance, changes to any of the user interface windows do not require editing and recompiling of the associated application source code. Consequently, the number of potential errors that may be introduced into an application are dramatically reduced because the application itself is not edited to modify the UI windows associated with that application.
- UI definition files serve to allow a common baseline application source code, regardless of the user interfaces that are associated with that application.
- the UI descriptions are stored/included (operation 408 ) in either the application code itself, or in one or more linked files, such as a DLL file, or other files which are included in the application as the application is compiled, rather than including the descriptions in separate UI definition files. Regardless of where the UI descriptions are located, UI descriptions are interpreted and treated in the same manner prior to automatic instantiation of the UI windows.
- FIG. 5 is an exemplary method 500 for presenting a user interface with which aspects of the present disclosure may be practiced.
- method 500 may be executed by an exemplary system such as shown in FIGS. 1-3 .
- method 500 may be executed on a device comprising at least one processor configured to store and execute operations, programs or instructions.
- method 500 is not limited to such examples.
- method 500 may be executed (e.g., computer-implemented operations) by one or more components of a processing device or components of a distributed network, for instance, web service/distributed network service (e.g. cloud service).
- System components may be utilized to perform the operations described herein with respect to method 500 .
- Method 500 begins at operation 502 where a display class is determined based on a detected display size of a processing device upon which an application/UI is to be displayed.
- Display class determination provides an abstraction for determining the size of a display.
- a display class can be defined for processing devices having display sizes that fall within the range associated with the display class. That is, display classes act as transition points for UI experiences.
- Display class is a value that is determined based on a maximum display size. The value for display class may be in any form including numeric values and elements of speech, as examples.
- display classes may be set based on numeric values. For example, a display class may be identified using numeric values (e.g., 0 to 3 inches).
- display classes are used to classify processing devices in accordance with display size. For example, a display class may be set for processing devices having a display size falling in a range from 0 to 3 inches where another display class may be set for processing devices having a display size in a range from 3.1 to 5 inches, and so on.
- a range for values of display classes may fall between 0 and infinity.
- additional display class designations may be easily added without having to change operational code behavior.
- display class designations including minimum and/or maximum values for ranges of display classes can be defined in any possible way that can be useful in defining user interface interaction.
- a minimum value of a display class may be a value that is equal to or greater than a maximum value of a display class which is directly smaller than the display class being defined.
- a first display class may correspond to a range for devices having displays between 0 and 3 inches and a minimum value of a second display class may take into account a maximum value of the first display class (e.g., 3 inches) and set the minimum value of the second display class at 3.1 inches, for instance.
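The range scheme above — each class's minimum sitting just past the previous class's maximum — can be sketched as a simple lookup table. The boundaries below 5 inches follow the text's example; the larger classes are assumptions:

```python
# Hypothetical display classes keyed by screen-diagonal ranges in inches.
# The 0-3 and 3.1-5 boundaries follow the example in the text; the rest
# are illustrative. A class's minimum exceeds the previous class's maximum.
DISPLAY_CLASSES = [
    (0.0, 3.0, 0),            # class 0: up to 3 inches
    (3.1, 5.0, 1),            # class 1: 3.1 to 5 inches
    (5.1, 7.0, 2),            # class 2 (assumed boundary)
    (7.1, float("inf"), 3),   # class 3: open-ended upper range
]

def determine_display_class(diagonal_inches):
    """Return the display class whose range contains the detected size."""
    for lo, hi, cls in DISPLAY_CLASSES:
        if lo <= diagonal_inches <= hi:
            return cls
    raise ValueError("no display class defined for this size")
```

Because the table is data rather than code, additional display class designations can be added without changing the operational behavior, as the text observes.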
- Display classes may be changed over time based on programmer prerogative, analysis/testing/use cases, etc.
- a plurality of display classes may be set and operation 502 may determine a display class from the plurality of available display classes.
- operation 502 occurs upon or before launching of a new application or UI window, or when a display size change is detected, such as when an application is moved to a display of a different size.
- detection of a display size and determination of a display class may occur at any time after powering on of a processing device.
- operation 502 may execute in a background of an operating system (OS) while other applications are operating in the foreground of the OS. This may minimize the delay in launching of a UI instance and improve an overall user experience.
- operation 502 utilizes an API accessing a shared library of data (e.g., a dynamic link library (DLL)) to determine a display class.
- exemplary operational code associated with a display class determination is not limited to but may be similar to:
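The code sample referenced here does not survive in this extraction. As a hedged stand-in, a determination routine of this general shape might query a display through a platform API and bucket the result; the `display_api` object and its `diagonal_inches()` method are hypothetical, not the patent's actual code:

```python
# Hypothetical stand-in for the omitted operational code: query the
# display through an API (e.g., one backed by a shared DLL) and map
# the measurement onto a display class designation.
def get_display_class(display_api):
    """display_api is assumed to expose diagonal_inches() (hypothetical)."""
    diagonal = display_api.diagonal_inches()
    if diagonal <= 3.0:
        return "DisplayClass_Small"
    elif diagonal <= 5.0:
        return "DisplayClass_Medium"
    return "DisplayClass_Large"
```

Running such a routine in the OS background, as the text suggests, would let the class be known before any UI instance launches.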
- a processing device may be any device comprising a display screen, at least one memory that is configured to store operations, programs, instructions, and at least one processor that is configured to execute the operations, programs or instructions such as an application/application command control.
- Display size is a measurement of viewable area for display on a processing device. As an example, display size is a measurement associated with active viewable image size of a processing device. In other examples, display size may be associated with a nominal size value. In one example, detecting of the display size comprises detecting a measurement value for screen diagonal of a display of a processing device. However, display size determination is not limited to such an example.
- Factors that may be evaluated to determine a display size include but are not limited to: screen diagonal of a display, dot density (e.g., dots per inch (DPI), pixel density (e.g., pixels per inch (PPI), physical size of a screen/display, use case distance of a display from a user, display length, and display width, among other examples.
- display size may be a measurement value associated with effective resolution of a display for a processing device. Measurement of effective resolution is one example of a value used to evaluate display form factors with a common metric, and enables UI scaling to be classified into different display classes.
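As an illustration of such common metrics, both the screen diagonal and an effective resolution can be derived from resolution and density figures. A minimal sketch; the scale-factor model of effective resolution is an assumption for illustration:

```python
import math

def diagonal_inches(width_px, height_px, ppi):
    """Screen diagonal derived from resolution and pixel density (PPI)."""
    return math.hypot(width_px, height_px) / ppi

def effective_resolution(width_px, height_px, scale_factor):
    """Effective (scaled) resolution: physical pixels divided by a UI
    scale factor, giving a common metric across display form factors."""
    return (round(width_px / scale_factor), round(height_px / scale_factor))
```

For example, a 720x1280 panel rendered at a 2x scale factor presents a 360x640 effective resolution, comparable across devices regardless of physical pixel density.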
- any common metric relative to display size can be applied in exemplary method 500 .
- other factors that may be evaluated include: processing device orientation (e.g., portrait mode, landscape mode), processing device operational mode (e.g., keyboard mode, touch mode, ink mode, etc.), application window size, screen aspect ratio, and screen effective resolution, among other examples.
- operation 502 may comprise a program instruction or module that can identify and evaluate system specifications for a processing device such as a mobile device or personal computer. For instance, a programming instruction implemented in operation 502 identifies a type or version of the processing device and executes a fetch of data to identify system information of the processing device. In another example, a programming instruction or module may reference manufacturer specification information to determine a value associated with display size of a processing device.
- Flow may proceed to operation 504 where a stored UI definition is identified and interpreted. Creating and storing of UI definitions are described in the description of method 400 of FIG. 4 .
- a UI definition may define commanding objects and groupings of commanding objects to be laid out as UI elements when a UI is generated.
- a UI definition is data used to translate the programmed command objects into displayable and usable portions of an application user interface.
- operation 504 occurs at runtime of an application/UI prior to instantiating a user interface window for display.
- operation 504 may comprise parsing the stored UI definition and generating an object model for UI commanding upon which an actual user interface can be laid out for display to a user.
- An object model is a collection of programmed command objects or groupings/classes through which programming operations can be used to parse, and manipulate the programmed command objects.
- the object model represents commanding objects and groupings of commanding objects that are to be rendered as UI.
- UI listeners may work with a generated object model and be notified of user interactions with programmed command objects.
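Operation 504's parse-then-attach-listeners flow might look like the following sketch, where `CommandNode`, the definition schema, and the listener API are all hypothetical names chosen for illustration:

```python
# Hedged sketch of operation 504: parse a stored UI definition into a
# simple object model of commanding objects and their groupings, to
# which UI listeners can then be attached.
class CommandNode:
    def __init__(self, command_id, children=None):
        self.command_id = command_id
        self.children = children or []   # nested grouping of command objects
        self.listeners = []              # notified of user interactions

    def add_listener(self, fn):
        self.listeners.append(fn)

    def fire(self, event):
        for fn in self.listeners:
            fn(event)

def build_object_model(ui_definition):
    """ui_definition: {'commands': [{'id': ..., 'children': [...]}, ...]}"""
    def build(entry):
        return CommandNode(entry["id"],
                           [build(c) for c in entry.get("children", [])])
    return [build(e) for e in ui_definition["commands"]]
```

The actual UI layout would then be produced from this model rather than from the raw definition, keeping parsing separate from presentation.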
- Operation 506 may comprise generating a user interface that comprises one or more user interface elements.
- UI elements are the displayable and usable portions of the UI that act as controls to trigger actions/commands associated with a programmed command object.
- operation 506 programmatically generates the UI element(s) by translating the programmed command object into the user interface element based on operations set in accordance with the determined display class.
- operation 506 may employ a component configured to create a UI element that triggers an action/operation of the programmed command object, the component executing operations to create UI elements through programs/code/applications/APIs/modules, etc.
- UI elements may have varying levels of complexity depending on a type of UI element that is being created and the properties of that UI element, including but not limited to: actions, data and associations, button controls, combo boxes, galleries, color, effect, caching, triggers, text/font, height/width, style, alignment, focus, language, and margins, among other examples.
- UI elements may be controls generated in extensible application markup language (XAML).
- UI elements may be generated in any form including any programming language including markup languages, among other examples.
- operation 506 generates a UI and presents it on a display of a processing device based on display class rules (e.g., a display class rule set) for interpreting the UI definition.
- a display class rule set is a set of pre-defined dynamic layout rules for display classes that account for a detected display size of a processing device running an application/UI.
- a display class rule set may be defined for generating a UI from a stored UI definition taking into account a display class designation as determined in operation 502 .
- display class rule sets may be programmed to be associated with one or more display classes.
- a display class rule set may be programmed for each of a plurality of display classes.
- a display class rule set may be programmed to cover more than one display class.
- a display class rule set may be utilized to set a UI layout for small screen processing devices or large screen processing devices, being classifications that cover multiple display classes.
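A rule-set registry of that shape — one set per display class, or one set spanning several classes, as in the small-screen/large-screen split above — might be sketched as follows; the names and the limit values are assumptions:

```python
# Hypothetical rule-set registry: a display class rule set may be bound
# to one display class or span several (e.g., a "small screen" set
# covering classes 0 and 1).
RULE_SETS = {
    "small_screen": {"classes": {0, 1}, "max_rows": 8,  "max_per_row": 4},
    "large_screen": {"classes": {2, 3}, "max_rows": 12, "max_per_row": 6},
}

def rule_set_for(display_class):
    """Find the rule set whose class coverage includes display_class."""
    for name, rules in RULE_SETS.items():
        if display_class in rules["classes"]:
            return rules
    raise KeyError("no rule set covers display class %r" % display_class)
```

Because the registry lives apart from the application code and the UI definitions, a rule set can be edited without touching either, matching the separation described below.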
- Operation 506 generates and displays a UI comprising one or more UI elements by evaluating a UI definition in accordance with display class rules.
- a generated UI is an application command control that comprises a plurality of palettes (UI palettes) programmed for application control.
- UI palettes is a hierarchical organization of UI elements including a plurality of groupings of UI elements usable for UI command control. Further description regarding palettes is provided in the description of FIGS. 6-9 .
- Dynamic layout of the automatically instantiated UI windows is accomplished by applying the display class rule set to a UI definition.
- display class layout rules are provided in a UI layout rule field or database which may be read out at the time that a UI is being instantiated.
- layout rules for display classes may be separately maintained from stored UI definitions and application code. In that instance, programmers who create and update commanding objects do not need to provide consideration for scaling of programmed commanding objects across processing devices of differing display sizes.
- Dynamic layout rules for display classes may be stored in a storage, memory, database, etc., of a processing device upon which a UI is to be instantiated or one or more other processing devices usable to manage user interface programming.
- the display class layout rules define how a programmed command object may be translated to be generated as a UI element. That is, presentation of a programmed command object may vary depending on the determined display class associated with the processing device upon which the UI is to be displayed.
- translating of the programmed command object into the user interface element may comprise setting a size of the user interface element based on the operations corresponding with the determined display class. For example, a programmed command object may appear in a first size (e.g., number of pixels, length by width, etc.) when displayed upon a processing device having a large display (identified by display class determination) and the same programmed command object may appear in a different size when displayed upon a processing device having a small display (identified by display class determination).
- translating of the programmed command object into the user interface element may comprise setting a style of the user interface element.
- a programmed command object may be presented as a drop-down menu in one instance (e.g., where display space may be limited) and the same programmed command object may be presented more robustly as a plurality of visible icons in another instance, depending on the layout rules defined for the display class.
- translating of the programmed command object into the user interface element may comprise setting a layout of the user interface element based on the operations corresponding with the determined display class. For example, a grouping of programmed command objects may display only in a first arrangement in one instance and the arrangement may be different in another instance. That is, layout for each displayable UI window is interpreted in light of such display class rules.
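Pulling the three translations together (size, style, and layout), a hedged sketch of operation 506's per-object translation follows; every field name and the `rules` schema are assumptions chosen for illustration:

```python
# Hedged sketch of operation 506: translate one programmed command
# object into a UI element, with size, style, and layout taken from
# the display class rules rather than from the command object itself.
def translate(command, rules):
    # Style: e.g., a drop-down where display space is limited vs. a
    # row of visible icons where space allows.
    style = "dropdown" if rules["compact"] else "icon_row"
    return {
        "id": command["id"],
        "text": command.get("text", command["id"]),
        "size": rules["element_size"],     # varies per display class
        "style": style,
        "layout": rules["arrangement"],    # e.g., vertical vs. horizontal
    }
```

The same command object therefore renders differently per display class while its commanding behavior stays untouched, which is the separation the text emphasizes.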
- the layout rules for any user interface can be changed, edited, or otherwise updated any time prior to rendering the UI window to provide uniquely configured user interface windows. This may be accomplished without the need to modify the underlying application code in any way.
- Display class layout rules allow for dynamic resizing and repositioning of UI windows, with a corresponding real-time automatic dynamic adjustment of UI items within already rendered UI windows.
- Flow may proceed to decision operation 508 where it is determined whether a UI definition is updated.
- display class layout rules provide for the automatic addition or removal of one or more UI elements from a displayed UI window based either on user interaction with the UI window, or automatic programmatic interaction with the UI window by one or more active applications. That is, in examples, a user of a UI may make changes to or update a UI definition and a UI may be dynamically adjusted to account for such changes/updates. Alternatively, developers/programmers of a UI may make changes to or update a UI definition and a UI may be dynamically adjusted to account for such changes/updates. If operation 508 determines that a UI definition has not been updated, flow branches NO and processing of method 500 ends.
- if the UI definition has been updated, flow branches YES and returns to operation 504 where the UI definition is identified and interpreted. Flow may then proceed to operation 506 where a displayed UI is instantiated based on the updated UI definition and the display class rules.
- FIG. 6 is a diagram illustrating an exemplary method 600 for setting a layout of a generated user interface palette with which aspects of the present disclosure may be practiced.
- method 600 may be executed by an exemplary system such as shown in FIGS. 1-3 .
- method 600 may be executed on a device comprising at least one processor configured to store and execute operations, programs or instructions.
- method 600 is not limited to such examples.
- method 600 may be executed (e.g., computer-implemented operations) by one or more components of a processing device or components of a distributed network, for instance, web service/distributed network service (e.g. cloud service).
- System components may be utilized to perform the operations described herein with respect to method 600 .
- a displayed UI hosts a majority of an application's command set, organized in a hierarchical structure of individual palettes.
- An exemplary UI may comprise a plurality of palettes (UI palettes) programmed for application control.
- a palette is a hierarchical organization of UI elements including a plurality of groupings of UI elements usable for UI command control. Exemplary palettes for application command control are illustrated in FIGS. 8-9 and further described in the description of FIGS. 7-9 .
- UI palettes may be programmed to dynamically interact with an application and display simultaneously with applications and/or user interface components such as an SIP or on screen keyboard.
- Flow of method 600 begins at operation 602 where a layout of a UI palette is set or changed based on the display class rules as applied to a UI definition.
- operation 602 may occur when a UI is being generated for display on a processing device. Dynamic layout of the automatically instantiated UI windows is accomplished by applying a display class rule set to a UI definition as described in the description of FIG. 5 .
- Setting (operation 602 ) a layout for a UI palette may further comprise operations such as operations shown in operations 604 - 610 .
- a maximum number of rows of UI elements for a palette are determined.
- Setting (operation 602 ) the layout of a UI palette may comprise determining (operation 604 ) a maximum number of rows of UI elements for a UI palette displayed by the UI.
- Operation 604 enables a UI developer to best lay out palettes and UI elements for users given the limited space that may be available on processing devices.
- scaling for display of UI elements in UI palettes may be a programming operation associated with a display class rule set.
- Operation 604 determines a maximum number of rows of commands for a palette in association with the display class rule set which is based on the display size of the processing device. Operation 602 may utilize such data in determining an optimal layout for a UI palette.
- a maximum number of UI elements per row may be determined for a UI palette.
- Setting (operation 602 ) the layout of a UI palette may comprise determining (operation 606 ) a maximum number of UI elements per row for a UI palette displayed by the UI. Operation 606 enables a UI developer to best lay out palettes and UI elements for users given the limited space that may be available on processing devices.
- scaling for display of UI elements in UI palettes may be a programming operation associated with a display class rule set.
- display class rule sets may act as scaling plateaus. For instance, evaluation of a display size of a processing device may determine that a diagonal of a display for a processing device is four inches, and the display of the processing device has a resolution that is 360 pixels in width by 640 pixels in length. Thus, the display class rule set may determine that for a display having an effective pixel width of 360 pixels, a maximum of 8 rows should be displayed, each having a maximum of 4 commands per row. In another example, evaluation of the processing device may determine that a diagonal of a display for a processing device is 5.5 inches, and the display of the processing device has a resolution that is 512 pixels in width by 853 pixels in length. Thus, the application command control, based on its scaling plateaus, may determine to display up to 12 rows and up to 6 commands per row of commands in palettes that are displayed.
- a height and/or width is determined for a UI palette that is being generated.
- Setting (operation 602 ) the layout of a UI palette comprises determining a height/width for a UI palette displayed by the UI.
- scaling for display of UI elements in UI palettes may be a programming operation associated with a display class rule set.
- the determined height is an amount of space the UI palette takes up length-wise on a display of a processing device when a UI window is in an open state displaying the UI palette.
- the determined width is an amount of space the UI palette takes up from side to side on a display of a processing device when a UI window is in an open state displaying the UI palette.
- height/width values of a palette are set by a program developer of the UI and may be specified in the display class rule set for generating a form factor UI to display on a processing device.
- users of the UI may be prevented from resizing a UI palette.
- height/width of a palette may be adjustable, for example during operation by a user.
- height/width of a UI palette may be set for cases where the UI is displayed in different states (e.g., open state, minimized state, invisible state, etc.). In different states, a height/width of one palette or all palettes displayed may be set to a predetermined value.
- a palette height may be set to a percentage (e.g., 55%) of a height of a display for the processing device when the UI is operating in an open state.
- a height of a palette may be set to different percentage (e.g., smaller percentage) of a display when the UI is operating in a minimized state.
- a height of the palette may be set as a percentage or as a fixed number of pixels. Height or width values of a palette may also be conditional based on launching of other software components by the OS, such as an SIP and panes.
- conditions may be set such that when the palette sits above an SIP or below a pane, its height/width is equal to the remaining space for display of a UI palette on the processing device.
- when a software component such as an SIP or pane is closed or minimized, this may adjust the height and/or width of a UI palette.
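The height rules above — a percentage per UI state, overridden by the remaining space when an SIP is showing — might be sketched as follows. The 55% open-state figure comes from the text; the minimized-state percentage is an assumption:

```python
# Hedged sketch of the conditional palette-height rules: a fixed
# percentage of display height per UI state, except when an SIP
# occupies part of the display, in which case the palette takes the
# remaining space.
def palette_height(display_height_px, state, sip_height_px=0):
    if sip_height_px:                  # palette sits above the SIP
        return display_height_px - sip_height_px
    # 55% for the open state per the text; 15% for minimized is assumed.
    percentages = {"open": 0.55, "minimized": 0.15}
    return int(display_height_px * percentages.get(state, 0))
```

When the SIP closes, calling the function again with `sip_height_px=0` yields the state-based height, mirroring the dynamic adjustment described above.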
- a size, style and layout of UI elements may be determined for a UI palette.
- Setting (operation 602 ) the layout of a UI palette may comprise setting (operation 610 ) of a size, style and layout of UI elements in a palette displayed by the UI. Description relating to setting a size, style and layout of a UI element is provided in operation 506 of FIG. 5 .
- Operation 610 encompasses operations described in operation 506 as applied to generation of a UI palette. Operation 610 enables a UI developer to best lay out palettes and UI elements for users given the limited space that may be available on processing devices.
- scaling for display of UI elements in UI palettes may be a programming operation associated with a display class rule set.
- Setting (operation 610 ) of UI elements comprises setting a layout for at least one UI palette, such as a top-level palette and/or a drill-in palette.
- Each of the top-level palettes and the drill-in palettes is a collection or grouping comprising one or more selectable UI elements.
- An exemplary top-level palette is shown in FIG. 8 and described in the description of FIG. 8 .
- An exemplary drill-in palette is shown in FIG. 9 and described in the description of FIG. 9 .
- Operation 610 may utilize a UI definition in determining a layout of a UI palette.
- a UI definition may specify how programmed command objects can be grouped or nested and actual layout of the command objects as UI elements may be set by applying a display class rule set associated with a determined display class.
- a UI definition may comprise command grouping data being information relating to the grouping of command objects including associations between command objects. For example, text editing features such as bolding, underlining, italicization, superscript and subscript may be associated and commonly used. Ideally, a UI window would include all of these commonly used functions on the same UI palette. However, due to limitations on the screen size, certain commands may need to be separated.
- Command grouping data is information that identifies associations and what commands should or should not be separated from each other. For example, it may be determined that the maximum number of rows and commands allows displaying of text formatting commands including a superscript editing command in a top-level palette but would not also allow displaying of a subscript command. Using the command grouping data, it may be identified that, from a functionality and/or usability standpoint, it is best not to separate the superscript and subscript editing commands. For instance, a user who makes a subscript text edit may later look to make a superscript edit or vice versa.
- operation 610 may take into account command grouping data of a UI definition to determine an optimal UI display window that accounts for the display size of a device upon which the UI is displaying.
- operation 610 may display a higher-level command for text editing in a top-level palette and the superscript and subscript editing commands may be included in a drill-in palette (child palette) of that top-level palette (parent palette) so that such commands remain associated.
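The superscript/subscript example suggests a grouping-aware assignment: a group that does not fit in the top-level palette moves to a drill-in palette as a unit rather than being split. A hypothetical sketch, where counting capacity by element count is an assumption:

```python
# Hedged sketch of applying command grouping data: commands in the
# same group (e.g., superscript/subscript) are never split across
# palettes. If a whole group does not fit in the remaining top-level
# capacity, the group moves to a drill-in (child) palette as a unit.
def assign_palettes(groups, top_level_capacity):
    top_level, drill_in = [], []
    used = 0
    for group in groups:               # each group: list of command ids
        if used + len(group) <= top_level_capacity:
            top_level.extend(group)
            used += len(group)
        else:
            drill_in.append(group)     # kept together in a child palette
    return top_level, drill_in
```

With a capacity of three, the bold/italic group fits in the top-level palette while the superscript/subscript pair falls through to a drill-in palette together, matching the behavior described above.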
- flow may proceed to decision operation 612 where it is determined whether a display size change has occurred. If a display size change is not detected flow branches NO and method 600 ends. Method 600 may be implemented/re-implemented any time a UI window is to be instantiated or a display size change is detected. If a display size change is detected (e.g., resolution change or connection of another device is detected) flow branches YES and returns back to operation 602 where a layout of palettes of a UI may be set or changed.
- FIG. 7 is a diagram illustrating display for exemplary processing devices of different sizes with which aspects of the present disclosure may be practiced. Examples shown in FIG. 7 comprise processing devices having varying sizes and/or varying screen/display sizes, for example processing device 702 , processing device 704 , processing device 706 and processing device 708 .
- a UI control (e.g., application command control) and an application/canvas are examples of components of a UI to which the present disclosure may apply.
- the UI is programmed to efficiently scale itself to utilize display space of processing devices of different sizes and/or operating size of display windows. For example, presentation of a UI control and/or application/canvas may vary across the different processing devices 702 - 708 .
- UI control and/or an application/canvas may be scaled according to a determined display class associated with a processing device.
- An application/canvas is a portion of a display of a processing device that is designated for display of an application executing on the device.
- the application/canvas region is the application UI that shows effects implemented by actions executed via a UI control. That is, the application/canvas is the content consisting of but not limited to the pages in workspace or editable portions of an application.
- a UI control hosts a majority of an application's command set, organized in a hierarchical structure of individual palettes, chunks, and commands. Further, application command control may be programmed to dynamically interact with an application and display simultaneously with applications and/or user interface components such as a soft input panel (SIP) or on screen keyboard. In one example, a UI control may intelligently adapt based on content of an application (e.g., displayed or selected on an application canvas).
- a UI control comprises a plurality of palettes (command palettes) programmed for application control. In one example, palettes of an application command control comprise top-level palettes and drill-in palettes.
- Each of the top-level palettes and the drill-in palettes is a collection or grouping of rows comprising one or more selectable UI elements.
- a top-level palette may comprise a highest level grouping of UI elements including commands that are more frequently used/more likely to be used by users.
- a top-level palette may display command listings that can be drilled into and displayed in drill-in palettes.
- FIG. 8 illustrates an exemplary top-level palette of an application command control.
- a drill-in palette is a collection or grouping of commands that may be used less frequently/or likely to be used less frequently compared to the commands displayed on a top-level palette.
- drill-in palettes host over-flow commands that, due to constraints resulting from a limited amount of display space for an application command control, are not included in a top-level palette.
- FIG. 9 illustrates an exemplary drill-in palette of an application command control.
- a top-level palette may comprise high-level UI elements for text editing, font editing, paragraph formatting, word finder, spell-check etc. that may be frequently used by users of an application.
- a drill-in palette for a word processing application may comprise sub-elements of such high-level commands of the top-level palette, for example, subscript or superscript commands for a font command/function.
- organization of palettes and UI elements may be editable, for example, where an element or a chunk of a palette can be pulled from one palette and added/displayed in another. For instance, an overflow command of a drill-in palette can be added to a top-level palette.
- Examples of common components that make up a top-level palette or a drill-in palette include but are not limited to: a palette bar and palette title, palette switching feature (including one touch target that launches palette switcher from title of palette bar), UI element to dismiss palette (e.g., visual representation of ellipses), quick command elements (e.g., undo or redo), palette canvas comprising a plurality of elements, chunk UI elements (e.g., groupings of commands) and chunk dividers (e.g., dividing different groupings of commands), drill-in features to access drill-in palettes (when applicable).
- palettes of a UI are presented in a vertical layout.
- a top-level palette and a drill-in palette are vertically scrollable and comprise a collection of rows comprising one or more selectable UI elements.
- setting of the layout of a palette may also comprise presenting UI elements in a horizontal layout where palettes are horizontally scrollable.
- no limit is set on the scrollable height of a palette. Scrolling position may be kept on top-level palettes when switching between top-level palettes; however, scrolling position may or may not be kept for drill-in palettes.
- UI elements set and displayed may include labels identifying an element and may be configured to take up an entire row of a palette.
- multiple elements may be displayed in one row of a palette. Scaling is applied when setting and displaying commands in palette rows.
- UI elements may not have labels, for example, commands that are well known or have images displayed that are well known to users. Separators or spacers (either horizontal or vertical depending on layout of palette) may be displayed to break up different elements or chunks of elements.
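The row rules above (a labeled element occupying a full palette row, unlabeled icon-only elements sharing a row, with separators between chunks) can be sketched roughly as follows. This is an illustrative sketch only; the `Element` type, the per-row limit of three, and all names are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Element:
    command: str
    label: Optional[str] = None  # None for well-known, icon-only commands

def layout_rows(elements: List[Element], max_per_row: int = 3) -> List[List[Element]]:
    """Assign elements to palette rows per the rules described above."""
    rows: List[List[Element]] = []
    current: List[Element] = []
    for el in elements:
        if el.label is not None:
            # A labeled element takes up an entire row of the palette.
            if current:
                rows.append(current)
                current = []
            rows.append([el])
        else:
            # Unlabeled elements may share a row, up to the per-row limit.
            current.append(el)
            if len(current) == max_per_row:
                rows.append(current)
                current = []
    if current:
        rows.append(current)
    return rows
```

A renderer could then draw separators (horizontal or vertical, depending on palette layout) between the returned rows or between chunks of rows.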
- In FIG. 8, an exemplary top-level palette 802 of a UI control is illustrated.
- In FIG. 9, an exemplary drill-in palette 902 of a UI control is illustrated.
- UI control shown in FIG. 9 is a drill-in palette 902 of the UI top-level palette 802 shown in FIG. 8 .
- UI top-level palette 802 is a parent palette of the UI drill-in palette 902 (i.e., drill-in palette 902 is a child palette of top-level palette 802).
- a row showing a “font formatting” command includes a caret indicative of a drill-in feature.
- content of drill-in palette 902 is displayed on a display of a processing device. For instance, in FIG. 9, font formatting sub-elements "superscript" and "subscript" are displayed. In this way, an application command control and/or an application/canvas may be scaled in accordance with a determined display class associated with a processing device.
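As a rough illustration of the parent/child palette relationship described above (a top-level palette such as palette 802 parenting a drill-in palette such as palette 902), consider the following sketch. The class and method names are invented for illustration and are not taken from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Palette:
    title: str
    # Each palette is a collection of rows of one or more selectable UI elements.
    rows: List[List[str]] = field(default_factory=list)
    # Drill-in (child) palettes hosting less frequently used / overflow commands.
    drill_ins: List["Palette"] = field(default_factory=list)

    def find_drill_in(self, title: str) -> Optional["Palette"]:
        """Locate a child palette by title, as a drill-in feature might."""
        for child in self.drill_ins:
            if child.title == title:
                return child
        return None

# A top-level palette (cf. palette 802) parenting a drill-in palette (cf. 902).
home = Palette("Home", rows=[["Bold", "Italic"], ["Font Formatting"]])
font = Palette("Font Formatting", rows=[["Superscript", "Subscript"]])
home.drill_ins.append(font)
```

The editable organization described above (pulling a chunk from one palette into another) would correspond to moving entries between `rows` lists of different `Palette` instances.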
Abstract
Description
- This application claims the benefit of U.S. Provisional Application No. 62/076,368, filed on Nov. 6, 2014, which is hereby incorporated by reference in its entirety.
- Devices such as personal computers (PCs), laptops, slates, and phones offer a wide range of screen sizes. However, there is no established method for scaling a user interface (UI) across a large range of screen sizes, from very large displays down to smaller displays. It is with respect to this general technical area that the present application is directed.
- Non-limiting examples of the present disclosure describe programmatic generation of a user interface for display on a processing device. A display class is determined from a plurality of display classes based on a detected display size of a processing device on which the user interface is to display. Prior to instantiating a user interface window, a stored user interface definition is identified and interpreted. The stored user interface definition comprises at least one programmed command object. A displayed user interface is instantiated on the processing device, where the displayed user interface comprises at least one user interface element. The user interface element is programmatically generated by translating the programmed command object of the user interface definition into the user interface element based on operations set in accordance with the determined display class.
- In other non-limiting examples, a user interface is generated that comprises a user interface control palette usable to control an application. A display class is determined from a plurality of display classes based on a detected display size of a processing device on which the user interface is to display. Prior to instantiating a user interface window, a stored user interface definition is identified and interpreted. The stored user interface definition comprises a plurality of groupings of programmed command objects. A displayed user interface is instantiated on the processing device, where the displayed user interface comprises at least one user interface control palette. The user interface control palette is programmatically generated by translating the plurality of groupings of programmed command objects of the user interface definition into a plurality of groupings of user interface elements based on operations set in accordance with the determined display class.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Additional aspects, features, and/or advantages of examples will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the disclosure.
- Non-limiting and non-exhaustive examples are described with reference to the following figures.
FIG. 1 is a block diagram illustrating an example of a computing device with which aspects of the present disclosure may be practiced.
FIGS. 2A and 2B are simplified block diagrams of a mobile computing device with which aspects of the present disclosure may be practiced.
FIG. 3 is a simplified block diagram of a distributed computing system in which aspects of the present disclosure may be practiced.
FIG. 4 is an exemplary method for managing user interface definitions with which aspects of the present disclosure may be practiced.
FIG. 5 is an exemplary method for presenting a user interface with which aspects of the present disclosure may be practiced.
FIG. 6 is a diagram illustrating an exemplary method for setting a layout of a user generated interface palette with which aspects of the present disclosure may be practiced.
FIG. 7 is a diagram illustrating display for exemplary processing devices of different sizes with which aspects of the present disclosure may be practiced.
FIG. 8 is a diagram illustrating user interface examples with which aspects of the present disclosure may be practiced.
FIG. 9 is a diagram illustrating user interface examples with which aspects of the present disclosure may be practiced. - Users of processing devices desire applications to be optimized in a form-factor manner. However, there is no established method of scaling a user interface (UI) across a large range of screen sizes with such an approach. Simply attempting to merge a large screen version of an application with a small screen version of an application creates complications. As an example, when large screen versions of applications are executed on devices having smaller display sizes, the UI gets too crowded and touch targets become too small. Additionally, another complication is that UIs are not traditionally scalable across devices having different display sizes. For instance, a user of a processing device may be viewing an application on a device having a smaller display size (e.g., mobile phone) and proceed to connect the device having the small screen display to a device having a larger display size (e.g., PC). Attempted resizing of an application across differing display sizes may drastically affect the display and operation of the UI for an application and/or UI control. In other cases where different versions of an application are developed (e.g., mobile version and desktop version), systems are typically unable to recognize that a UI is to be scaled to a different programmed version to account for display size changes. Other instances of building UI packages may incorporate a scaling model for large and small screen devices but are only able to show a single type of UI (e.g., phone version or slate version) once an application is installed. This may limit a user's ability to connect to large display screens and enjoy UI that takes advantage of available display space.
- In non-limiting examples, systems and methods describe programmatic generation of user interfaces in a form-factor manner that accounts for the display size of the processing device upon which an application/UI is displayed. Programming operations are applied to generate multiple sets of UI controls and layouts and, based on the size of a display, to determine which of the UI controls and layout algorithms to apply when instantiating a user interface. Examples of the present disclosure comprise evaluation of display class information associated with an application UI at runtime of the application to identify a class of display (e.g., large screen/tablet/slate/phablet/phone, etc.). Display class information may be used to determine whether to display a UI optimized for larger screen devices, smaller screen devices, or something in-between. One or more UI layouts may be generated for each of a plurality of display classes. When a display class is identified based on the detection of a display size of a processing device upon which an application/UI is running, a user interface may be displayed based on a UI layout associated with the determined display class. Examples described enable a UI to be tailored at run-time in a form-factor manner without requiring any changes to code or a user interface definition. Exemplary UIs generated provide visually different experiences tailored to the processing device that an application/UI is executing upon. At the same time, examples preserve the semantic meaning of commands/user interactions and the advantages of a definition-based UI.
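The runtime display-class evaluation described above can be sketched as a mapping from a detected display size to one of several display classes, each selecting its own layout operations. This is a hedged, minimal sketch: the class names, the diagonal-inch thresholds, and the layout descriptions are illustrative assumptions, not values stated in the disclosure.

```python
def determine_display_class(diagonal_inches: float) -> str:
    """Map a detected display size to a display class (assumed thresholds)."""
    if diagonal_inches < 4.5:
        return "phone"
    if diagonal_inches < 7.0:
        return "phablet"
    if diagonal_inches < 13.0:
        return "tablet"
    return "large-screen"

# Each display class is associated with its own set of UI controls and layout
# operations; a new class can be looked up at runtime, e.g., when the device is
# connected to a larger external display.
LAYOUTS = {
    "phone": "single-column palette with SIP-aware sizing",
    "phablet": "single-column palette with larger touch targets",
    "tablet": "multi-column palette",
    "large-screen": "ribbon-style layout",
}

def layout_for(diagonal_inches: float) -> str:
    """Choose the layout operations to apply when instantiating the UI."""
    return LAYOUTS[determine_display_class(diagonal_inches)]
```

Because the class is resolved at run-time, the same stored UI definition can be re-rendered under a different layout without changes to code or to the definition itself, which is the behavior the passage above describes.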
- A number of technical advantages are achieved based on the present disclosure including but not limited to: improved scalability of UI for applications, consistent functionality for a UI that accounts for varying display sizes, visually appealing presentation of UI controls, enhanced processing capability across devices of varying display sizes including improved efficiency and usability for UI controls, improved efficiency in navigation and access to control content for applications/UIs, improved efficiency in programming application command objects, and improved user interaction with applications/UI controls, among other examples.
FIGS. 1-3 and the associated descriptions provide a discussion of a variety of operating environments in which examples of the invention may be practiced. However, the devices and systems illustrated and discussed with respect to FIGS. 1-3 are for purposes of example and illustration and are not limiting of a vast number of computing device configurations that may be utilized for practicing examples of the invention, described herein. -
FIG. 1 is a block diagram illustrating physical components of a computing device 102, for example a mobile processing device, with which examples of the present disclosure may be practiced. In a basic configuration, the computing device 102 may include at least one processing unit 104 and a system memory 106. Depending on the configuration and type of computing device, the system memory 106 may comprise, but is not limited to, volatile storage (e.g., random access memory), non-volatile storage (e.g., read-only memory), flash memory, or any combination of such memories. The system memory 106 may include an operating system 107 and one or more program modules 108 suitable for running software programs/modules 120 such as I/O manager 124, other utility 126 and application 128. As examples, system memory 106 may store instructions for execution. Other examples of system memory 106 may store data associated with applications. The operating system 107, for example, may be suitable for controlling the operation of the computing device 102. Furthermore, examples of the invention may be practiced in conjunction with a graphics library, other operating systems, or any other application program and is not limited to any particular application or system. This basic configuration is illustrated in FIG. 1 by those components within a dashed line 122. The computing device 102 may have additional features or functionality. For example, the computing device 102 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 1 by a removable storage device 109 and a non-removable storage device 110. - As stated above, a number of program modules and data files may be stored in the
system memory 106. While executing on the processing unit 104, program modules 108 (e.g., Input/Output (I/O) manager 124, other utility 126 and application 128) may perform processes including, but not limited to, one or more of the stages of the operations described throughout this disclosure. Other program modules that may be used in accordance with examples of the present invention may include electronic mail and contacts applications, word processing applications, spreadsheet applications, database applications, slide presentation applications, drawing or computer-aided application programs, photo editing applications, authoring applications, etc. - Furthermore, examples of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. For example, examples of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in
FIG. 1 may be integrated onto a single integrated circuit. Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality all of which are integrated (or "burned") onto the chip substrate as a single integrated circuit. When operating via an SOC, the functionality described herein may be operated via application-specific logic integrated with other components of the computing device 102 on the single integrated circuit (chip). Examples of the present disclosure may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies. In addition, examples of the invention may be practiced within a general purpose computer or in any other circuits or systems. - The
computing device 102 may also have one or more input device(s) 112 such as a keyboard, a mouse, a pen, a sound input device, a device for voice input/recognition, a touch input device, etc. The output device(s) 114 such as a display, speakers, a printer, etc. may also be included. The aforementioned devices are examples and others may be used. The computing device 102 may include one or more communication connections 116 allowing communications with other computing devices 118. Examples of suitable communication connections 116 include, but are not limited to, RF transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, and/or serial ports. - The term computer readable media as used herein may include computer storage media. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules. The
system memory 106, the removable storage device 109, and the non-removable storage device 110 are all computer storage media examples (i.e., memory storage.) Computer storage media may include RAM, ROM, electrically erasable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the computing device 102. Any such computer storage media may be part of the computing device 102. Computer storage media does not include a carrier wave or other propagated or modulated data signal. - Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term "modulated data signal" may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
FIGS. 2A and 2B illustrate a mobile computing device 200, for example, a mobile telephone, a smart phone, a personal data assistant, a tablet personal computer, a phablet, a slate, a laptop computer, and the like, with which examples of the invention may be practiced. For example, mobile computing device 200 may be implemented to execute applications and/or application command control. Application command control relates to presentation and control of commands for use with an application through a user interface (UI) or graphical user interface (GUI). In one example, application command controls may be programmed specifically to work with a single application. In other examples, application command controls may be programmed to work across more than one application. With reference to FIG. 2A, one example of a mobile computing device 200 for implementing the examples is illustrated. In a basic configuration, the mobile computing device 200 is a handheld computer having both input elements and output elements. The mobile computing device 200 typically includes a display 205 and one or more input buttons 210 that allow the user to enter information into the mobile computing device 200. The display 205 of the mobile computing device 200 may also function as an input device (e.g., a touch screen display). If included, an optional side input element 215 allows further user input. The side input element 215 may be a rotary switch, a button, or any other type of manual input element. In alternative examples, mobile computing device 200 may incorporate more or fewer input elements. For example, the display 205 may not be a touch screen in some examples. In yet another alternative example, the mobile computing device 200 is a portable phone system, such as a cellular phone. The mobile computing device 200 may also include an optional keypad 235. Optional keypad 235 may be a physical keypad or a "soft" keypad generated on the touch screen display or any other soft input panel (SIP).
In various examples, the output elements include the display 205 for showing a GUI, a visual indicator 220 (e.g., a light emitting diode), and/or an audio transducer 225 (e.g., a speaker). In some examples, the mobile computing device 200 incorporates a vibration transducer for providing the user with tactile feedback. In yet another example, the mobile computing device 200 incorporates input and/or output ports, such as an audio input (e.g., a microphone jack), an audio output (e.g., a headphone jack), and a video output (e.g., an HDMI port) for sending signals to or receiving signals from an external device. -
FIG. 2B is a block diagram illustrating the architecture of one example of a mobile computing device. That is, the mobile computing device 200 can incorporate a system (i.e., an architecture) 202 to implement some examples. In one example, the system 202 is implemented as a "smart phone" capable of running one or more applications (e.g., browser, e-mail, calendaring, contact managers, messaging clients, games, and media clients/players). In some examples, the system 202 is integrated as a computing device, such as an integrated personal digital assistant (PDA), tablet and wireless phone. - One or
more application programs 266 may be loaded into the memory 262 and run on or in association with the operating system 264. Examples of the application programs include phone dialer programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth. The system 202 also includes a non-volatile storage area 268 within the memory 262. The non-volatile storage area 268 may be used to store persistent information that should not be lost if the system 202 is powered down. The application programs 266 may use and store information in the non-volatile storage area 268, such as e-mail or other messages used by an e-mail application, and the like. A synchronization application (not shown) also resides on the system 202 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage area 268 synchronized with corresponding information stored at the host computer. As should be appreciated, other applications may be loaded into the memory 262 and run on the mobile computing device 200 described herein. - The
system 202 has a power supply 270, which may be implemented as one or more batteries. The power supply 270 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries. - The
system 202 may include peripheral device port 230 that performs the function of facilitating connectivity between system 202 and one or more peripheral devices. Transmissions to and from the peripheral device port 230 are conducted under control of the operating system (OS) 264. In other words, communications received by the peripheral device port 230 may be disseminated to the application programs 266 via the operating system 264, and vice versa. - The
system 202 may also include a radio interface layer 272 that performs the function of transmitting and receiving radio frequency communications. The radio interface layer 272 facilitates wireless connectivity between the system 202 and the "outside world," via a communications carrier or service provider. Transmissions to and from the radio interface layer 272 are conducted under control of the operating system 264. In other words, communications received by the radio interface layer 272 may be disseminated to the application programs 266 via the operating system 264, and vice versa. - The
visual indicator 220 may be used to provide visual notifications, and/or an audio interface 274 may be used for producing audible notifications via the audio transducer 225. In the illustrated example, the visual indicator 220 is a light emitting diode (LED) and the audio transducer 225 is a speaker. These devices may be directly coupled to the power supply 270 so that when activated, they remain on for a duration dictated by the notification mechanism even though the processor 260 and other components might shut down for conserving battery power. The LED may be programmed to remain on indefinitely until the user takes action to indicate the powered-on status of the device. The audio interface 274 is used to provide audible signals to and receive audible signals from the user. For example, in addition to being coupled to the audio transducer 225, the audio interface 274 may also be coupled to a microphone to receive audible input, such as to facilitate a telephone conversation. In accordance with examples of the present invention, the microphone may also serve as an audio sensor to facilitate control of notifications, as will be described below. The system 202 may further include a video interface 276 that enables an operation of an on-board camera 230 to record still images, video stream, and the like. - A
mobile computing device 200 implementing the system 202 may have additional features or functionality. For example, the mobile computing device 200 may also include additional data storage devices (removable and/or non-removable) such as, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 2B by the non-volatile storage area 268. - Data/information generated or captured by the
mobile computing device 200 and stored via the system 202 may be stored locally on the mobile computing device 200, as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio 272 or via a wired connection between the mobile computing device 200 and a separate computing device associated with the mobile computing device 200, for example, a server computer in a distributed computing network, such as the Internet. As should be appreciated, such data/information may be accessed via the mobile computing device 200 via the radio 272 or via a distributed computing network. Similarly, such data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems. -
FIG. 3 illustrates one example of the architecture of a system for providing an application that reliably accesses target data on a storage system and handles communication failures to one or more client devices, as described above. Target data accessed, interacted with, or edited in association with programming modules 108, applications 120, and storage/memory may be stored in different communication channels or other storage types. For example, various documents may be stored using a directory service 322, a web portal 324, a mailbox service 326, an instant messaging store 328, or a social networking site 330. Application 128, IO manager 124, other utility 126, and storage systems may use any of these types of systems or the like for enabling data utilization, as described herein. A server 320 may provide a storage system for use by a client operating on general computing device 102 and mobile device(s) 200 through network 315. By way of example, network 315 may comprise the Internet or any other type of local or wide area network, and client nodes may be implemented as a computing device 102 embodied in a personal computer, a tablet computing device, and/or by a mobile computing device 200 (e.g., mobile processing device). Any of these examples of the client computing devices may obtain content from the store 316. -
FIG. 4 is an exemplary method 400 for managing user interface definitions with which aspects of the present disclosure may be practiced. As an example, method 400 may be executed by an exemplary system such as shown in FIGS. 1-3. In examples, method 400 may be executed on a device comprising at least one processor configured to store and execute operations, programs or instructions. However, method 400 is not limited to such examples. In at least one example, method 400 may be executed (e.g., computer-implemented operations) by one or more components of a processing device or components of a distributed network, for instance, a web service/distributed network service (e.g., cloud service). System components may be utilized to perform the operations described herein with respect to method 400. - In examples,
method 400 may be performed in association with an application. An application is a software component that executes on the processing device, interfacing with hardware and software components of the device. An application comprises one or more programs designed to carry out operations, and an application is associated with a UI. A user interface (UI) provides interactive or non-interactive dialog windows, task panes, etc., for managing user interaction with an application or service. Examples of tasks performed by a UI comprise presenting information and receiving input from a user. An example UI may contain one or more elements/controls, or groups of controls, such as, for example, push buttons, radio buttons, check boxes, edit boxes, text labels, list boxes, etc. An example UI includes any number of controls or other elements, such as, for example, text, vector graphics, images, video, animations, and audio. In one example, a UI exists separately from an application with which an exemplary UI is to be used. Alternately, a UI may be either integrated with the source code, or included with standard resources used by the application, such as dynamic linked library (DLL) files, among other examples. - In examples, a UI may provide application command control through UI controls. User interface control is a graphical control element that interfaces with an application that executes on the processing device (e.g., memory, processor and functions of mobile device) and software components such as an operating system (OS), applications executing on a mobile device, programming modules, input methods (e.g., soft input panel (SIP)) and command containers such as a pane or contextual menu, among other examples. As an example, a UI control is used to control execution of actions/commands for the application.
An SIP is an on-screen input method for devices (e.g., text input or voice input), and a pane is a software component that assists the function of other software running on the device, such as the OS and other software applications, among other examples. In some examples, a UI control may be integrated within an application. For instance, a UI control may be able to be launched, closed, expanded or minimized when an application is launched, closed, expanded or minimized. In other examples, a UI control is executable as its own application that interfaces with another application. For instance, a UI control may be able to be launched, closed or minimized separately from the launching of an application that is controlled by the UI control.
Method 400 begins at operation 402 where a programmed command object is received or updated. A programmed command object is code that represents a command and/or command group that is to be rendered as a UI element. A programmed command object may be included in a UI definition that is used to translate the programmed command objects into displayable and usable portions of a UI. UI elements are the displayable and usable portions of the UI that act as controls to trigger actions/commands associated with a programmed command object. In operation 402, developers may create new programmed command objects or update existing command objects for inclusion in a user interface definition that can be used to define how programmed command objects/groupings of programmed command objects are presented for display. In examples, developers that create programmed command objects may focus on the commanding/function of a programmed command object and do not necessarily need to specify how a command object may display as it is scaled across processing devices having varying display sizes. In that example, programmed command objects may be included in UI definition data and programmatically generated as UI elements in a form-factor manner using other programming operations (e.g., an application programming interface (API)) that evaluate the UI definition data. - Once programmed command objects are created or updated, flow may proceed to
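The separation described above, where a developer supplies a command's function but not its per-display-size presentation, can be sketched as follows. All names (`ProgrammedCommandObject`, `register`, `COMMAND_STORE`) are hypothetical illustrations of operations 402-404, not APIs from the disclosure.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict

@dataclass
class ProgrammedCommandObject:
    command_id: str
    # The commanding/function the developer supplies (operation 402); how the
    # command displays at each display size is decided later, when the UI
    # definition is interpreted.
    action: Callable[[], Any]
    metadata: Dict[str, str]  # e.g., label text or an icon resource name

# A store for programmed command objects (operation 404): a memory, library,
# file, or database entry usable when creating a UI definition.
COMMAND_STORE: Dict[str, ProgrammedCommandObject] = {}

def register(obj: ProgrammedCommandObject) -> None:
    """Store (or overwrite/update) a programmed command object."""
    COMMAND_STORE[obj.command_id] = obj

register(ProgrammedCommandObject("bold",
                                 action=lambda: "toggled bold",
                                 metadata={"label": "Bold"}))
```

An editable store like this also matches the passage's note that storages are editable, since entries can be added, deleted, or re-registered with changed metadata.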
operation 404 where a programmed command object is stored. In operation 404, data associated with programmed command objects may be stored in one or more memories, storages, libraries, files, or databases, etc., for use in creation of a UI definition. In one example, stored programmed command objects may be stored on a processing device upon which an application/UI is executing. In another example, stored programmed command objects may be stored on one or more separate processing devices that may be used to manage applications/UIs and UI definition data, for example. In any example, storages for programmed command objects (e.g., a database) are editable in order to add or delete controls or other elements, or to change the appearance or behavior of one or more controls or other elements. - Flow may proceed to
operation 406 where a UI definition is created or updated. A UI definition is a collection of programmed command objects for commanding an application. The UI definition is data used to translate the programmed command objects into displayable and usable portions of an application user interface. User interface definitions may be data stored in any form, for example any type of file format including linked file libraries. In examples, a UI definition includes properties for each programmed command object, including, for example, position, dimension, visibility, text, colors, opacity, borders, accessibility information, enabled state, and other states commonly associated with conventional UI controls. In one example, programmed command objects specified by the UI definition files are stored (operation 408) in a library or database comprising predefined controls and other elements, e.g., vector graphics, images, video, etc., as identified in operation 404. Alternately, a storage, memory, database, etc., may contain pointers to some or all of the controls or other programmed command objects. Such an example is useful where command objects having a relatively large file size, such as a video clip, are to be included in a rendered UI window. - In addition, the UI definition may also reference one or more event handlers or “listeners” that are to be associated with particular controls, groups of controls, other elements, or entire UI windows so that the controls, elements, or UI windows are capable of interacting with an associated application. In alternate examples, these listeners are either stored along with the UI definitions, or in a separate listener memory, storage, file, database, etc.
After reading the user interface descriptions, either at or during application run time, the controls or other elements described by the UI definition files are read from a library or database file of controls and elements, associated with the specified listeners, and then used to automatically instantiate extensible user interfaces.
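The shape of such a UI definition entry can be sketched as plain data carrying the properties enumerated above, together with a listener slot; the field names and the flat list below are illustrative assumptions, not the actual definition file format contemplated by the disclosure:

```cpp
#include <functional>
#include <string>
#include <vector>

// Hypothetical sketch of one programmed command object entry in a UI
// definition, carrying the display properties the text enumerates.
struct CommandObjectDefinition {
    std::string id;            // command identifier, e.g. "font.bold"
    int x = 0, y = 0;          // position
    int width = 0, height = 0; // dimension
    bool visible = true;       // visibility
    bool enabled = true;       // enabled state
    std::string text;          // label for the rendered UI element
    double opacity = 1.0;
    std::function<void()> onActivate; // listener for interaction events
};

// A UI definition is a collection of programmed command objects.
struct UIDefinition {
    std::vector<CommandObjectDefinition> commands;
};

// Look up a command object by id; returns nullptr when absent.
inline const CommandObjectDefinition* findCommand(const UIDefinition& def,
                                                  const std::string& id) {
    for (const auto& c : def.commands)
        if (c.id == id) return &c;
    return nullptr;
}
```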
- Flow may proceed to operation 408 where user interface definitions are stored for use in generation of user interfaces for display. In one example, user interface definitions or descriptions may be stored (operation 408) separate from the application code with which the user interfaces are intended to be used. User interface definitions or definition files can be changed at any time prior to running an underlying application, or alternately, at any time prior to displaying the user interface, to provide uniquely configured user interface windows which are then automatically instantiated when rendering the user interface windows. This concept offers several major advantages. For instance, changes to any of the user interface windows do not require editing and recompiling of the associated application source code. Consequently, the number of potential errors that may be introduced into an application is dramatically reduced because the application itself is not edited to modify the UI windows associated with that application. Further, the use of separate UI definition files allows for a common baseline application source code, regardless of the user interfaces that are associated with that application. Alternately, as noted above, the UI descriptions are stored/included (operation 408) in either the application code itself, or in one or more linked files, such as a DLL file, or other files which are included in the application as the application is compiled, rather than including the descriptions in separate UI definition files. Regardless of where the UI descriptions are located, UI descriptions are interpreted and treated in the same manner prior to automatic instantiation of the UI windows.
-
FIG. 5 illustrates an exemplary method 500 for presenting a user interface with which aspects of the present disclosure may be practiced. As an example, method 500 may be executed by an exemplary system such as shown in FIGS. 1-3. In examples, method 500 may be executed on a device comprising at least one processor configured to store and execute operations, programs or instructions. However, method 500 is not limited to such examples. In at least one example, method 500 may be executed (e.g., computer-implemented operations) by one or more components of a processing device or components of a distributed network, for instance, web service/distributed network service (e.g. cloud service). System components may be utilized to perform the operations described herein with respect to method 500. -
Method 500 begins at operation 502 where a display class is determined based on a detected display size of a processing device upon which an application/UI is to be displayed. Display class determination provides an abstraction for determining the size of a display. Operations (e.g., code) can query display class information to determine a UI instance to instantiate depending on the display size of the processing device that an application window is to execute upon. A display class can be defined for processing devices having display sizes that fall within the range associated with the display class. That is, display classes act as transition points for UI experiences. A display class is a value that is determined based on a maximum display size. The value for a display class may be in any form, including numeric values and elements of speech, as examples. For instance, display classes may be set to correspond with different types of processing devices (e.g., laptops, PCs, tablets, phones, etc.) where an exemplary display class may be “<=Phone” or “<=Tablet”. In another example, display classes may be set based on numeric values. For example, a display class may be identified using numeric values (e.g., 0 to 3 inches). In any examples, display classes are used to classify processing devices in accordance with display size. For example, a display class may be set for processing devices having a display size falling in a range from 0 to 3 inches where another display class may be set for processing devices having a display size in a range from 3.1 to 5 inches, and so on. A range for values of display classes may fall between 0 and infinity. In one example, operations for display class determination are written in the style of successive less than or equal to (<=) checks, with an else for everything greater than a defined display class. In this example, additional display class designations may be easily added without having to change operational code behavior.
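The successive less-than-or-equal style described above might look like the following sketch; the 0-3 inch and 3.1-5 inch ranges come from the examples above, while the class names and the 7-inch cut-off are assumptions added for illustration:

```cpp
// Illustrative display classes keyed to screen-diagonal ranges.
enum class DisplayClass { Phone, Phablet, Tablet, Desktop };

// Successive <= checks with an else for everything greater, so a new
// display class can be added without changing existing check behavior.
inline DisplayClass displayClassFor(double diagonalInches) {
    if (diagonalInches <= 3.0) return DisplayClass::Phone;   // 0 to 3 inches
    if (diagonalInches <= 5.0) return DisplayClass::Phablet; // 3.1 to 5 inches
    if (diagonalInches <= 7.0) return DisplayClass::Tablet;  // hypothetical cut-off
    return DisplayClass::Desktop;                            // everything greater
}
```

Adding a new class is a single inserted check; callers that only compare the returned value are unaffected.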
However, one skilled in the art will recognize that display class designations including minimum and/or maximum values for ranges of display classes can be defined in any possible way that can be useful in defining user interface interaction. In examples, a minimum value of a display class may be a value that is equal to or greater than a maximum value of the display class which is directly smaller than the display class being defined. For instance, as in an example above, a first display class may correspond to a range for devices having displays between 0 and 3 inches and a minimum value of a second display class may take into account a maximum value of the first display class (e.g., 3 inches) and set the minimum value of the second display class at 3.1 inches, for instance. Display classes may be changed over time based on programmer prerogative, analysis/testing/use cases, etc. - A plurality of display classes may be set and
operation 502 may determine a display class from the plurality of available display classes. In one example, operation 502 occurs upon or before launching of a new application or UI window, or when a display size change is detected, such as when an application window is moved to a display of a different size. However, one skilled in the art will recognize that detection of a display size and determination of a display class may occur at any time after powering on of a processing device. For instance, operation 502 may execute in the background of an operating system (OS) while other applications are operating in the foreground of the OS. This may minimize the delay in launching of a UI instance and improve an overall user experience. - In one example,
operation 502 utilizes an API accessing a shared library of data (e.g., a dynamic link library (DLL)) to determine a display class. As one example, exemplary operational code associated with a display class determination (e.g., display class event) is not limited to but may be similar to: -
/** Interface to register against for display class change events */
struct IDisplayClassInformation : public Mso::IRefCounted
{
public:
    /** Returns the event store for display class change events. This event store will be invoked:
        - Whenever the running application changes to a different display with a new display class
        - Whenever the active display changes its DPI */
    virtual DisplayClassChangedEvent& DisplayClassChanged( ) = 0;
    virtual DisplayClass GetCurrentDisplayClass( ) const = 0;
};
/** Get a DisplayClassInformation reference on the active UI thread */
MSOCPPAPI_(Mso::TCntPtr<Mso::DisplayClassInformation::IDisplayClassInformation>) MakeDisplayClassInformation( );
- A processing device may be any device comprising a display screen, at least one memory that is configured to store operations, programs, instructions, and at least one processor that is configured to execute the operations, programs or instructions such as an application/application command control. Display size is a measurement of viewable area for display on a processing device. As an example, display size is a measurement associated with active viewable image size of a processing device. In other examples, display size may be associated with a nominal size value. In one example, detecting of the display size comprises detecting a measurement value for a screen diagonal of a display of a processing device. However, display size determination is not limited to such an example. Factors that may be evaluated to determine a display size include but are not limited to: screen diagonal of a display, dot density (e.g., dots per inch (DPI)), pixel density (e.g., pixels per inch (PPI)), physical size of a screen/display, use case distance of a display from a user, display length, and display width, among other examples. As an example, display size may be a measurement value associated with effective resolution of a display for a processing device.
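As a sketch of how the factors listed above relate, the physical screen diagonal can be derived from pixel dimensions and dot density; this is an illustrative calculation, not the disclosure's prescribed detection method:

```cpp
#include <cmath>

// Sketch: derive the physical screen diagonal (in inches) from pixel
// dimensions and dot density (DPI), two of the factors listed above.
inline double screenDiagonalInches(int widthPx, int heightPx, double dpi) {
    double widthInches = widthPx / dpi;   // physical width
    double heightInches = heightPx / dpi; // physical height
    return std::sqrt(widthInches * widthInches + heightInches * heightInches);
}
```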
Measurement of effective resolution is an example of a value used to evaluate display form factors with a common metric, and enables UI scaling to be classified into different display classes. However, one skilled in the art will recognize that any common metric relative to display size can be applied in
exemplary method 500. In alternative examples, factors other than display size may impact UI adaptation. Examples include but are not limited to: processing device orientation, processing device operational mode (e.g., keyboard mode, touch mode, ink mode, etc.), application window size, screen aspect ratio, and screen effective resolution, among other examples. - In one example,
operation 502 may comprise a program instruction or module that can identify and evaluate system specifications for a processing device such as a mobile device or personal computer. For instance, a programming instruction implemented in operation 502 identifies a type or version of the processing device and executes a fetch of data to identify system information of the processing device. In another example, a programming instruction or module may reference manufacturer specification information to determine a value associated with display size of a processing device. - Flow may proceed to operation 504 where a stored UI definition is identified and interpreted. Creating and storing of UI definitions are described in the description of
method 400 of FIG. 4. A UI definition may define commanding objects and groupings of commanding objects to be laid out as UI elements when a UI is generated. As described above, a UI definition is data used to translate the programmed command objects into displayable and usable portions of an application user interface. In examples, operation 504 occurs at runtime of an application/UI prior to instantiating a user interface window for display. - In examples, operation 504 may comprise parsing the stored UI definition and generating an object model for UI commanding upon which an actual user interface can be laid out for display to a user. An object model is a collection of programmed command objects or groupings/classes through which programming operations can be used to parse and manipulate the programmed command objects. The object model represents commanding objects and groupings of commanding objects that are to be rendered as UI. In examples described herein, UI listeners may work with a generated object model and be notified of user interactions with programmed command objects.
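Generating an object model with attached listeners from a parsed definition can be sketched as follows; the node type and the flat list of command ids are hypothetical simplifications of real definition parsing:

```cpp
#include <functional>
#include <string>
#include <vector>

// Hypothetical object model node: a programmed command object plus the
// listeners notified when a user interacts with it.
struct CommandNode {
    std::string id;
    std::vector<std::function<void(const std::string&)>> listeners;

    void notify() const {  // simulate a user interaction with the node
        for (const auto& listener : listeners) listener(id);
    }
};

// Groupings of command nodes form the object model an actual user
// interface is laid out from.
struct ObjectModel {
    std::vector<CommandNode> nodes;
};

// Sketch of operation 504: "parse" command ids from a stored UI
// definition into an object model (real file-format parsing elided).
inline ObjectModel buildObjectModel(const std::vector<std::string>& commandIds) {
    ObjectModel model;
    for (const auto& id : commandIds) model.nodes.push_back(CommandNode{id, {}});
    return model;
}
```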
- Flow may proceed to
operation 506 where a user interface is instantiated and displayed on a processing device in a form factor manner. Operation 506 may comprise generating a user interface that comprises one or more user interface elements. As identified above, UI elements are the displayable and usable portions of the UI that act as controls to trigger actions/commands associated with a programmed command object. In generating a displayable UI, operation 506 programmatically generates the UI element(s) by translating the programmed command object into the user interface element based on operations set in accordance with the determined display class. In examples, operation 506 may employ a component configured to create a UI element that triggers an action/operation of the programmed command object, the component executing operations to create UI elements through programs/code/applications/APIs/modules, etc. One skilled in the art will recognize that UI elements may have varying levels of complexity depending on a type of UI element that is being created and the properties of that UI element, including but not limited to: actions, data and associations, button controls, combo boxes, galleries, color, effect, caching, triggers, text/font, height/width, style, alignment, focus, language, and margins, among other examples. In one example, UI elements may be controls generated in extensible application markup language (XAML). However, one skilled in the art will recognize that UI elements may be generated in any form including any programming language, including markup languages, among other examples. - In examples,
operation 506 generates a UI and presents it on a display of a processing device based on display class rules (e.g., a display class rule set) for interpreting the UI definition. A display class rule set is a set of pre-defined dynamic layout rules for display classes that account for a detected display size of a processing device running an application/UI. A display class rule set may be defined for generating a UI from a stored UI definition taking into account a display class designation as determined in operation 502. In examples, display class rule sets may be programmed to be associated with one or more display classes. In one example, a display class rule set may be programmed for each of a plurality of display classes. In other examples, a display class rule set may be programmed to cover more than one display class. For instance, a display class rule set may be utilized to set a UI layout for small screen processing devices or large screen processing devices, these being classifications that cover multiple display classes. Operation 506 generates and displays a UI comprising one or more UI elements by evaluating a UI definition in accordance with display class rules. In one example, a generated UI is an application command control that comprises a plurality of palettes (UI palettes) programmed for application control. A palette is a hierarchical organization of UI elements including a plurality of groupings of UI elements usable for UI command control. Further description regarding palettes is provided in the description of FIGS. 6-9. - Dynamic layout of the automatically instantiated UI windows is accomplished by applying the display class rule set to a UI definition. In one example, display class layout rules are provided in a UI layout rule field or database which may be read out at the time that a UI is being instantiated. In examples, layout rules for display classes may be separately maintained from stored UI definitions and application code.
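A display class rule set of this kind might be sketched as a simple lookup from display class to layout limits; the 8-row/4-command and 12-row/6-command figures echo the scaling-plateau example given in the description of FIG. 6, while the class names and the tablet values are assumptions:

```cpp
// Sketch of a display class rule set: pre-defined dynamic layout rules
// selected by the determined display class.
enum class DisplayClass { Phone, Phablet, Tablet };

struct DisplayClassRuleSet {
    int maxRows;           // maximum rows of UI elements per palette
    int maxCommandsPerRow; // maximum UI elements per row
};

// Returns the layout rules to apply when interpreting a UI definition
// for a device of the given display class.
inline DisplayClassRuleSet ruleSetFor(DisplayClass dc) {
    switch (dc) {
        case DisplayClass::Phone:   return {8, 4};  // 360 effective px width
        case DisplayClass::Phablet: return {12, 6}; // 512 effective px width
        case DisplayClass::Tablet:  return {16, 8}; // hypothetical values
    }
    return {8, 4}; // conservative fallback
}
```

Keeping the rules in one lookup keeps them separately maintainable from the UI definitions and the application code, as the text describes.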
In that instance, programmers who create and update commanding objects do not need to consider scaling of programmed commanding objects across processing devices of differing display sizes. Dynamic layout rules for display classes may be stored in a storage, memory, database, etc., of a processing device upon which a UI is to be instantiated or one or more other processing devices usable to manage user interface programming. The display class layout rules define how a programmed command object may be translated to be generated as a UI element. That is, presentation of a programmed command object may vary depending on the determined display class associated with the processing device upon which the UI is to be displayed. In
operation 506, translating of the programmed command object into the user interface element may comprise setting a size of the user interface element based on the operations corresponding with the determined display class. For example, a programmed command object may appear in a first size (e.g., number of pixels, length by width, etc.) when displayed upon a processing device having a large display (identified by display class determination) and the same programmed command object may appear in a different size when displayed upon a processing device having a small display (identified by display class determination). In operation 506, translating of the programmed command object into the user interface element may comprise setting a style of the user interface element. For example, a programmed command object may be presented as a drop-down menu in one instance (e.g., where display space may be limited) and the same programmed command object may be presented more robustly as a plurality of visible icons in another instance, depending on the layout rules defined for the display class. In operation 506, translating of the programmed command object into the user interface element may comprise setting a layout of the user interface element based on the operations corresponding with the determined display class. For example, a grouping of programmed command objects may display only in a first arrangement in one instance and the arrangement may be different in another instance. That is, layout for each displayable UI window is interpreted in light of such display class rules. - Consequently, in examples, the layout rules for any user interface can be changed, edited, or otherwise updated any time prior to rendering the UI window to provide uniquely configured user interface windows. This may be accomplished without the need to modify the underlying application code in any way.
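The size-and-style translation described above can be sketched as below; the pixel widths and style names are illustrative assumptions, not values prescribed by the disclosure:

```cpp
#include <string>

// Sketch of operation 506: translate one programmed command object into
// a UI element whose size and style depend on the determined class.
enum class DisplayClass { Small, Large };

struct UIElement {
    std::string commandId;
    int widthPx;       // size set per display class
    std::string style; // e.g. collapse into a drop-down where space is limited
};

inline UIElement translateCommand(const std::string& commandId, DisplayClass dc) {
    UIElement element;
    element.commandId = commandId;
    if (dc == DisplayClass::Small) {
        element.widthPx = 48;        // compact size for small displays
        element.style = "drop-down"; // present options as a drop-down menu
    } else {
        element.widthPx = 96;        // roomier size for large displays
        element.style = "icon-row";  // present options as visible icons
    }
    return element;
}
```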
Display class layout rules allow for dynamic resizing and repositioning of UI windows, with a corresponding real-time automatic dynamic adjustment of UI items within already rendered UI windows.
- Flow may proceed to
decision operation 508 where it is determined whether a UI definition is updated. In one example, display class layout rules provide for the automatic addition or removal of one or more UI elements from a displayed UI window based either on user interaction with the UI window, or automatic programmatic interaction with the UI window by one or more active applications. That is, in examples, users of a UI may make changes or update a UI definition and a UI may be dynamically adjusted to account for such changes/updates. Alternatively, developers/programmers of a UI may make changes or update a UI definition and a UI may be dynamically adjusted to account for such changes/updates. If operation 508 determines that a UI definition has not been updated, flow branches NO and processing of method 500 ends. In an example where operation 508 determines that a UI definition has changed, flow may return to operation 504 where the UI definition is identified and interpreted. Flow may proceed to operation 506 where a displayed UI is instantiated based on the updated UI definition and the display class rules. -
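Decision operation 508 can be sketched as a check that re-runs interpretation and instantiation only when the stored definition has changed; the revision counter below is a stand-in for whatever change detection an implementation actually uses:

```cpp
// Sketch of decision operation 508: re-instantiate the UI only when the
// stored UI definition has been updated since it was last rendered.
struct UIDefinition {
    int revision = 0; // bumped whenever the definition is edited
};

struct UIState {
    int renderedRevision = -1; // revision last interpreted and displayed
    int instantiations = 0;    // how many times the UI was instantiated
};

// Returns true when the UI was re-instantiated (flow branches YES).
inline bool refreshIfUpdated(const UIDefinition& def, UIState& state) {
    if (def.revision == state.renderedRevision) return false; // branch NO
    // Operations 504/506 would re-run here: interpret and instantiate.
    state.renderedRevision = def.revision;
    ++state.instantiations;
    return true;
}
```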
FIG. 6 is a diagram illustrating an exemplary method 600 for setting a layout of a generated user interface palette with which aspects of the present disclosure may be practiced. As an example, method 600 may be executed by an exemplary system such as shown in FIGS. 1-3. In examples, method 600 may be executed on a device comprising at least one processor configured to store and execute operations, programs or instructions. However, method 600 is not limited to such examples. In at least one example, method 600 may be executed (e.g., computer-implemented operations) by one or more components of a processing device or components of a distributed network, for instance, web service/distributed network service (e.g. cloud service). System components may be utilized to perform the operations described herein with respect to method 600. - In examples associated with the present disclosure, a displayed UI (e.g., application command control) hosts a majority of an application's command set, organized in a hierarchical structure of individual palettes. An exemplary UI may comprise a plurality of palettes (UI palettes) programmed for application control. A palette is a hierarchical organization of UI elements including a plurality of groupings of UI elements usable for UI command control. Exemplary palettes for application command control are illustrated in
FIGS. 8-9 and further described in the description of FIGS. 7-9. Further, UI palettes may be programmed to dynamically interact with an application and display simultaneously with applications and/or user interface components such as an SIP or on screen keyboard. - Flow of
method 600 begins at operation 602 where a layout of a UI palette is set or changed based on the display class rules as applied to a UI definition. In examples, operation 602 may occur when a UI is being generated for display on a processing device. Dynamic layout of the automatically instantiated UI windows is accomplished by applying a display class rule set to a UI definition as described in the description of FIG. 5. Setting (operation 602) a layout for a UI palette may further comprise operations such as the operations shown in operations 604-610. - In
operation 604, a maximum number of rows of UI elements for a palette is determined. Setting (operation 602) the layout of a UI palette may comprise determining (operation 604) a maximum number of rows of UI elements for a UI palette displayed by the UI. Operation 604 enables a UI developer to best lay out palettes and UI elements for users given the limited space that may be available on processing devices. In operation 604, scaling for display of UI elements in UI palettes may be a programming operation associated with a display class rule set. Operation 604 determines a maximum number of rows of commands for a palette in association with the display class rule set, which is based on the display size of the processing device. Operation 602 may utilize such data in determining an optimal layout for a UI palette. - In
operation 606, a maximum number of UI elements per row may be determined for a UI palette. Setting (operation 602) the layout of a UI palette may comprise determining (operation 606) a maximum number of UI elements per row for a UI palette displayed by the UI. Operation 606 enables a UI developer to best lay out palettes and UI elements for users given the limited space that may be available on processing devices. In operation 606, scaling for display of UI elements in UI palettes may be a programming operation associated with a display class rule set. - As an example, display class rule sets may act as scaling plateaus. For instance, evaluation of a display size of a processing device may determine that a diagonal of a display for a processing device is four inches wide, and the display of the processing device has a resolution that is 360 pixels in width by 640 pixels in length. Thus, the display class rule set may determine that for a display having an effective pixel width of 360 pixels, a maximum of 8 rows should be displayed, each having a maximum of 4 commands per row. In another example, evaluation of the processing device may determine that a diagonal of a display for a processing device is 5.5 inches wide, and the display of the processing device has a resolution that is 512 pixels in width by 853 pixels in length. Thus, the
application command control 402, based on its scaling plateaus, may determine to display up to 12 rows and up to 6 commands per row in palettes that are displayed. - In
operation 608, a height and/or width is determined for a UI palette that is being generated. Setting (operation 602) the layout of a UI palette comprises determining a height/width for a UI palette displayed by the UI. In operation 608, scaling for display of UI elements in UI palettes may be a programming operation associated with a display class rule set. The determined height is an amount of space the UI palette takes up length-wise on a display of a processing device when a UI window is in an open state displaying the UI palette. The determined width is an amount of space the UI palette takes up from side to side on a display of a processing device when a UI window is in an open state displaying the UI palette. In one example, height/width values of a palette are set by a program developer of the UI and may be specified in the display class rule set for generating a form factor UI to display on a processing device. In one example, users of the UI may be prevented from resizing a UI palette. In another example, height/width of a palette may be adjustable, for example during operation by a user. In any example, height/width of a UI palette may be set for cases where the UI is displayed in different states (e.g., open state, minimized state, invisible state, etc.). In different states, a height/width of one palette or all palettes displayed may be set to a predetermined value. For instance, a palette height may be set to a percentage (e.g., 55%) of a height of a display for the processing device when the UI is operating in an open state. In another example, a height of a palette may be set to a different percentage (e.g., smaller percentage) of a display when the UI is operating in a minimized state. In examples, a height of the palette may be set as a percentage or as a fixed number of pixels. Height or width values of a palette may also be conditional based on launching of other software components by the OS, such as an SIP, and panes.
For example, conditions may be set such as when the palette sits above an SIP or below a pane, its height/width is equal to the remaining space for display of a UI palette on the processing device. When a software component such as an SIP or pane is closed or minimized, this may adjust the height and/or width of a UI palette. - In
operation 610, a size, style and layout of UI elements may be determined for a UI palette. Setting (operation 602) the layout of a UI palette may comprise setting (operation 610) a size, style and layout of UI elements in a palette displayed by the UI. Setting of a size, style and layout of a UI element is described in operation 506 of FIG. 5. Operation 610 encompasses operations described in operation 506 as applied to generation of a UI palette. Operation 610 enables a UI developer to best lay out palettes and UI elements for users given the limited space that may be available on processing devices. In operation 610, scaling for display of UI elements in UI palettes may be a programming operation associated with a display class rule set. Setting (operation 610) of UI elements comprises setting a layout for at least one UI palette such as a top-level palette and/or drill-in palette. Each of the top-level palettes and the drill-in palettes is a collection or grouping comprising one or more selectable UI elements. An exemplary top-level palette is shown in FIG. 8 and described in the description of FIG. 8. An exemplary drill-in palette is shown in FIG. 9 and described in the description of FIG. 9. -
Operation 610 may utilize a UI definition in determining a layout of a UI palette. A UI definition may specify how programmed command objects can be grouped or nested, and the actual layout of the command objects as UI elements may be set by applying a display class rule set associated with a determined display class. In examples, a UI definition may comprise command grouping data, being information relating to the grouping of command objects including associations between command objects. For example, text editing features such as bolding, underlining, italicization, superscript and subscript may be associated and commonly used. Ideally, a UI window would include all of these commonly used functions on the same UI palette. However, due to limitations on the screen size, certain commands may need to be separated. Command grouping data is information that identifies associations and what commands should or should not be separated from each other. For example, it may be determined that the maximum number of rows and commands allows displaying of text formatting commands including a superscript editing command in a top-level palette but would not also allow displaying of a subscript command. Using the command grouping data, it may be identified that, from a functionality and/or usability standpoint, it is best not to separate the superscript and subscript editing commands. For instance, a user who makes a subscript text edit may later look to make a superscript edit or vice versa. Thus, in setting the layout of commands for palettes, operation 610 may take into account command grouping data of a UI definition to determine an optimal UI display window that accounts for the display size of a device upon which the UI is displaying.
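The command grouping behavior described above (keeping superscript and subscript together) can be sketched as a layout pass that never splits a group between palettes; reducing the row/command limits to a single element count is a deliberate simplification:

```cpp
#include <string>
#include <vector>

// Sketch of command grouping data: commands that should not be
// separated (e.g., superscript and subscript) either fit into the
// top-level palette together or move, as a group, to a drill-in palette.
struct CommandGroup {
    std::vector<std::string> commands; // associated commands kept together
};

struct PaletteLayout {
    std::vector<std::string> topLevel;
    std::vector<std::string> drillIn;
};

// Place each group whole; capacity stands in for the maximum rows and
// commands per row allowed by the display class rule set.
inline PaletteLayout layoutGroups(const std::vector<CommandGroup>& groups,
                                  int topLevelCapacity) {
    PaletteLayout layout;
    for (const auto& group : groups) {
        int remaining = topLevelCapacity - static_cast<int>(layout.topLevel.size());
        if (static_cast<int>(group.commands.size()) <= remaining)
            layout.topLevel.insert(layout.topLevel.end(),
                                   group.commands.begin(), group.commands.end());
        else  // the whole group moves to the drill-in palette together
            layout.drillIn.insert(layout.drillIn.end(),
                                  group.commands.begin(), group.commands.end());
    }
    return layout;
}
```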
In one example, operation 610 may display a higher-level command for text editing in a top-level palette and the superscript and subscript editing commands may be included in a drill-in palette (child palette) of that top-level palette (parent palette) so that such commands remain associated. - Once a UI palette is set and instantiated, flow may proceed to
decision operation 612 where it is determined whether a display size change has occurred. If a display size change is not detected, flow branches NO and method 600 ends. Method 600 may be implemented/re-implemented any time a UI window is to be instantiated or a display size change is detected. If a display size change is detected (e.g., a resolution change or connection of another device is detected), flow branches YES and returns back to operation 602 where a layout of palettes of a UI may be set or changed. -
- FIG. 7 is a diagram illustrating displays of exemplary processing devices of different sizes with which aspects of the present disclosure may be practiced. Examples shown in FIG. 7 comprise processing devices having varying sizes and/or varying screen/display sizes, for example processing device 702, processing device 704, processing device 706 and processing device 708. - As shown in
FIG. 7, a UI control (e.g., application command control) and an application/canvas are displayed on exemplary processing devices 702-708. An application command control and an application/canvas are examples of components of a UI to which the present disclosure may apply. In examples, the UI is programmed to efficiently scale itself to utilize the display space of processing devices of different sizes and/or the operating size of display windows. For example, presentation of a UI control and/or application/canvas may vary across the different processing devices 702-708. A UI control and/or an application/canvas may be scaled according to a determined display class associated with a processing device. - An application/canvas is a portion of a display of a processing device that is designated for display of an application executing on the device. The application/canvas region is the application UI that shows effects implemented by actions executed via a UI control. That is, the application/canvas is the content comprising, but not limited to, the pages in the workspace or editable portions of an application.
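One way to picture scaling by display class is to map a device's screen size to a class and let a rule set for that class apportion space between the application command control and the application/canvas. The class names, thresholds, and share values below are illustrative assumptions, not figures from the disclosure.

```python
def display_class(diagonal_inches):
    """Map a screen diagonal (inches) to a hypothetical display class."""
    if diagonal_inches < 7:
        return "small"
    if diagonal_inches < 13:
        return "medium"
    return "large"

# Fraction of vertical display space a display class rule set might reserve
# for the application command control; the remainder goes to the canvas.
CONTROL_SHARE = {"small": 0.40, "medium": 0.25, "large": 0.15}

def split_display(height_px, diagonal_inches):
    cls = display_class(diagonal_inches)
    control_px = int(height_px * CONTROL_SHARE[cls])
    return cls, control_px, height_px - control_px
```

For example, a 5.5-inch phone with a 1920-pixel-tall display would, under these assumed rules, classify as "small" and give the command control a larger relative share than a desktop monitor would.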
- A UI control (application command control) hosts a majority of an application's command set, organized in a hierarchical structure of individual palettes, chunks, and commands. Further, an application command control may be programmed to dynamically interact with an application and display simultaneously with applications and/or user interface components such as a soft input panel (SIP) or on-screen keyboard. In one example, a UI control may intelligently adapt based on content of an application (e.g., displayed or selected on an application canvas). A UI control comprises a plurality of palettes (command palettes) programmed for application control. In one example, palettes of an application command control comprise top-level palettes and drill-in palettes. Each of the top-level palettes and the drill-in palettes is a collection or grouping of rows comprising one or more selectable UI elements. As an example, a top-level palette may comprise a highest-level grouping of UI elements, including commands that are more frequently used/more likely to be used by users. A top-level palette may display command listings that can be drilled into and displayed in drill-in palettes.
FIG. 8 illustrates an exemplary top-level palette of an application command control. A drill-in palette is a collection or grouping of commands that may be used less frequently, or are likely to be used less frequently, compared to the commands displayed on a top-level palette. As an example, drill-in palettes host overflow commands that, due to constraints resulting from a limited amount of display space for an application command control, are not included in a top-level palette. FIG. 9 illustrates an exemplary drill-in palette of an application command control. Using a word processing application as an exemplary application, a top-level palette may comprise high-level UI elements for text editing, font editing, paragraph formatting, word finder, spell-check, etc. that may be frequently used by users of an application. As an example, a drill-in palette for a word processing application may comprise sub-elements of such high-level commands of the top-level palette, for example, subscript or superscript commands for a font command/function. In examples, organization of palettes and UI elements may be editable, for example, where an element or a chunk of a palette can be pulled from one palette and added/displayed in another. For instance, an overflow command of a drill-in palette can be added to a top-level palette. - Examples of common components that make up a top-level palette or a drill-in palette include but are not limited to: a palette bar and palette title, a palette switching feature (including one touch target that launches a palette switcher from the title of the palette bar), a UI element to dismiss the palette (e.g., a visual representation of ellipses), quick command elements (e.g., undo or redo), a palette canvas comprising a plurality of elements, chunk UI elements (e.g., groupings of commands) and chunk dividers (e.g., dividing different groupings of commands), and drill-in features to access drill-in palettes (when applicable).
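The parent/child relationship between top-level and drill-in palettes can be sketched as a simple tree where some rows of a palette are themselves child palettes. The class and the palette contents below are illustrative assumptions modeled on the word-processing example above.

```python
class Palette:
    """A palette is a titled collection of rows; a row is either a command
    name or a child (drill-in) Palette reached via a drill-in feature."""

    def __init__(self, title, rows):
        self.title = title
        self.rows = rows

    def drill_in(self, title):
        """Return the child palette opened by the drill-in row `title`."""
        for row in self.rows:
            if isinstance(row, Palette) and row.title == title:
                return row
        raise KeyError(title)

# Hypothetical word-processing palettes: the overflow font commands live
# in a drill-in (child) palette of the top-level (parent) palette.
font = Palette("font formatting", ["superscript", "subscript"])
home = Palette("home", ["bold", "italic", font])

child = home.drill_in("font formatting")
```

Selecting the drill-in row thus navigates to the child palette, mirroring the caret-marked "font formatting" row described for FIG. 8 and FIG. 9.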
- In one example, palettes of a UI are presented in a vertical layout. For example, a top-level palette and a drill-in palette are vertically scrollable and comprise a collection of rows comprising one or more selectable UI elements. However, in other examples, setting the layout of a palette may also comprise presenting UI elements in a horizontal layout where palettes are horizontally scrollable. In some examples, no limit is set on the scrollable height of a palette. Scrolling position may be kept on top-level palettes when switching between top-level palettes; however, scrolling position may or may not be kept for drill-in palettes. UI elements set and displayed may include labels identifying an element and may be configured to take up an entire row of a palette. In other examples, multiple elements may be displayed in one row of a palette. Scaling is applied in setting and displaying commands in palette rows. In some other examples, UI elements may not have labels, for example, commands that are well known or that display images well known to users. Separators or spacers (either horizontal or vertical, depending on the layout of the palette) may be displayed to break up different elements or chunks of elements.
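The scroll-position behavior noted above (kept per top-level palette across switches, optionally reset for drill-in palettes) can be sketched as a small state holder. This is purely an assumption-level example; the class and method names are invented.

```python
class PaletteHost:
    """Remembers scroll offsets per top-level palette; drill-in palettes
    start at the top rather than restoring a saved position."""

    def __init__(self):
        self._scroll = {}   # top-level palette name -> saved scroll offset
        self.active = None

    def switch_to(self, name, is_drill_in=False):
        self.active = name
        if is_drill_in:
            return 0                        # drill-in: position not kept
        return self._scroll.get(name, 0)    # top-level: restore position

    def record_scroll(self, offset):
        self._scroll[self.active] = offset

host = PaletteHost()
host.switch_to("home")
host.record_scroll(120)        # user scrolls the "home" palette
host.switch_to("insert")       # switch away to another top-level palette
pos = host.switch_to("home")   # switching back restores the offset
```

Whether drill-in palettes also preserve position is left open in the text ("may or may not"), so this sketch simply picks the reset behavior for illustration.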
- In FIG. 8, a UI control is illustrated as an exemplary top-level palette 802. In FIG. 9, a UI control is illustrated as an exemplary drill-in palette 902. For example, the UI control shown in FIG. 9 is a drill-in palette 902 of the UI top-level palette 802 shown in FIG. 8. That is, UI top-level palette 802 is a parent palette of the UI drill-in palette 902 (e.g., a child palette of the top-level palette). As shown in top-level palette 802, a row showing a "font formatting" command includes a caret indicative of a drill-in feature. When the drill-in feature is selected, drill-in palette 902 is displayed on a display of a processing device. For instance, in FIG. 9, font formatting sub-elements "superscript" and "subscript" are displayed. In this way, an application command control and/or an application/canvas may be scaled in accordance with a determined display class associated with a processing device. - Reference has been made throughout this specification to "one example" or "an example," meaning that a particular described feature, structure, or characteristic is included in at least one example. Thus, usage of such phrases may refer to more than just one example. Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more examples.
- One skilled in the relevant art may recognize, however, that the examples may be practiced without one or more of the specific details, or with other methods, resources, materials, etc. In other instances, well-known structures, resources, or operations have not been shown or described in detail merely to avoid obscuring aspects of the examples.
- While sample examples and applications have been illustrated and described, it is to be understood that the examples are not limited to the precise configuration and resources described above. Various modifications, changes, and variations apparent to those skilled in the art may be made in the arrangement, operation, and details of the methods and systems disclosed herein without departing from the scope of the claimed examples.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/727,226 US20160132301A1 (en) | 2014-11-06 | 2015-06-01 | Programmatic user interface generation based on display size |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462076368P | 2014-11-06 | 2014-11-06 | |
US14/727,226 US20160132301A1 (en) | 2014-11-06 | 2015-06-01 | Programmatic user interface generation based on display size |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160132301A1 true US20160132301A1 (en) | 2016-05-12 |
Family
ID=55912229
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/727,226 Abandoned US20160132301A1 (en) | 2014-11-06 | 2015-06-01 | Programmatic user interface generation based on display size |
US14/726,868 Abandoned US20160132992A1 (en) | 2014-11-06 | 2015-06-01 | User interface scaling for devices based on display size |
US14/840,360 Active 2037-07-24 US11126329B2 (en) | 2014-11-06 | 2015-08-31 | Application command control for smaller screen display |
US14/880,768 Active 2038-01-03 US11422681B2 (en) | 2014-11-06 | 2015-10-12 | User interface for application command control |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/726,868 Abandoned US20160132992A1 (en) | 2014-11-06 | 2015-06-01 | User interface scaling for devices based on display size |
US14/840,360 Active 2037-07-24 US11126329B2 (en) | 2014-11-06 | 2015-08-31 | Application command control for smaller screen display |
US14/880,768 Active 2038-01-03 US11422681B2 (en) | 2014-11-06 | 2015-10-12 | User interface for application command control |
Country Status (2)
Country | Link |
---|---|
US (4) | US20160132301A1 (en) |
WO (1) | WO2017065988A1 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170054831A1 (en) * | 2015-08-21 | 2017-02-23 | Adobe Systems Incorporated | Cloud-based storage and interchange mechanism for design elements |
USD801995S1 (en) * | 2015-03-06 | 2017-11-07 | Samsung Electronics Co., Ltd | Display screen or portion thereof with graphical user interface |
US20170357424A1 (en) * | 2016-06-10 | 2017-12-14 | Apple Inc. | Editing inherited configurations |
US10228835B2 (en) * | 2016-12-23 | 2019-03-12 | Beijing Kingsoft Internet Security Software Co., Ltd. | Method for displaying information, and terminal equipment |
US20190087389A1 (en) * | 2017-09-18 | 2019-03-21 | Elutions IP Holdings S.à.r.l. | Systems and methods for configuring display layout |
US10496241B2 (en) | 2015-08-21 | 2019-12-03 | Adobe Inc. | Cloud-based inter-application interchange of style information |
US20200133644A1 (en) * | 2018-10-31 | 2020-04-30 | Salesforce.Com, Inc. | Automatic Classification of User Interface Elements |
US10725632B2 (en) | 2013-03-15 | 2020-07-28 | Microsoft Technology Licensing, Llc | In-place contextual menu for handling actions for a listing of items |
US10877635B2 (en) | 2017-05-10 | 2020-12-29 | Embee Mobile, Inc. | System and method for the capture of mobile behavior, usage, or content exposure |
US10949075B2 (en) | 2014-11-06 | 2021-03-16 | Microsoft Technology Licensing, Llc | Application command control for small screen display |
US11126329B2 (en) | 2014-11-06 | 2021-09-21 | Microsoft Technology Licensing, Llc | Application command control for smaller screen display |
US11809217B2 (en) * | 2017-06-16 | 2023-11-07 | Microsoft Technology Licensing, Llc | Rules based user interface generation |
US11816459B2 (en) * | 2016-11-16 | 2023-11-14 | Native Ui, Inc. | Graphical user interface programming system |
US11928417B2 (en) * | 2016-06-10 | 2024-03-12 | Truecontext Inc. | Flexible online form display |
Families Citing this family (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD847202S1 (en) * | 2013-06-28 | 2019-04-30 | Michael Flynn | Portion of a communications terminal display screen with a dynamic icon |
USD780198S1 (en) * | 2013-09-18 | 2017-02-28 | Lenovo (Beijing) Co., Ltd. | Display screen with graphical user interface |
US10303350B2 (en) * | 2015-05-20 | 2019-05-28 | Hubin Jiang | Systems and methods for generating online documents |
CN104954869B (en) * | 2015-05-22 | 2018-05-29 | 合肥杰发科技有限公司 | Multi-medium play method and device based on Android system |
USD795891S1 (en) * | 2015-11-09 | 2017-08-29 | Aetna Inc. | Computer display screen for a server maintenance tool with graphical user interface |
USD786890S1 (en) * | 2015-11-09 | 2017-05-16 | Aetna Inc. | Computer display screen for a server maintenance tool with graphical user interface |
USD772250S1 (en) * | 2015-11-09 | 2016-11-22 | Aetna Inc. | Computer display for a server maintenance tool graphical user interface |
USD808410S1 (en) * | 2016-06-03 | 2018-01-23 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD809557S1 (en) | 2016-06-03 | 2018-02-06 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with transitional graphical user interface |
USD916762S1 (en) * | 2016-07-14 | 2021-04-20 | Nasdaq, Inc. | Display screen or portion thereof with animated graphical user interface |
USD816710S1 (en) * | 2016-07-20 | 2018-05-01 | Multilearning Group, Inc. | Mobile device display screen with transitional graphical user interface |
US10409487B2 (en) * | 2016-08-23 | 2019-09-10 | Microsoft Technology Licensing, Llc | Application processing based on gesture input |
JP7177775B2 (en) * | 2016-09-14 | 2022-11-24 | ピーティーアイ マーケティング テクノロジーズ インコーポレイテッド | System and method for automatically reformatting publications |
USD817350S1 (en) * | 2016-11-22 | 2018-05-08 | Otis Elevator Company | Display screen or portion thereof with graphical user interface |
CN107272984A (en) * | 2017-05-19 | 2017-10-20 | 北京金山安全软件有限公司 | Application icon preview method and device and electronic equipment |
USD862509S1 (en) * | 2017-08-23 | 2019-10-08 | Amazon Technologies, Inc. | Display screen or portion thereof having a graphical user interface |
KR102029980B1 (en) * | 2017-08-31 | 2019-10-08 | 한국전자통신연구원 | Apparatus and method of generating alternative text |
GB2566949B (en) * | 2017-09-27 | 2020-09-09 | Avecto Ltd | Computer device and method for managing privilege delegation |
USD936671S1 (en) * | 2017-10-23 | 2021-11-23 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD844637S1 (en) * | 2018-01-17 | 2019-04-02 | Apple Inc. | Electronic device with animated graphical user interface |
USD877752S1 (en) | 2018-03-16 | 2020-03-10 | Magic Leap, Inc. | Display panel or portion thereof with graphical user interface |
USD853428S1 (en) * | 2018-03-28 | 2019-07-09 | Manitowoc Crane Companies, Llc | Mobile communication device display screen or portion thereof with graphical user interface |
USD852835S1 (en) * | 2018-03-28 | 2019-07-02 | Manitowoc Crane Companies, Llc | Mobile communication device display screen or portion thereof with graphical user interface |
USD853426S1 (en) * | 2018-03-28 | 2019-07-09 | Manitowoc Crane Companies, Llc | Mobile communication device display screen or portion thereof with graphical user interface |
USD853427S1 (en) * | 2018-03-28 | 2019-07-09 | Manitowoc Crane Companies, Llc | Mobile communication device display screen or portion thereof with graphical user interface |
USD845987S1 (en) * | 2018-03-28 | 2019-04-16 | Manitowoc Crane Companies, Llc | Mobile communication device display screen or portion thereof with graphical user interface |
USD852833S1 (en) * | 2018-03-28 | 2019-07-02 | Manitowoc Crane Companies, Llc | Mobile communication device display screen or portion thereof with graphical user interface |
USD845988S1 (en) * | 2018-03-28 | 2019-04-16 | Manitowoc Crane Companies, Llc | Mobile communication device display screen or portion thereof with graphical user interface |
USD852834S1 (en) * | 2018-03-28 | 2019-07-02 | Manitowoc Crane Companies, Llc | Mobile communication device display screen or portion thereof with graphical user interface |
CN108549522A (en) * | 2018-03-30 | 2018-09-18 | 深圳市万普拉斯科技有限公司 | It takes pictures setting method, device, mobile terminal and computer readable storage medium |
WO2020018592A1 (en) | 2018-07-17 | 2020-01-23 | Methodical Mind, Llc. | Graphical user interface system |
CN109308205B (en) | 2018-08-09 | 2020-12-01 | 腾讯科技(深圳)有限公司 | Display adaptation method, device, equipment and storage medium of application program |
US11017045B2 (en) * | 2018-11-19 | 2021-05-25 | Microsoft Technology Licensing, Llc | Personalized user experience and search-based recommendations |
USD920996S1 (en) * | 2019-02-22 | 2021-06-01 | Teva Branded Pharmaceutical Products R&D, Inc. | Display screen with a graphical user interface |
USD913301S1 (en) * | 2019-02-22 | 2021-03-16 | Teva Branded Pharmaceutical Products R&D, Inc. | Display screen with a graphical user interface |
WO2021085663A1 (en) * | 2019-10-29 | 2021-05-06 | 엘지전자 주식회사 | Electronic device for driving application, and control method therefor |
AU2021211470A1 (en) * | 2020-01-22 | 2022-09-15 | Methodical Mind, Llc. | Graphical user interface system |
US11231834B2 (en) | 2020-06-03 | 2022-01-25 | Micron Technology, Inc. | Vehicle having an intelligent user interface |
CN116360725B (en) * | 2020-07-21 | 2024-02-23 | 华为技术有限公司 | Display interaction system, display method and device |
KR20220012599A (en) * | 2020-07-23 | 2022-02-04 | 삼성전자주식회사 | Apparatus and method for providing content search using keypad in electronic device |
US11789696B2 (en) * | 2021-03-23 | 2023-10-17 | Microsoft Technology Licensing, Llc | Voice assistant-enabled client application with user view context |
Citations (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6018346A (en) * | 1998-01-12 | 2000-01-25 | Xerox Corporation | Freeform graphics system having meeting objects for supporting meeting objectives |
US6023714A (en) * | 1997-04-24 | 2000-02-08 | Microsoft Corporation | Method and system for dynamically adapting the layout of a document to an output device |
US6433801B1 (en) * | 1997-09-26 | 2002-08-13 | Ericsson Inc. | Method and apparatus for using a touch screen display on a portable intelligent communications device |
US20110107227A1 (en) * | 2008-04-07 | 2011-05-05 | Express Mobile Inc. | Systems and methods for presenting information on mobile devices |
US20110214077A1 (en) * | 2007-09-21 | 2011-09-01 | Adobe Systems Incorporated | Dynamic user interface elements |
US20120017172A1 (en) * | 2010-07-15 | 2012-01-19 | Microsoft Corporation | Display-agnostic user interface for mobile devices |
US20130019150A1 (en) * | 2011-07-13 | 2013-01-17 | Rony Zarom | System and method for automatic and dynamic layout design for media broadcast |
US20130159417A1 (en) * | 2011-12-19 | 2013-06-20 | France Telecom | Method for notification of events on a device running multiple user identities |
US20130159917A1 (en) * | 2011-12-20 | 2013-06-20 | Lenovo (Singapore) Pte. Ltd. | Dynamic user interface based on connected devices |
US20130174066A1 (en) * | 2010-07-02 | 2013-07-04 | Markus Felix | User interface system for operating machines |
US20130174047A1 (en) * | 2011-10-14 | 2013-07-04 | StarMobile, Inc. | View virtualization and transformations for mobile applications |
US20130212487A1 (en) * | 2012-01-09 | 2013-08-15 | Visa International Service Association | Dynamic Page Content and Layouts Apparatuses, Methods and Systems |
US20130212535A1 (en) * | 2012-02-13 | 2013-08-15 | Samsung Electronics Co., Ltd. | Tablet having user interface |
US20140013271A1 (en) * | 2012-07-05 | 2014-01-09 | Research In Motion Limited | Prioritization of multitasking applications in a mobile device interface |
US20140055495A1 (en) * | 2012-08-22 | 2014-02-27 | Lg Cns Co., Ltd. | Responsive user interface engine for display devices |
US20140208197A1 (en) * | 2013-01-23 | 2014-07-24 | Go Daddy Operating Company, LLC | Method for conversion of website content |
US20140282055A1 (en) * | 2013-03-15 | 2014-09-18 | Agilent Technologies, Inc. | Layout System for Devices with Variable Display Screen Sizes and Orientations |
US20140282254A1 (en) * | 2013-03-15 | 2014-09-18 | Microsoft Corporation | In-place contextual menu for handling actions for a listing of items |
US20140325345A1 (en) * | 2013-04-26 | 2014-10-30 | Amazon Technologies, Inc. | Consistent Scaling of Web-Based Content Across Devices Having Different Screen Metrics |
US20140340591A1 (en) * | 2013-05-17 | 2014-11-20 | Global Lighting Technologies Inc. | Multifunction input device |
US20150095767A1 (en) * | 2013-10-02 | 2015-04-02 | Rachel Ebner | Automatic generation of mobile site layouts |
US20150169197A1 (en) * | 2013-12-18 | 2015-06-18 | Konica Minolta Inc. | Screen generation device, remote operation device, remote control device, screen generation method, and screen generation program |
US20150277726A1 (en) * | 2014-04-01 | 2015-10-01 | Microsoft Corporation | Sliding surface |
US20180004544A1 (en) * | 2016-06-30 | 2018-01-04 | Sap Se | Personalized run time user interfaces |
Family Cites Families (184)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2693810B1 (en) | 1991-06-03 | 1997-01-10 | Apple Computer | USER INTERFACE SYSTEMS WITH DIRECT ACCESS TO A SECONDARY DISPLAY AREA. |
US5371844A (en) * | 1992-03-20 | 1994-12-06 | International Business Machines Corporation | Palette manager in a graphical user interface computer system |
US5420605A (en) | 1993-02-26 | 1995-05-30 | Binar Graphics, Inc. | Method of resetting a computer video display mode |
US5499334A (en) | 1993-03-01 | 1996-03-12 | Microsoft Corporation | Method and system for displaying window configuration of inactive programs |
US5666498A (en) * | 1996-03-29 | 1997-09-09 | International Business Machines Corporation | Method, memory and apparatus for automatically resizing a window |
US5920315A (en) * | 1996-07-17 | 1999-07-06 | International Business Machines Corporation | Multi-pane window with recoiling workspaces |
US5796401A (en) | 1996-08-09 | 1998-08-18 | Winer; Peter W. | System for designing dynamic layouts adaptable to various display screen sizes and resolutions |
US5760772A (en) * | 1996-08-30 | 1998-06-02 | Novell, Inc. | Method for automatically resizing a child window |
US5886694A (en) | 1997-07-14 | 1999-03-23 | Microsoft Corporation | Method for automatically laying out controls in a dialog window |
US6300947B1 (en) * | 1998-07-06 | 2001-10-09 | International Business Machines Corporation | Display screen and window size related web page adaptation system |
US6335743B1 (en) | 1998-08-11 | 2002-01-01 | International Business Machines Corporation | Method and system for providing a resize layout allowing flexible placement and sizing of controls |
US6342907B1 (en) | 1998-10-19 | 2002-01-29 | International Business Machines Corporation | Specification language for defining user interface panels that are platform-independent |
US6392836B1 (en) | 1999-01-15 | 2002-05-21 | Seagate Removable Storage Solutions Llc | Tape cartridge-loading mechanism |
US8600437B2 (en) * | 1999-04-07 | 2013-12-03 | Khyber Technologies Corporation | Portable computing, communication and entertainment device with central processor carried in a detachable portable device |
US6538665B2 (en) | 1999-04-15 | 2003-03-25 | Apple Computer, Inc. | User interface for presenting media information |
US7624356B1 (en) * | 2000-06-21 | 2009-11-24 | Microsoft Corporation | Task-sensitive methods and systems for displaying command sets |
GB0019459D0 (en) | 2000-07-28 | 2000-09-27 | Symbian Ltd | Computing device with improved user interface for applications |
US6734882B1 (en) | 2000-09-29 | 2004-05-11 | Apple Computer, Inc. | Combined menu-list control element in a graphical user interface |
US6640655B1 (en) | 2000-10-03 | 2003-11-04 | Varco I/P, Inc. | Self tracking sensor suspension mechanism |
US6978473B1 (en) * | 2000-10-27 | 2005-12-20 | Sony Corporation | Pop-up option palette |
US7028306B2 (en) | 2000-12-04 | 2006-04-11 | International Business Machines Corporation | Systems and methods for implementing modular DOM (Document Object Model)-based multi-modal browsers |
US7493568B2 (en) * | 2001-01-26 | 2009-02-17 | Microsoft Corporation | System and method for browsing properties of an electronic document |
US6791581B2 (en) | 2001-01-31 | 2004-09-14 | Microsoft Corporation | Methods and systems for synchronizing skin properties |
US7155681B2 (en) | 2001-02-14 | 2006-12-26 | Sproqit Technologies, Inc. | Platform-independent distributed user interface server architecture |
GB0105994D0 (en) * | 2001-03-10 | 2001-05-02 | Pace Micro Tech Plc | Video display resizing |
WO2002101534A1 (en) * | 2001-06-12 | 2002-12-19 | Idelix Software Inc. | Graphical user interface with zoom for detail-in-context presentations |
WO2003005186A1 (en) | 2001-07-05 | 2003-01-16 | Fujitsu Limited | Start up of application on information processor by means of portable unit |
US6950993B2 (en) | 2001-08-02 | 2005-09-27 | Microsoft Corporation | System and method for automatic and dynamic layout of resizable dialog type windows |
US20130024778A1 (en) | 2011-07-13 | 2013-01-24 | Z124 | Dynamic cross-environment application configuration/orientation |
US8966379B2 (en) | 2010-10-01 | 2015-02-24 | Z124 | Dynamic cross-environment application configuration/orientation in an active user environment |
US7234111B2 (en) * | 2001-09-28 | 2007-06-19 | Ntt Docomo, Inc. | Dynamic adaptation of GUI presentations to heterogeneous device platforms |
US7392483B2 (en) | 2001-09-28 | 2008-06-24 | Ntt Docomo, Inc, | Transformation of platform specific graphical user interface widgets migrated between heterogeneous device platforms |
US20030063120A1 (en) | 2001-09-28 | 2003-04-03 | Wong Hoi Lee Candy | Scalable graphical user interface architecture |
US7895522B2 (en) | 2001-09-28 | 2011-02-22 | Ntt Docomo, Inc. | Layout of platform specific graphical user interface widgets migrated between heterogeneous device platforms |
US20050066037A1 (en) | 2002-04-10 | 2005-03-24 | Yu Song | Browser session mobility system for multi-platform applications |
US20080313282A1 (en) | 2002-09-10 | 2008-12-18 | Warila Bruce W | User interface, operating system and architecture |
US20040056894A1 (en) | 2002-09-19 | 2004-03-25 | Igor Zaika | System and method for describing and instantiating extensible user interfaces |
US7574669B1 (en) * | 2002-10-08 | 2009-08-11 | Microsoft Corporation | User interface control for navigating, selecting, and organizing document pages |
US20040075693A1 (en) | 2002-10-21 | 2004-04-22 | Moyer Timothy A. | Compact method of navigating hierarchical menus on an electronic device having a small display screen |
US20040153973A1 (en) * | 2002-11-21 | 2004-08-05 | Lawrence Horwitz | System and method for automatically storing and recalling application states based on application contexts |
US8418081B2 (en) * | 2002-12-18 | 2013-04-09 | International Business Machines Corporation | Optimizing display space with expandable and collapsible user interface controls |
US20040223004A1 (en) | 2003-05-05 | 2004-11-11 | Lincke Scott D. | System and method for implementing a landscape user experience in a hand-held computing device |
US7308288B2 (en) | 2003-08-22 | 2007-12-11 | Sbc Knowledge Ventures, Lp. | System and method for prioritized interface design |
US7395500B2 (en) * | 2003-08-29 | 2008-07-01 | Yahoo! Inc. | Space-optimizing content display |
US20050055645A1 (en) * | 2003-09-09 | 2005-03-10 | Mitutoyo Corporation | System and method for resizing tiles on a computer display |
KR20060069497A (en) | 2003-09-24 | 2006-06-21 | 노키아 코포레이션 | Improved presentation of large objects on small displays |
US7418670B2 (en) | 2003-10-03 | 2008-08-26 | Microsoft Corporation | Hierarchical in-place menus |
US7992103B2 (en) * | 2004-04-26 | 2011-08-02 | Microsoft Corporation | Scaling icons for representing files |
US7565623B2 (en) * | 2004-04-30 | 2009-07-21 | Microsoft Corporation | System and method for selecting a view mode and setting |
US8302020B2 (en) * | 2004-06-25 | 2012-10-30 | Apple Inc. | Widget authoring and editing environment |
US7895531B2 (en) | 2004-08-16 | 2011-02-22 | Microsoft Corporation | Floating command object |
US8255828B2 (en) | 2004-08-16 | 2012-08-28 | Microsoft Corporation | Command user interface for displaying selectable software functionality controls |
US20060082518A1 (en) * | 2004-10-19 | 2006-04-20 | Pranil Ram | Multiple monitor display apparatus |
US8169410B2 (en) | 2004-10-20 | 2012-05-01 | Nintendo Co., Ltd. | Gesture inputs for a portable display device |
US7812786B2 (en) | 2005-01-18 | 2010-10-12 | Nokia Corporation | User interface for different displays |
US7752633B1 (en) | 2005-03-14 | 2010-07-06 | Seven Networks, Inc. | Cross-platform event engine |
US7512904B2 (en) | 2005-03-22 | 2009-03-31 | Microsoft Corporation | Operating system launch menu program listing |
US9043719B2 (en) * | 2005-04-08 | 2015-05-26 | New York Stock Exchange Llc | System and method for managing and displaying securities market information |
US20060236264A1 (en) | 2005-04-18 | 2006-10-19 | Microsoft Corporation | Automatic window resize behavior and optimizations |
US7432928B2 (en) * | 2005-06-14 | 2008-10-07 | Microsoft Corporation | User interface state reconfiguration through animation |
US8392836B1 (en) | 2005-07-11 | 2013-03-05 | Google Inc. | Presenting quick list of contacts to communication application user |
US8689137B2 (en) | 2005-09-07 | 2014-04-01 | Microsoft Corporation | Command user interface for displaying selectable functionality controls in a database application |
US7673233B2 (en) * | 2005-09-08 | 2010-03-02 | Microsoft Corporation | Browser tab management |
US7735018B2 (en) * | 2005-09-13 | 2010-06-08 | Spacetime3D, Inc. | System and method for providing three-dimensional graphical user interface |
US8904286B2 (en) | 2006-02-13 | 2014-12-02 | Blackberry Limited | Method and arrangement for providing a primary actions menu on a wireless handheld communication device |
US8635553B2 (en) * | 2006-02-16 | 2014-01-21 | Adobe Systems Incorporated | Auto adjustable pane view |
US20090278806A1 (en) | 2008-05-06 | 2009-11-12 | Matias Gonzalo Duarte | Extended touch-sensitive control area for electronic device |
US20070266335A1 (en) | 2006-05-12 | 2007-11-15 | Microsoft Corporation | Providing a standard user interface (UI) across disparate display interfaces |
KR100825871B1 (en) | 2006-06-28 | 2008-04-28 | 삼성전자주식회사 | Method and Apparatus for providing User Interface in a Terminal having Touch Pad |
US7768605B2 (en) * | 2006-06-30 | 2010-08-03 | Motorola, Inc. | Display stack-up for a mobile electronic device having internal and external displays |
US8564544B2 (en) | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US7934156B2 (en) | 2006-09-06 | 2011-04-26 | Apple Inc. | Deletion gestures on a portable multifunction device |
US20080163112A1 (en) | 2006-12-29 | 2008-07-03 | Research In Motion Limited | Designation of menu actions for applications on a handheld electronic device |
US8108763B2 (en) * | 2007-01-19 | 2012-01-31 | Constant Contact, Inc. | Visual editor for electronic mail |
KR101494126B1 (en) * | 2007-01-31 | 2015-02-16 | 에이저 시스템즈 엘엘시 | Handheld device with multiple displays |
US8752011B2 (en) | 2007-03-20 | 2014-06-10 | The University Of North Carolina At Chapel Hill | Methods, systems, and computer readable media for automatically generating customizable user interfaces using programming patterns |
AR067297A1 (en) | 2007-03-28 | 2009-10-07 | Avery Dennison Corp | TAPE TYPE USER INTERFACE FOR AN APPLICATION PROGRAM |
US8276069B2 (en) | 2007-03-28 | 2012-09-25 | Honeywell International Inc. | Method and system for automatically generating an adaptive user interface for a physical environment |
WO2008148012A1 (en) * | 2007-05-25 | 2008-12-04 | Veveo, Inc. | System and method for text disambiguation and context designation in incremental search |
US8059101B2 (en) | 2007-06-22 | 2011-11-15 | Apple Inc. | Swipe gestures for touch screen keyboards |
US8478245B2 (en) | 2007-08-01 | 2013-07-02 | Phunware, Inc. | Method and system for rendering content on a wireless device |
US7949954B1 (en) | 2007-08-17 | 2011-05-24 | Trading Technologies International, Inc. | Dynamic functionality based on window characteristics |
KR101445603B1 (en) * | 2007-08-27 | 2014-09-29 | 삼성전자주식회사 | Adaptive video processing apparatus and video scaling method based on screen size of display device |
US20090192849A1 (en) | 2007-11-09 | 2009-07-30 | Hughes John M | System and method for software development |
US8078979B2 (en) | 2007-11-27 | 2011-12-13 | Microsoft Corporation | Web page editor with element selection mechanism |
US20090140977A1 (en) | 2007-11-30 | 2009-06-04 | Microsoft Corporation | Common User Interface Structure |
JP4364273B2 (en) | 2007-12-28 | 2009-11-11 | パナソニック株式会社 | Portable terminal device, display control method, and display control program |
US9003315B2 (en) * | 2008-04-01 | 2015-04-07 | Litl Llc | System and method for streamlining user interaction with electronic content |
US8085265B2 (en) | 2008-04-23 | 2011-12-27 | Honeywell International Inc. | Methods and systems of generating 3D user interface for physical environment |
KR101461954B1 (en) | 2008-05-08 | 2014-11-14 | 엘지전자 주식회사 | Terminal and method for controlling the same |
US7930343B2 (en) | 2008-05-16 | 2011-04-19 | Honeywell International Inc. | Scalable user interface system |
US20100138780A1 (en) * | 2008-05-20 | 2010-06-03 | Adam Marano | Methods and systems for using external display devices with a mobile computing device |
TW201001267A (en) | 2008-06-20 | 2010-01-01 | Amtran Technology Co Ltd | Electronic apparatus with screen displayed menu and its generation method |
US10631632B2 (en) * | 2008-10-13 | 2020-04-28 | Steelcase Inc. | Egalitarian control apparatus and method for sharing information in a collaborative workspace |
US20100122215A1 (en) | 2008-11-11 | 2010-05-13 | Qwebl, Inc. | Control interface for home automation system |
US8302026B2 (en) | 2008-11-28 | 2012-10-30 | Microsoft Corporation | Multi-panel user interface |
US8638311B2 (en) | 2008-12-08 | 2014-01-28 | Samsung Electronics Co., Ltd. | Display device and data displaying method thereof |
US8274536B2 (en) | 2009-03-16 | 2012-09-25 | Apple Inc. | Smart keyboard management for a multifunction device with a touch screen display |
US20160320938A9 (en) * | 2009-03-17 | 2016-11-03 | Litera Technologies, LLC | System and Method for the Auto-Detection and Presentation of Pre-Set Configurations for Multiple Monitor Layout Display |
EP2237140B1 (en) | 2009-03-31 | 2018-12-26 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US9335916B2 (en) | 2009-04-15 | 2016-05-10 | International Business Machines Corporation | Presenting and zooming a set of objects within a window |
US9298336B2 (en) | 2009-05-28 | 2016-03-29 | Apple Inc. | Rotation smoothing of a user interface |
US8806331B2 (en) * | 2009-07-20 | 2014-08-12 | Interactive Memories, Inc. | System and methods for creating and editing photo-based projects on a digital network |
CN101996018A (en) | 2009-08-17 | 2011-03-30 | 张学志 | Novel vertical ribbon graphic user interface |
US9465786B2 (en) | 2009-08-25 | 2016-10-11 | Keeper Security, Inc. | Method for facilitating quick logins from a mobile device |
CA2717553C (en) | 2009-10-13 | 2015-06-30 | Research In Motion Limited | User interface for a touchscreen display |
US8490018B2 (en) | 2009-11-17 | 2013-07-16 | International Business Machines Corporation | Prioritization of choices based on context and user history |
US8627230B2 (en) | 2009-11-24 | 2014-01-07 | International Business Machines Corporation | Intelligent command prediction |
WO2011090467A1 (en) | 2009-12-28 | 2011-07-28 | Hillcrest Laboratories Inc. | Tv internet browser |
US9052894B2 (en) | 2010-01-15 | 2015-06-09 | Apple Inc. | API to replace a keyboard with custom controls |
WO2011108797A1 (en) | 2010-03-03 | 2011-09-09 | Lg Electronics Inc. | Mobile terminal and control method thereof |
US8799325B2 (en) | 2010-03-12 | 2014-08-05 | Microsoft Corporation | Reordering nodes in a hierarchical structure |
US20110242750A1 (en) * | 2010-04-01 | 2011-10-06 | Oakley Nicholas W | Accessible display in device with closed lid |
GB2479756B (en) | 2010-04-21 | 2013-06-05 | Realvnc Ltd | Virtual interface devices |
US8631350B2 (en) | 2010-04-23 | 2014-01-14 | Blackberry Limited | Graphical context short menu |
WO2011153623A2 (en) | 2010-06-08 | 2011-12-15 | Aastra Technologies Limited | Method and system for video communication |
US20110307804A1 (en) | 2010-06-11 | 2011-12-15 | Spierer Mitchell D | Electronic message management system and method |
US9483175B2 (en) * | 2010-07-26 | 2016-11-01 | Apple Inc. | Device, method, and graphical user interface for navigating through a hierarchy |
US20120030584A1 (en) * | 2010-07-30 | 2012-02-02 | Brian Bian | Method and apparatus for dynamically switching between scalable graphical user interfaces for mobile devices |
US9465457B2 (en) | 2010-08-30 | 2016-10-11 | Vmware, Inc. | Multi-touch interface gestures for keyboard and/or mouse inputs |
US8754860B2 (en) | 2010-11-05 | 2014-06-17 | Apple Inc. | Device, method, and graphical user interface for manipulating soft keyboards |
US9336117B2 (en) * | 2010-11-09 | 2016-05-10 | Vmware, Inc. | Remote display performance measurement triggered by application display upgrade |
US9208477B2 (en) * | 2010-11-17 | 2015-12-08 | Z124 | Email client mode transitions in a smartpad device |
KR20130135282A (en) * | 2010-12-10 | 2013-12-10 | 요타 디바이시스 아이피알 리미티드 | Mobile device with user interface |
CA2833034A1 (en) | 2011-04-13 | 2012-10-18 | Blackberry Limited | System and method for context aware dynamic ribbon |
US20120287114A1 (en) * | 2011-05-11 | 2012-11-15 | Microsoft Corporation | Interface including views positioned in along multiple dimensions |
CN102866907B (en) * | 2011-07-06 | 2015-11-25 | 腾讯科技(深圳)有限公司 | Desktop switching method and device |
US9582187B2 (en) | 2011-07-14 | 2017-02-28 | Microsoft Technology Licensing, Llc | Dynamic context based menus |
US9086794B2 (en) | 2011-07-14 | 2015-07-21 | Microsoft Technology Licensing, Llc | Determining gestures on context based menus |
US8707289B2 (en) * | 2011-07-20 | 2014-04-22 | Google Inc. | Multiple application versions |
US20130036443A1 (en) | 2011-08-03 | 2013-02-07 | Verizon Patent And Licensing Inc. | Interactive and program half-screen |
KR101862123B1 (en) | 2011-08-31 | 2018-05-30 | 삼성전자 주식회사 | Input device and method on terminal equipment having a touch module |
US8909298B2 (en) | 2011-09-30 | 2014-12-09 | Samsung Electronics Co., Ltd. | Apparatus and method for mobile screen navigation |
US9360998B2 (en) | 2011-11-01 | 2016-06-07 | Paypal, Inc. | Selection and organization based on selection of X-Y position |
US9141280B2 (en) | 2011-11-09 | 2015-09-22 | Blackberry Limited | Touch-sensitive display method and apparatus |
US8881032B1 (en) * | 2011-12-07 | 2014-11-04 | Google Inc. | Grouped tab document interface |
KR20130064478A (en) | 2011-12-08 | 2013-06-18 | 삼성전자주식회사 | User terminal device and method for displaying background screen thereof |
CA2860569A1 (en) | 2012-01-09 | 2013-07-18 | Airbiquity Inc. | User interface for mobile device |
US20130191781A1 (en) | 2012-01-20 | 2013-07-25 | Microsoft Corporation | Displaying and interacting with touch contextual user interface |
EP2631760A1 (en) | 2012-02-24 | 2013-08-28 | Research In Motion Limited | Method and apparatus for providing a user interface on a device enabling selection of operations to be performed in relation to content |
EP2631737A1 (en) | 2012-02-24 | 2013-08-28 | Research In Motion Limited | Method and apparatus for providing a contextual user interface on a device |
US20130227490A1 (en) | 2012-02-24 | 2013-08-29 | Simon Martin THORSANDER | Method and Apparatus for Providing an Option to Enable Multiple Selections |
EP2631747B1 (en) | 2012-02-24 | 2016-03-30 | BlackBerry Limited | Method and apparatus for providing a user interface on a device that indicates content operators |
EP2631761A1 (en) | 2012-02-24 | 2013-08-28 | Research In Motion Limited | Method and apparatus for providing an option to undo a delete operation |
US9081498B2 (en) | 2012-02-24 | 2015-07-14 | Blackberry Limited | Method and apparatus for adjusting a user interface to reduce obscuration |
DE102012005054A1 (en) | 2012-03-15 | 2013-09-19 | Volkswagen Aktiengesellschaft | Method, mobile device and infotainment system for projecting a user interface on a screen |
US10673691B2 (en) | 2012-03-24 | 2020-06-02 | Fred Khosropour | User interaction platform |
US10229222B2 (en) * | 2012-03-26 | 2019-03-12 | Greyheller, Llc | Dynamically optimized content display |
US9146655B2 (en) | 2012-04-06 | 2015-09-29 | Samsung Electronics Co., Ltd. | Method and device for executing object on display |
US9021371B2 (en) | 2012-04-20 | 2015-04-28 | Logitech Europe S.A. | Customizing a user interface having a plurality of top-level icons based on a change in context |
US8937636B2 (en) | 2012-04-20 | 2015-01-20 | Logitech Europe S.A. | Using previous selection information in a user interface having a plurality of icons |
US20130285926A1 (en) * | 2012-04-30 | 2013-10-31 | Research In Motion Limited | Configurable Touchscreen Keyboard |
US20140189586A1 (en) | 2012-12-28 | 2014-07-03 | Spritz Technology Llc | Methods and systems for displaying text using rsvp |
US9256351B2 (en) | 2012-07-20 | 2016-02-09 | Blackberry Limited | Method and electronic device for facilitating user control of a menu |
US20140033110A1 (en) | 2012-07-26 | 2014-01-30 | Texas Instruments Incorporated | Accessing Secondary Functions on Soft Keyboards Using Gestures |
KR20140023534A (en) * | 2012-08-16 | 2014-02-27 | 에스케이플래닛 주식회사 | Apparatus for implementing responsive user interface, method thereof and recordable medium storing the method |
US9329778B2 (en) | 2012-09-07 | 2016-05-03 | International Business Machines Corporation | Supplementing a virtual input keyboard |
US9654426B2 (en) | 2012-11-20 | 2017-05-16 | Dropbox, Inc. | System and method for organizing messages |
US9729695B2 (en) | 2012-11-20 | 2017-08-08 | Dropbox Inc. | Messaging client application interface |
US9588674B2 (en) | 2012-11-30 | 2017-03-07 | Qualcomm Incorporated | Methods and systems for providing an automated split-screen user interface on a device |
DE112013000403T5 (en) * | 2012-12-28 | 2014-11-20 | Intel Corporation | Computer with dual configuration |
US9160915B1 (en) * | 2013-01-09 | 2015-10-13 | Amazon Technologies, Inc. | Modifying device functionality based on device orientation |
US9652109B2 (en) | 2013-01-11 | 2017-05-16 | Microsoft Technology Licensing, Llc | Predictive contextual toolbar for productivity applications |
WO2014117241A1 (en) | 2013-02-04 | 2014-08-07 | 602531 British Columbia Ltd. | Data retrieval by way of context-sensitive icons |
US10025459B2 (en) | 2013-03-14 | 2018-07-17 | Airwatch Llc | Gesture-based workflow progression |
EP2972813A1 (en) | 2013-03-15 | 2016-01-20 | Beeonics Inc. | Dynamic user interface delivery system |
US20140282178A1 (en) | 2013-03-15 | 2014-09-18 | Microsoft Corporation | Personalized community model for surfacing commands within productivity application user interfaces |
US9304665B2 (en) | 2013-04-05 | 2016-04-05 | Yahoo! Inc. | Method and apparatus for facilitating message selection and organization |
US10249018B2 (en) | 2013-04-25 | 2019-04-02 | Nvidia Corporation | Graphics processor and method of scaling user interface elements for smaller displays |
US9501500B2 (en) | 2013-05-10 | 2016-11-22 | Tencent Technology (Shenzhen) Company Limited | Systems and methods for image file processing |
US20150033188A1 (en) | 2013-07-23 | 2015-01-29 | Microsoft Corporation | Scrollable smart menu |
EP2843542A1 (en) * | 2013-08-29 | 2015-03-04 | Samsung Electronics Co., Ltd | User terminal apparatus, method for controlling user terminal apparatus, and expanded display system |
US9311422B2 (en) | 2013-09-12 | 2016-04-12 | Adobe Systems Incorporated | Dynamic simulation of a responsive web page |
US9519401B2 (en) | 2013-09-18 | 2016-12-13 | Adobe Systems Incorporated | Providing context menu based on predicted commands |
US9772755B2 (en) * | 2013-11-15 | 2017-09-26 | Microsoft Technology Licensing, Llc | Remote control for displaying application data on dissimilar screens |
US9507520B2 (en) | 2013-12-16 | 2016-11-29 | Microsoft Technology Licensing, Llc | Touch-based reorganization of page element |
KR20150099324A (en) | 2014-02-21 | 2015-08-31 | 삼성전자주식회사 | Method for romote control between electronic devices and system therefor |
US20150277682A1 (en) * | 2014-04-01 | 2015-10-01 | Microsoft Corporation | Scalable user interface display |
US9658741B2 (en) | 2014-04-25 | 2017-05-23 | Rohde & Schwarz Gmbh & Co. Kg | Measuring device and measuring method with interactive operation |
US20160132301A1 (en) | 2014-11-06 | 2016-05-12 | Microsoft Technology Licensing, Llc | Programmatic user interface generation based on display size |
US10949075B2 (en) | 2014-11-06 | 2021-03-16 | Microsoft Technology Licensing, Llc | Application command control for small screen display |
US20160209973A1 (en) * | 2015-01-21 | 2016-07-21 | Microsoft Technology Licensing, Llc. | Application user interface reconfiguration based on an experience mode transition |
US10042655B2 (en) * | 2015-01-21 | 2018-08-07 | Microsoft Technology Licensing, Llc. | Adaptable user interface display |
US9805003B2 (en) * | 2015-04-02 | 2017-10-31 | Apple Inc. | Rearranging layouts for different displays |
2015
- 2015-06-01 US US14/727,226 patent/US20160132301A1/en not_active Abandoned
- 2015-06-01 US US14/726,868 patent/US20160132992A1/en not_active Abandoned
- 2015-08-31 US US14/840,360 patent/US11126329B2/en active Active
- 2015-10-12 US US14/880,768 patent/US11422681B2/en active Active

2016
- 2016-09-30 WO PCT/US2016/054572 patent/WO2017065988A1/en active Application Filing
Patent Citations (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6023714A (en) * | 1997-04-24 | 2000-02-08 | Microsoft Corporation | Method and system for dynamically adapting the layout of a document to an output device |
US6433801B1 (en) * | 1997-09-26 | 2002-08-13 | Ericsson Inc. | Method and apparatus for using a touch screen display on a portable intelligent communications device |
US6018346A (en) * | 1998-01-12 | 2000-01-25 | Xerox Corporation | Freeform graphics system having meeting objects for supporting meeting objectives |
US20110214077A1 (en) * | 2007-09-21 | 2011-09-01 | Adobe Systems Incorporated | Dynamic user interface elements |
US20110107227A1 (en) * | 2008-04-07 | 2011-05-05 | Express Mobile Inc. | Systems and methods for presenting information on mobile devices |
US20130174066A1 (en) * | 2010-07-02 | 2013-07-04 | Markus Felix | User interface system for operating machines |
US20120017172A1 (en) * | 2010-07-15 | 2012-01-19 | Microsoft Corporation | Display-agnostic user interface for mobile devices |
US20130019150A1 (en) * | 2011-07-13 | 2013-01-17 | Rony Zarom | System and method for automatic and dynamic layout design for media broadcast |
US20130174047A1 (en) * | 2011-10-14 | 2013-07-04 | StarMobile, Inc. | View virtualization and transformations for mobile applications |
US20130159417A1 (en) * | 2011-12-19 | 2013-06-20 | France Telecom | Method for notification of events on a device running multiple user identities |
US20130159917A1 (en) * | 2011-12-20 | 2013-06-20 | Lenovo (Singapore) Pte. Ltd. | Dynamic user interface based on connected devices |
US20130212487A1 (en) * | 2012-01-09 | 2013-08-15 | Visa International Service Association | Dynamic Page Content and Layouts Apparatuses, Methods and Systems |
US20130212535A1 (en) * | 2012-02-13 | 2013-08-15 | Samsung Electronics Co., Ltd. | Tablet having user interface |
US20140013271A1 (en) * | 2012-07-05 | 2014-01-09 | Research In Motion Limited | Prioritization of multitasking applications in a mobile device interface |
US20140055495A1 (en) * | 2012-08-22 | 2014-02-27 | Lg Cns Co., Ltd. | Responsive user interface engine for display devices |
US20140208197A1 (en) * | 2013-01-23 | 2014-07-24 | Go Daddy Operating Company, LLC | Method for conversion of website content |
US20140282055A1 (en) * | 2013-03-15 | 2014-09-18 | Agilent Technologies, Inc. | Layout System for Devices with Variable Display Screen Sizes and Orientations |
US20140282254A1 (en) * | 2013-03-15 | 2014-09-18 | Microsoft Corporation | In-place contextual menu for handling actions for a listing of items |
US20140325345A1 (en) * | 2013-04-26 | 2014-10-30 | Amazon Technologies, Inc. | Consistent Scaling of Web-Based Content Across Devices Having Different Screen Metrics |
US20140340591A1 (en) * | 2013-05-17 | 2014-11-20 | Global Lighting Technologies Inc. | Multifunction input device |
US20150095767A1 (en) * | 2013-10-02 | 2015-04-02 | Rachel Ebner | Automatic generation of mobile site layouts |
US20150169197A1 (en) * | 2013-12-18 | 2015-06-18 | Konica Minolta Inc. | Screen generation device, remote operation device, remote control device, screen generation method, and screen generation program |
US20150277726A1 (en) * | 2014-04-01 | 2015-10-01 | Microsoft Corporation | Sliding surface |
US20180004544A1 (en) * | 2016-06-30 | 2018-01-04 | Sap Se | Personalized run time user interfaces |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10725632B2 (en) | 2013-03-15 | 2020-07-28 | Microsoft Technology Licensing, Llc | In-place contextual menu for handling actions for a listing of items |
US11422681B2 (en) | 2014-11-06 | 2022-08-23 | Microsoft Technology Licensing, Llc | User interface for application command control |
US11126329B2 (en) | 2014-11-06 | 2021-09-21 | Microsoft Technology Licensing, Llc | Application command control for smaller screen display |
US10949075B2 (en) | 2014-11-06 | 2021-03-16 | Microsoft Technology Licensing, Llc | Application command control for small screen display |
USD801995S1 (en) * | 2015-03-06 | 2017-11-07 | Samsung Electronics Co., Ltd | Display screen or portion thereof with graphical user interface |
US10455056B2 (en) * | 2015-08-21 | 2019-10-22 | Adobe Inc. | Cloud-based storage and interchange mechanism for design elements
US20170054831A1 (en) * | 2015-08-21 | 2017-02-23 | Adobe Systems Incorporated | Cloud-based storage and interchange mechanism for design elements |
US10496241B2 (en) | 2015-08-21 | 2019-12-03 | Adobe Inc. | Cloud-based inter-application interchange of style information |
US20170357424A1 (en) * | 2016-06-10 | 2017-12-14 | Apple Inc. | Editing inherited configurations |
US10496419B2 (en) * | 2016-06-10 | 2019-12-03 | Apple Inc. | Editing inherited configurations |
US11928417B2 (en) * | 2016-06-10 | 2024-03-12 | Truecontext Inc. | Flexible online form display |
US11816459B2 (en) * | 2016-11-16 | 2023-11-14 | Native Ui, Inc. | Graphical user interface programming system |
US10228835B2 (en) * | 2016-12-23 | 2019-03-12 | Beijing Kingsoft Internet Security Software Co., Ltd. | Method for displaying information, and terminal equipment |
US10877635B2 (en) | 2017-05-10 | 2020-12-29 | Embee Mobile, Inc. | System and method for the capture of mobile behavior, usage, or content exposure |
US11095733B2 (en) | 2017-05-10 | 2021-08-17 | Embee Mobile, Inc. | System and method for the capture of mobile behavior, usage, or content exposure based on changes in UI layout |
US11924296B2 (en) | 2017-05-10 | 2024-03-05 | Embee Mobile, Inc. | System and method for the capture of mobile behavior, usage, or content exposure |
US11809217B2 (en) * | 2017-06-16 | 2023-11-07 | Microsoft Technology Licensing, Llc | Rules based user interface generation |
US20190087389A1 (en) * | 2017-09-18 | 2019-03-21 | Elutions IP Holdings S.à.r.l. | Systems and methods for configuring display layout |
US20200133644A1 (en) * | 2018-10-31 | 2020-04-30 | Salesforce.Com, Inc. | Automatic Classification of User Interface Elements |
US10949174B2 (en) * | 2018-10-31 | 2021-03-16 | Salesforce.Com, Inc. | Automatic classification of user interface elements |
Also Published As
Publication number | Publication date |
---|---|
US20160132234A1 (en) | 2016-05-12 |
US11126329B2 (en) | 2021-09-21 |
US20160132195A1 (en) | 2016-05-12 |
WO2017065988A1 (en) | 2017-04-20 |
US11422681B2 (en) | 2022-08-23 |
US20160132992A1 (en) | 2016-05-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20160132301A1 (en) | Programmatic user interface generation based on display size | |
US10949075B2 (en) | Application command control for small screen display | |
EP3126967B1 (en) | Adaptive user interface pane manager | |
US20110258534A1 (en) | Declarative definition of complex user interface state changes | |
US20160209994A1 (en) | Adaptable user interface display | |
CN107209628B (en) | Adaptive user interface pane object | |
US10248439B2 (en) | Format object task pane | |
EP2810151B1 (en) | Extension activation for related documents | |
US20140351796A1 (en) | Accessibility compliance testing using code injection | |
US20140372865A1 (en) | Interaction of Web Content with an Electronic Application Document | |
US20080086701A1 (en) | Method of displaying and editing properties of artifacts in graphical editors | |
US11487406B1 (en) | Windowing container |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RISCUTIA, VLAD;SETO, JULIE;NGUYEN, LUAN;AND OTHERS;SIGNING DATES FROM 20150526 TO 20150601;REEL/FRAME:035756/0890 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |