US20220230603A1 - Adaptable user interface with dual screen device - Google Patents
- Publication number
- US20220230603A1 (U.S. application Ser. No. 17/716,576)
- Authority
- US
- United States
- Prior art keywords
- display
- items
- presentation
- user
- display device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1615—Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
- G06F1/1616—Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1643—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1647—Details related to the display arrangement, including those related to the mounting of the display in the housing including at least an additional display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/161—Indexing scheme relating to constructional details of the monitor
- G06F2200/1614—Image rotation following screen orientation, e.g. switching from landscape to portrait mode
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
Definitions
- Typical laptops, tablets and similar devices have either a fixed keyboard or a user interface (UI) that always presents information in the same location. For instance, icons, buttons and menu items may always appear along the bottom of the device's screen. And while some dual screen devices exist, the UIs of their two screens are typically separate from one another.
- aspects of the disclosure provide for different inputs of the UI to be presented to the user on one or both of the screens depending on what the user is doing. This makes applications running on the dual screen device clearer and more explicit about their various features, making the applications easier to use. For instance, control elements such as buttons and menus can be brought up front to the closer (lower) screen for quick input, while active content (e.g., video chat, full image, drawings) is presented on the farther (upper) screen.
- a computer-implemented content presentation method for use in a multi-display computer system comprises generating, by one or more processors, a first set of content items for presentation on a first display device of the multi-display computer system; generating, by the one or more processors, a second set of content items for presentation on a second display device of the multi-display computer system; upon receiving input information, modifying, by the one or more processors, at least one of the first and second sets of content items; and presenting the modified content items on at least one of the first and second display devices, wherein presentation of the modified content includes swapping selected content between the first and second display devices.
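The claimed flow can be illustrated with a minimal sketch. All names here (`DualScreen`, `modify`, the `"selected"` key) are hypothetical; the claim does not prescribe any particular implementation:

```python
# Hypothetical sketch of the claimed two-display content flow.
class DualScreen:
    def __init__(self, first_items, second_items):
        self.first = list(first_items)    # content set for the first display
        self.second = list(second_items)  # content set for the second display

    def modify(self, input_info):
        """On received input, modify the content sets by swapping the
        selected content between the first and second display devices."""
        selected = input_info.get("selected", [])
        for item in selected:
            if item in self.first:
                self.first.remove(item)
                self.second.append(item)
            elif item in self.second:
                self.second.remove(item)
                self.first.append(item)

ds = DualScreen(["keyboard", "menu"], ["video feed"])
ds.modify({"selected": ["video feed", "keyboard"]})
```

After the swap, the video feed sits on the first display and the keyboard on the second, matching the claim's "swapping selected content" language.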
- modifying at least one of the first and second sets of content items is performed in response to an input received from a user of the multi-display computer system.
- the received input may indicate either: a change in state for a currently running application; or a change in relative physical position of one or both of the first and second display devices.
- swapping the selected content includes interchanging one or more user interface elements between the first and second display devices.
- swapping the selected content includes either (i) linearly scrolling the selected content between the first and second display devices or (ii) rotating the selected content in a clockwise or counterclockwise direction between the first and second display devices.
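The two swap styles can be sketched as interpolation schedules (the helper below is illustrative only; the disclosure specifies the visual effect, not the math): linear scrolling as a vertical offset fraction between the screens, rotation as a sweep of angle:

```python
def swap_frames(style, steps=4):
    """Return interpolation values describing how selected content travels
    between the displays during a swap.

    style: "scroll" yields a fraction of screen height (0.0 -> 1.0);
           "rotate" yields an angle in degrees (0 -> 180), clockwise when
           positive (negate for counterclockwise).
    """
    fractions = [i / steps for i in range(steps + 1)]
    if style == "scroll":
        return fractions                       # 0.0 .. 1.0 of screen height
    if style == "rotate":
        return [180.0 * f for f in fractions]  # 0 .. 180 degrees
    raise ValueError(f"unknown swap style: {style}")
```

Each value would drive one animation frame; at the final value the content has fully crossed to the other display.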
- at least one of the first and second sets of content items corresponds to one selected from the group consisting of: (i) a log-in interface; (ii) one or more applications; (iii) one or more widgets; (iv) thumbnail images; (v) audio content; and (vi) video content.
- the received input information identifies that: an input device of the multi-display computer system has been pressed; an icon presented on either the first or the second display device has been selected; or an orientation of the multi-display computer system has changed from a first orientation to a second orientation.
- the orientation change may indicate a physical rotation of the first and second display screens.
- the first set of content items are presented on the first display device and the second set of content items are presented on the second display device when the first display device is closer to the user than the second display device.
- the first set of content items includes one or more interactive elements configured to receive input from a user, and the second set of content items includes active content being presented to the user.
- a multi-display client computing device comprising a first display housing including a first display device therein, a second display housing including a second display device therein, one or more position and orientation sensors operatively coupled to at least one of the first and second display housings, and one or more processors operatively coupled to the first and second display devices and to the one or more position and orientation sensors.
- the one or more processors are configured to generate a first set of content items for presentation on the first display device; generate a second set of content items for presentation on the second display device; upon receiving input information from an input source, modify at least one of the first and second sets of content items; and cause the modified content items to be presented on at least one of the first and second display devices.
- Presentation of the modified content includes swapping selected content between the first and second display devices.
- each of the first and second display devices includes a touch sensitive input
- the input source is the touch sensitive input of one of the first and second display devices.
- swapping the selected content includes either (i) linearly scrolling the selected content between the first and second display devices or (ii) rotating the selected content in a clockwise or counterclockwise direction between the first and second display devices.
- the received input information identifies that an orientation of one or both of the first and second display housings has changed from a first orientation to a second orientation as detected by the one or more position and orientation sensors.
- the orientation change may indicate a physical rotation of at least one of the first and second display housings. And upon detection of the physical rotation, the modified content may be rotated in at least one of the first and second display screens.
- a computer-implemented content presentation method for use in a multi-display computer system comprises preparing, by one or more processors, selected content for presentation on a first display device of the multi-display computer system; generating, by the one or more processors, one or more control elements for presentation on a second display device of the multi-display computer system, the control elements being operatively associated with the selected content to enable manipulation or modification of the selected content in response to an input signal; and performing, by the one or more processors, a screen swapping operation to move the selected content from the first display device to the second display device, and to concurrently move the one or more control elements from the second display device to the first display device.
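At its core, the concurrent move described above is an atomic exchange of what each display shows. A minimal sketch (all names and sample values hypothetical):

```python
def swap_screens(first_display, second_display):
    """Exchange the display assignments: the selected content moves to the
    other display while its control elements move the opposite way, in one
    concurrent operation."""
    return second_display, first_display

# Selected content starts on the first display, its controls on the second.
content = {"image": "drawing.png"}            # hypothetical content item
controls = {"toolbar": ["brush", "palette"]}  # hypothetical control elements
first, second = swap_screens(content, controls)
```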
- the screen swapping operation is performed in response to either a received instruction or signal provided to the multi-display computer system, or physical rotation of one or both of the first and second display devices.
- an orientation of one or both of the selected content and the one or more control elements may be rotated in a corresponding one of the first and second display devices.
- performing the screen swapping operation includes providing an appearance of either (i) linearly exchanging the selected content and the one or more control elements or (ii) rotating the selected content and the one or more control elements in a clockwise or counterclockwise direction.
- FIGS. 1A-B illustrate an example dual display screen client device according to aspects of the disclosure.
- FIGS. 2A-G illustrate an example log-in use case according to aspects of the disclosure.
- FIGS. 3A-C illustrate an example application launch according to aspects of the disclosure.
- FIGS. 4A-4F illustrate examples of content searching and application switching and closing according to aspects of the disclosure.
- FIGS. 5A-5E illustrate operation of an example drawing application on a dual-screen arrangement according to aspects of the disclosure.
- FIGS. 6A-D illustrate a first example of display screen content switching in accordance with an exemplary embodiment.
- FIGS. 7A-G illustrate a second example of display screen content switching in accordance with an exemplary embodiment.
- FIGS. 8A-B illustrate operation of a dual screen display device in accordance with an exemplary embodiment.
- FIG. 9 is a flow diagram of an example method according to aspects of the disclosure.
- the technology relates to content presentation in a dual-screen type client device, such as a laptop, tablet, netbook or other type of portable client device.
- inputs for the UI may be presented on one or both of the display screens depending on how a user is interacting with the client device and the type(s) of program(s) being used.
- Content or information switching between the screens is provided in a seamless manner, which provides the user with an enjoyable interactive experience that can benefit productivity.
- one kind of switching is UI switching, in which icons, content, control elements, etc., are swapped between the two physical display screens. This could be done, for example, by counterclockwise or clockwise rotation of the items, or up/down scrolling of the items, between two (stationary) screens.
- the other kind of switching is physical display screen switching. For instance, when the physical display screens are moved from two vertical screens to two horizontal screens, items on at least one of the two screens are also rotated for ease of interaction. This enables an application to provide the necessary tools to the user as he or she needs them.
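One way to picture the item rotation is as a 90-degree coordinate mapping applied to each item's position when the screens turn from a vertical to a horizontal arrangement. The coordinate convention below is an assumption for illustration, not taken from the disclosure:

```python
def rotate_item_90_cw(x, y, height):
    """Map an item's (x, y) grid position to its position after the screen
    contents rotate 90 degrees clockwise. Assumed convention: origin at
    the top-left, x to the right, y downward; 'height' is the screen's
    height (in grid cells) before rotation."""
    return height - 1 - y, x
```

Applying this to every icon or control element keeps the layout upright relative to the user after the housings are repositioned.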
- Such use cases include logging into an account, application or widget selection, enhanced image or web browsing, video conferencing, and more. These types of use cases are discussed in detail below. Before that, an example multi-screen system is presented.
- FIGS. 1A and 1B illustrate an example dual screen client device 100 .
- the device 100 includes a first display housing 102 a and a second display housing 102 b .
- Each housing includes a respective display device 104 a or 104 b .
- the display devices may be TFT LCD (Thin-Film-Transistor Liquid Crystal Display) or OLED (Organic Light Emitting Diode) displays.
- each display housing 102 also includes respective user input(s) 106 a , 106 b , and display interfaces 108 a , 108 b .
- the user input(s) for each display housing may include a touch screen element such as a capacitive or resistive touch screen, as well as physical input buttons, keys, switches, dials, slides, a microphone, a mouse, a pen input, trackball, etc.
- the system may provide audio and/or sensory (e.g., tactile) feedback.
- Other components of the client device 100 are also shown in FIG. 1B . These include one or more computer processors 110 such as a central processing unit 112 and/or graphics processors 114 , as well as memory 116 configured to store instructions 118 and data 120 .
- the processors may or may not operate in parallel, and may include ASICs, controllers and other types of hardware circuitry.
- the processors are configured to receive information from a user through the user inputs 106 and user interface module 122 , and to present information to the user on the display devices 104 via the display interfaces 108 .
- User interface module 122 may receive commands from a user via the user inputs and convert them for submission to a given processor.
- Each display interface may comprise appropriate circuitry for driving the corresponding display device to present graphical and other information to the user.
- the graphical information may be generated by the graphics processors 114 , while CPU 112 manages overall operation of the client device 100 .
- Memory 116 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units.
- the memory 116 may include, for example, flash memory and/or NVRAM, and may be embodied as a hard-drive or memory card. Alternatively, the memory 116 may also include DVD, CD-ROM, write-capable, and read-only memories.
- a computer program product is tangibly embodied in an information carrier.
- the computer program product contains instructions, such as instructions 118 that, when executed by one or more processors, perform one or more methods such as those described herein.
- the information carrier is a computer- or machine-readable medium, such as memory 116 .
- although FIG. 1B functionally illustrates the processor(s), memory, and other elements of device 100 as being within the same overall block, such components may or may not be stored within the same physical housing.
- the data 120 may be retrieved, stored or modified by the processors in accordance with the instructions 118 .
- the data may be stored in computing device registers, in a relational database as a table having a plurality of different fields and records, XML documents or flat files.
- the data may also be formatted in any computing device-readable format.
- the instructions 118 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor(s).
- the instructions may be stored as computing device code on the computing device-readable medium.
- the terms “instructions” and “programs” may be used interchangeably herein.
- the instructions may be stored in object code format for direct processing by the processor(s), or in any other computing device language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below.
- the dual-display computing device 100 includes one or more communication devices for communicating with other devices and systems.
- the communication devices include one or both of wireless transceiver 124 and/or wired transceiver 126 , which may provide a local area network (LAN) connection.
- the device 100 may communicate with other remote devices via these connections using various configurations and protocols, including short range communication protocols such as near-field communication, Bluetooth, Bluetooth LE, the Internet, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing.
- the device 100 as shown includes one or more position and orientation sensors 128 .
- the position and orientation sensors 128 are configured to determine the position and orientation of client computing device 100 .
- these components may include a GPS receiver to determine the device's latitude, longitude and/or altitude as well as an accelerometer, gyroscope or another direction/speed detection device.
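As a hedged sketch of how such sensors might feed the UI (the disclosure does not specify an algorithm), the dominant axis of an accelerometer's gravity vector can classify a housing's orientation:

```python
def classify_orientation(ax, ay, az):
    """Classify a display housing's orientation from one accelerometer
    reading (m/s^2). Hypothetical heuristic: whichever axis carries the
    largest gravity component determines the orientation label."""
    axes = {"portrait": abs(ay), "landscape": abs(ax), "flat": abs(az)}
    return max(axes, key=axes.get)

classify_orientation(0.2, 9.7, 0.4)  # gravity mostly along the y axis
```

A change in the returned label over time would be one way to detect the orientation changes that trigger content rotation or screen swapping.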
- Each display housing 102 may include its own set of position and orientation sensors 128 .
- the device 100 also includes one or more camera(s) 130 for capturing still images and recording video streams, speaker(s) 132 and a power module 134 .
- FIGS. 2A-G illustrate one scenario 200 for when a user logs onto the client device 100 or logs into a particular application.
- the user may open or otherwise access the device 100 , which provides an initial login 202 on display screen 104 b , which is the display screen closest to the user.
- the initial login 202 provides one or more options for the user to log in with. For instance, as shown, different user icons are presented.
- the user may select a particular user icon, for example by pressing the display screen with his or her finger.
- the selected icon representing the user's account begins to “fly up” to display screen 104 a .
- the “fly up” process from one display screen to the other completes.
- as shown in FIG. 2E , a login entry field or box 204 is presented on display screen 104 a , while a virtual keyboard or other account entry information 206 appears on display screen 104 b .
- the login may be accepted as shown in FIG. 2F .
- the virtual keyboard element may be replaced by icons or other elements of interest 208 on the display screen 104 b , while certain information items 210 appear on the display screen 104 a .
- FIG. 2F illustrates one example where the virtual keyboard appears to slide toward the user and out of the front of the device 100 as it fades away.
- the elements 208 and items 210 of FIG. 2G may be associated with various menu components of the current program or application that the user has logged onto.
- the user may access different submenus or other aspects of the program by selecting the elements 208 on one screen, while observing program status of other items of interest on the other screen.
- the icons or other elements of interest 208 , such as control elements, are easily accessible to the user via a first, closer display screen, while content such as images, videos, etc., is presented on the second, viewing screen.
- FIGS. 3A-C illustrate another scenario 300 for launching an application or a widget.
- the user chooses one of the elements 208 on the closer display screen, e.g., display screen 104 b .
- This choice launches that particular application or widget, for instance a videoconference application, as shown in FIG. 3B .
- various icons or thumbnails 302 are presented on the closer screen while the content 304 such as the videoconference feed is presented on the other screen.
- FIG. 3C illustrates an instance where the videoconference is presented on upper screen 104 a while a spreadsheet or other information is presented on lower screen 104 b .
- thumbnails, icons or static content may be presented on one display screen while active content such as a videoconference feed is presented on the other display screen.
- FIGS. 4A-4F illustrate yet another scenario 400 involving content searching.
- FIG. 4A provides an example web browser UI 402 on screen 104 a while a virtual keyboard 404 , mousepad 406 and application-relevant icons 408 are presented on screen 104 b .
- upon entry of a request or other information, a particular piece of selected content 410 is presented on screen 104 a while other search results (e.g., thumbnail images, icons, text, hyperlinks) 412 are presented on screen 104 b , as shown in FIG. 4B .
- the other search results need not all be of the same size, and may differ in type of result (e.g., images versus text strings).
- the user may select different items in the screen 104 b for display on screen 104 a.
- FIGS. 4C and 4D show the user sliding his or her finger in a first direction (e.g., downward) along at least a portion of the screen 104 a .
- both of the display screens may transition to a different application.
- the user may alternatively close the current application by sliding his or her finger in a different direction (e.g., sideways) along at least a portion of the screen 104 a , as shown in FIGS. 4E and 4F .
- Scenario 500 provides exemplary operation of a drawing application.
- the user selects the drawing application from among a series of application displayed in overlapping fashion on display screen 104 a .
- Other ways of presenting multiple applications may be employed, e.g., in a carousel view, side by side views, stacked views, etc.
- the application is selected, e.g., by tapping on the application with the user's finger or providing verbal instructions to the device 100 to open that particular application, a graphical representation of that application appears to slide into or otherwise open up in display screen 104 b , as shown in FIG. 5B .
- selected content is presented in one of the two displays while interface elements are presented in the other display.
- FIG. 5C An example of this is shown in FIG. 5C , where an image is presented in display 104 a while a toolbar (e.g., color palate, brushes, line elements) are presented in display 104 b , which may be closer to the user than display 104 a .
- the user may flatten the upper screen so that both displays are flat and lie along the same plane.
- the user may manipulate or modify the image presented in display 104 a either by hand or using a tool such as a stylus on a touch-sensitive screen.
- FIGS. 6A-6D a first type of screen swapping 600 between the two display screens is illustrated in FIGS. 6A-6D .
- the user may instruct the client device to swap screens, for instance by pressing a button. This may be done via a “soft” button on one of the two display screens, or by pressing a physical button on one of the display housings. Alternatively, the physical button could be another actuator such as a switch, slider, etc., or could be done by voice activation or the like.
- one or more of the processors, user interface module and display interfaces causes the content of the two display screens to be swapped.
- FIG. 6D illustrate an example of counterclockwise swapping, where the appearance of the change in content gives the impress of a counterclockwise rotation of the two screens.
- the swapping could be in a clockwise direction, via a carousel-type rotation, etc.
- the result in FIG. 6D is that the toolbar (e.g., color palate, brushes, line elements) is now presented in display 104 a , while the image is presented in display 104 b (which may be closer to the user than the display 4 a ).
- FIGS. 7A-7G An example of physical display screen swapping is illustrated in scenario 700 of FIGS. 7A-7G .
- the user physically rotates both display housings, e.g., in a clockwise or counterclockwise direction, as shown in FIGS. 7B-7D .
- the rotation may stop at, for instance, 90 degrees from the prior position, which places the two display housings side by side instead of one over the other.
- the position and/or orientation sensors 128 detect the instantaneous placement of each display housing, and thus the system can determine the particular arrangement of the display screens.
- the processing system e.g., one or more of the processors, user interface module and display interfaces, causes the content of one or both of the display screens to be swapped. This is shown in FIGS.
- the content of interest e.g., active content
- the camera drawing is rotated in the left display screen by 90 degrees, or the same amount of rotation as the physical display housing was rotated.
- the editing tools of the application e.g., control elements
- the display screen including the user interface tools may rotate while the content of interest remains unrotated.
- the client device When the user is done with the application or wants to power down or shut off the client device, he or she can close it by folding one of the display housings on top of the other display housing as shown by embodiment 800 of FIGS. 8A-8B .
- the client device may be turned on, woken from sleep mode or otherwise activated by at least partly opening or separating the display housings from one another (not shown).
- content of interest e.g., selected content
- application tools or other input elements e.g., control elements
- Material including the selected content and the control elements is easily swapped between the two display screens, either by apparent rotation or movement of the presented material, or be detecting physical rotation or movement of the display housings.
- FIG. 9 is an example flow diagram 900 in accordance with some of the aspects described above that may be performed by one or more processors, either alone or in conjunction with the user interface module, display interfaces and other components of the client device.
- the process generates a first set of content items for presentation on a first display device of the multi-display computer system.
- the process generates a second set of content items for presentation on a second display device of the multi-display computer system.
- the process modifies at least one of the first and second sets of content items.
- the modified content items are presented on at least one of the first and second display devices, wherein presentation of the modified content includes swapping selected content between the first and second display devices.
- the first set of content such as an image, video or other item of interest for a given application (e.g., selected content)
- tools for that application e.g., control elements.
Abstract
Description
- This application is a continuation of U.S. application Ser. No. 14/996,346, filed Jan. 15, 2016, the entire disclosure of which is incorporated herein by reference.
- Typical laptops, tablets and similar devices have either a fixed keyboard or a user interface (UI) that presents information in a particular location all the time. For instance, icons, buttons and menu items may always be presented along the bottom of the screen of the device. And while there are some existing dual screen devices, the UIs of both screens are typically separate from one another.
- Aspects of the disclosure provide a UI in which different inputs are presented to the user on one or both of the screens depending on what the user is doing. This makes applications running on the dual screen device clearer and more explicit about their various features, and makes the applications easier to use. For instance, control elements such as buttons and menus can be brought up front to the closer (lower) screen for quick input, while active content (e.g., video chat, full image, drawings) is presented on the farther (upper) screen.
- In accordance with aspects of the disclosure, a computer-implemented content presentation method for use in a multi-display computer system is provided. The method comprises generating, by one or more processors, a first set of content items for presentation on a first display device of the multi-display computer system; generating, by the one or more processors, a second set of content items for presentation on a second display device of the multi-display computer system; upon receiving input information, the one or more processing devices modifying at least one of the first and second sets of content items; and presenting the modified content items on at least one of the first and second display devices, wherein presentation of the modified content includes swapping selected content between the first and second display devices.
- In one example, modifying at least one of the first and second sets of content items is performed in response to an input received from a user of the multi-display computer system. Here, the received input may indicate either: a change in state for a currently running application; or a change in relative physical position of one or both of the first and second display devices.
- In another example, swapping the selected content includes interchanging one or more user interface elements between the first and second display devices. In a further example, swapping the selected content includes either (i) linearly scrolling the selected content between the first and second display devices or (ii) rotating the selected content in a clockwise or counterclockwise direction between the first and second display devices. And in yet another example, at least one of the first and second sets of content items corresponds to one selected from the group consisting of: (i) a log-in interface; (ii) one or more applications; (iii) one or more widgets; (iv) thumbnail images; (v) audio content; and (vi) video content.
- In one scenario, the received input information identifies that: an input device of the multi-display computer system has been pressed; an icon presented on either the first or the second display device has been selected; or an orientation of the multi-display computer system has changed from a first orientation to a second orientation. The orientation change may indicate a physical rotation of the first and second display screens.
- In another scenario, the first set of content items are presented on the first display device and the second set of content items are presented on the second display device when the first display device is closer to the user than the second display device. And in a further scenario, the first set of content items includes one or more interactive elements configured to receive input from a user, and the second set of content items includes active content being presented to the user.
- In accordance with other aspects of the disclosure, a multi-display client computing device is provided. The device comprises a first display housing including a first display device therein, a second display housing including a second display device therein, one or more position and orientation sensors operatively coupled to at least one of the first and second display housings, and one or more processors operatively coupled to the first and second display devices and to the one or more position and orientation sensors. The one or more processors are configured to generate a first set of content items for presentation on the first display device; generate a second set of content items for presentation on the second display device; upon receiving input information from an input source, modify at least one of the first and second sets of content items; and cause the modified content items to be presented on at least one of the first and second display devices. Presentation of the modified content includes swapping selected content between the first and second display devices.
- In one example, each of the first and second display devices includes a touch sensitive input, and the input source is the touch sensitive input of one of the first and second display devices. In another example, swapping the selected content includes either (i) linearly scrolling the selected content between the first and second display devices or (ii) rotating the selected content in a clockwise or counterclockwise direction between the first and second display devices. In a further example, the received input information identifies that an orientation of one or both of the first and second display housings has changed from a first orientation to a second orientation as detected by the one or more position and orientation sensors. Here, the orientation change may indicate a physical rotation of at least one of the first and second display housings. And upon detection of the physical rotation, the modified content may be rotated in at least one of the first and second display screens.
- In accordance with further aspects of the disclosure, a computer-implemented content presentation method for use in a multi-display computer system is provided. The method comprises preparing, by one or more processors, selected content for presentation on a first display device of the multi-display computer system; generating, by the one or more processors, one or more control elements for presentation on a second display device of the multi-display computer system, the control elements being operatively associated with the selected content to enable manipulation or modification of the selected content in response to an input signal; and performing, by the one or more processors, a screen swapping operation to move the selected content from the first display device to the second display device, and to concurrently move the one or more control elements from the second display device to the first display device.
- In one example, the screen swapping operation is performed in response to either a received instruction or signal provided to the multi-display computer system, or physical rotation of one or both of the first and second display devices. Here, upon detection of the physical rotation, an orientation of one or both of the selected content and the one or more control elements may be rotated in a corresponding one of the first and second display devices.
- And in another example, performing the screen swapping operation includes providing an appearance of either (i) linearly exchanging the selected content and the one or more control elements or (ii) rotating the selected content and the one or more control elements in a clockwise or counterclockwise direction.
- FIGS. 1A-B illustrate an example dual display screen client device according to aspects of the disclosure.
- FIGS. 2A-G illustrate an example log-in use according to aspects of the disclosure.
- FIGS. 3A-C illustrate an example application launch according to aspects of the disclosure.
- FIGS. 4A-4F illustrate examples of content searching and application switching and closing according to aspects of the disclosure.
- FIGS. 5A-5E illustrate operation of an example drawing application on a dual-screen arrangement according to aspects of the disclosure.
- FIGS. 6A-D illustrate a first example of display screen content switching in accordance with an exemplary embodiment.
- FIGS. 7A-G illustrate a second example of display screen content switching in accordance with an exemplary embodiment.
- FIGS. 8A-B illustrate operation of a dual screen display device in accordance with an exemplary embodiment.
- FIG. 9 is a flow diagram of an example method according to aspects of the disclosure.
- The technology relates to content presentation in a dual-screen type client device, such as a laptop, tablet, netbook or other type of portable client device. As noted above, inputs for the UI may be presented on one or both of the display screens depending on how a user is interacting with the client device and the type(s) of program(s) being used. Content or information switching between the screens is provided in a seamless manner, giving the user a smooth interactive experience that can benefit productivity.
- Generally, there are two kinds of switching that may occur. One is UI switching, in which icons, content, control elements, etc., are swapped between the two physical display screens. This could be done, for example, by counterclockwise or clockwise rotation of the items, or by up/down scrolling of the items, between two (stationary) screens. The other kind is physical display screen switching. For instance, when the physical display screens are moved from two vertical screens to two horizontal screens, items on at least one of the two screens are also rotated for ease of interaction. This enables an application to provide the necessary tools to the user as he/she needs them.
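- The two kinds of switching described above can be sketched in code. This is an illustrative sketch only, not an implementation from the disclosure; the `Screen` model and function names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Screen:
    """Hypothetical model of one display screen and the items it presents."""
    name: str
    items: list = field(default_factory=list)

def swap_ui(a: Screen, b: Screen) -> None:
    # UI switching: icons, content and control elements are exchanged
    # between two stationary screens (rendered, e.g., as a clockwise or
    # counterclockwise rotation, or as up/down scrolling of the items).
    a.items, b.items = b.items, a.items

def on_physical_switch(screen: Screen, degrees: int) -> None:
    # Physical display screen switching: when the housings move from two
    # vertical screens to two horizontal screens, items on at least one
    # screen are rotated by the same amount for ease of interaction.
    for item in screen.items:
        item["rotation"] = (item.get("rotation", 0) + degrees) % 360

upper = Screen("upper", [{"kind": "image", "rotation": 0}])
lower = Screen("lower", [{"kind": "toolbar", "rotation": 0}])
swap_ui(upper, lower)            # upper now shows the toolbar
on_physical_switch(upper, 90)    # rotate its items after a physical move
```

Either kind of switch leaves both screens populated, so an application can keep its tools visible wherever the user is looking.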
- These features can be implemented in a variety of use cases. Such use cases include logging into an account, application or widget selection, enhanced image or web browsing, video conferencing, and more. These types of use cases are discussed in detail below. Before that, an example multi-screen system is presented.
- FIGS. 1A and 1B illustrate an example dual screen client device 100. As shown in FIG. 1A, the device 100 includes a first display housing 102 a and a second display housing 102 b. Each housing includes a respective display device 104 a, 104 b. As shown in FIG. 1B, each display housing 102 also includes respective user input(s) 106 a, 106 b, and display interfaces 108 a, 108 b. - Other components of the client device 100 are also shown in FIG. 1B. These include one or more computer processors 110 such as a central processing unit 112 and/or graphics processors 114, as well as memory 116 configured to store instructions 118 and data 120. The processors may or may not operate in parallel, and may include ASICs, controllers and other types of hardware circuitry. The processors are configured to receive information from a user through the user inputs 106 and user interface module 122, and to present information to the user on the display devices 104 via the display interfaces 108. User interface module 122 may receive commands from a user via the user inputs and convert them for submission to a given processor. Each display interface may comprise appropriate circuitry for driving the corresponding display device to present graphical and other information to the user. By way of example, the graphical information may be generated by the graphics processors 114, while CPU 112 manages overall operation of the client device 100. -
Memory 116 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units. The memory 116 may include, for example, flash memory and/or NVRAM, and may be embodied as a hard drive or memory card. Alternatively, the memory 116 may also include DVD, CD-ROM, write-capable, and read-only memories. In one implementation, a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions, such as instructions 118, that, when executed by one or more processors, perform one or more methods such as those described herein. The information carrier is a computer- or machine-readable medium, such as memory 116. Although FIG. 1B functionally illustrates the processor(s), memory, and other elements of device 100 as being within the same overall block, such components may or may not be stored within the same physical housing. - The data 120 may be retrieved, stored or modified by the processors in accordance with the instructions 118. For instance, although the claimed subject matter is not limited by any particular data structure, the data may be stored in computing device registers, in a relational database as a table having a plurality of different fields and records, in XML documents or in flat files. The data may also be formatted in any computing device-readable format. - The instructions 118 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor(s). For example, the instructions may be stored as computing device code on the computing device-readable medium. In that regard, the terms "instructions" and "programs" may be used interchangeably herein. The instructions may be stored in object code format for direct processing by the processor(s), or in any other computing device language, including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below. - As also shown in
FIG. 1B, the dual-display computing device 100 includes one or more communication devices for communicating with other devices and systems. The communication devices include one or both of wireless transceiver 124 and/or wired transceiver 126, which may provide a local area network (LAN) connection. The device 100 may communicate with other remote devices via these connections using various configurations and protocols, including short range communication protocols such as near-field communication, Bluetooth, Bluetooth LE, the Internet, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing. - In addition, the device 100 as shown includes one or more position and orientation sensors 128. The position and orientation sensors 128 are configured to determine the position and orientation of client computing device 100. For example, these components may include a GPS receiver to determine the device's latitude, longitude and/or altitude, as well as an accelerometer, gyroscope or another direction/speed detection device. Each display housing 102 may include its own set of position and orientation sensors 128. The device 100 also includes one or more camera(s) 130 for capturing still images and recording video streams, speaker(s) 132 and a power module 134.
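- Because each display housing may carry its own position and orientation sensors 128, the system can classify the current arrangement of the housings from their reported rotation. The following sketch is a hypothetical helper, not code from the disclosure; the 45-degree threshold is an assumed tolerance.

```python
def housing_arrangement(rotation_deg: float) -> str:
    """Classify the dual-screen arrangement from the housing rotation that
    the position and orientation sensors report, measured relative to the
    initial one-over-the-other position (hypothetical helper)."""
    r = rotation_deg % 360
    # Near 90 or 270 degrees the housings sit side by side; otherwise they
    # are treated as stacked one over the other.
    return "side_by_side" if min(abs(r - 90), abs(r - 270)) < 45 else "stacked"

print(housing_arrangement(0))    # stacked (one display over the other)
print(housing_arrangement(90))   # side_by_side (after a quarter rotation)
```

A per-housing reading could be averaged or compared the same way to detect partially folded postures.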
-
FIGS. 2A-G illustrate one scenario 200 in which a user logs onto the client device 100 or logs into a particular application. In this case, as shown in FIG. 2A, the user may open or otherwise access the device 100, which provides an initial login 202 on display screen 104 b, which is the display screen closest to the user. The initial login 202 provides one or more options for the user to log in with. For instance, as shown, different user icons are presented. - Then, as shown in FIG. 2B, the user may select a particular user icon, for example by pressing the display screen with his or her finger. Next, as shown in FIG. 2C, the selected icon representing the user's account begins to "fly up" to display screen 104 a. And as shown in FIG. 2D, the "fly up" process from one display screen to the other completes. - At this point, the user is able to log into his or her account. Here, as shown in FIG. 2E, a login entry field or box 204 is presented on display screen 104 a, while a virtual keyboard or other account entry information 206 appears on display screen 104 b. Once the user enters his or her account authentication information, the login may be accepted as shown in FIG. 2F. Here, as shown in FIGS. 2F and 2G, the virtual keyboard element may be replaced by icons or other elements of interest 208 on the display screen 104 b, while certain information items 210 appear on the display screen 104 a. FIG. 2F illustrates one example where the virtual keyboard appears to slide toward the user and out of the front of the device 100 as it fades away. The elements 208 and items 210 of FIG. 2G may be associated with various menu components of the current program or application that the user has logged onto. The user may access different submenus or other aspects of the program by selecting the elements 208 on one screen, while observing program status of other items of interest on the other screen. By way of example, the icons or other elements of interest 208, such as control elements, are easily accessible to the user via a first, closer display screen, while content such as images, videos, etc., are presented on the second, viewing screen. -
FIGS. 3A-C illustrate another scenario 300 for launching an application or a widget. In this case, as shown in FIG. 3A, the user chooses one of the elements 208 on the closer display screen, e.g., display screen 104 b. This choice launches that particular application or widget, for instance a videoconference application, as shown in FIG. 3B. Here, various icons or thumbnails 302, such as control elements, are presented on the closer screen while the content 304 such as the videoconference feed is presented on the other screen. FIG. 3C illustrates an instance where the videoconference is presented on upper screen 104 a while a spreadsheet or other information is presented on lower screen 104 b. Thus in this scenario, thumbnails, icons or static content may be presented on one display screen while active content such as a videoconference feed is presented on the other display screen. -
FIGS. 4A-4F illustrate yet another scenario 400 involving content searching. For instance, FIG. 4A provides an example web browser UI 402 on screen 104 a while a virtual keyboard 404, mousepad 406 and application-relevant icons 408 are presented on screen 104 b. In response to a user search, a request or other information, a particular piece of selected content 410 is presented on screen 104 a while other search results (e.g., thumbnail images, icons, text, hyperlinks) 412 are presented on screen 104 b, as shown in FIG. 4B. The other search results need not all be of the same size, and may differ in type of result (e.g., images versus text strings). The user may select different items in the screen 104 b for display on screen 104 a. - The user may switch applications by sliding the currently displayed content off of the screen. Here, for example, FIGS. 4C and 4D show the user sliding his or her finger in a first direction (e.g., downward) along at least a portion of the screen 104 a. As this is done, both of the display screens may transition to a different application. The user may alternatively close the current application by sliding his or her finger in a different direction (e.g., sideways) along at least a portion of the screen 104 a, as shown in FIGS. 4E and 4F. - Another scenario is illustrated in
FIGS. 5A-5E. Scenario 500 provides exemplary operation of a drawing application. Here, as shown in FIG. 5A, the user selects the drawing application from among a series of applications displayed in overlapping fashion on display screen 104 a. Other ways of presenting multiple applications may be employed, e.g., a carousel view, side by side views, stacked views, etc. Once the application is selected, e.g., by tapping on the application with the user's finger or providing verbal instructions to the device 100 to open that particular application, a graphical representation of that application appears to slide into or otherwise open up in display screen 104 b, as shown in FIG. 5B. - In order to enable easy operation of the application, selected content is presented in one of the two displays while interface elements are presented in the other display. An example of this is shown in FIG. 5C, where an image is presented in display 104 a while a toolbar (e.g., color palette, brushes, line elements) is presented in display 104 b, which may be closer to the user than display 104 a. As shown in FIG. 5D, the user may flatten the upper screen so that both displays are flat and lie along the same plane. And as seen in FIG. 5E, the user may manipulate or modify the image presented in display 104 a either by hand or using a tool such as a stylus on a touch-sensitive screen. - In accordance with another scenario, a first type of screen swapping 600 between the two display screens is illustrated in
FIGS. 6A-6D. As shown in FIG. 6A, the user may instruct the client device to swap screens, for instance by pressing a button. This may be done via a "soft" button on one of the two display screens, or by pressing a physical button on one of the display housings. Alternatively, the physical button could be another actuator such as a switch, slider, etc., or the instruction could be given by voice activation or the like. In response to this instruction or signal, one or more of the processors, user interface module and display interfaces causes the content of the two display screens to be swapped. FIGS. 6B-6D illustrate an example of counterclockwise swapping, where the appearance of the change in content gives the impression of a counterclockwise rotation of the two screens. Alternatively, the swapping could be in a clockwise direction, via a carousel-type rotation, etc. The result in FIG. 6D is that the toolbar (e.g., color palette, brushes, line elements) is now presented in display 104 a, while the image is presented in display 104 b (which may be closer to the user than display 104 a). - An example of physical display screen swapping is illustrated in
scenario 700 of FIGS. 7A-7G. Here, the user physically rotates both display housings, e.g., in a clockwise or counterclockwise direction, as shown in FIGS. 7B-7D. The rotation may stop at, for instance, 90 degrees from the prior position, which places the two display housings side by side instead of one over the other. The position and/or orientation sensors 128 detect the instantaneous placement of each display housing, and thus the system can determine the particular arrangement of the display screens. Either during or after the rotation by the user, the processing system, e.g., one or more of the processors, user interface module and display interfaces, causes the content of one or both of the display screens to be swapped. This is shown in FIGS. 7E-7G. Here, in this example, only the content of interest (e.g., active content), namely the camera drawing, is rotated in the left display screen by 90 degrees, or the same amount of rotation as the physical display housing was rotated. The editing tools of the application (e.g., control elements) may remain in place or may also be rotated. Alternatively, the display screen including the user interface tools may rotate while the content of interest remains unrotated. - When the user is done with the application or wants to power down or shut off the client device, he or she can close it by folding one of the display housings on top of the other display housing as shown by
embodiment 800 of FIGS. 8A-8B. The client device may be turned on, woken from sleep mode or otherwise activated by at least partly opening or separating the display housings from one another (not shown). - These various scenarios and examples show that, in accordance with certain embodiments, content of interest (e.g., selected content) is presented to the user in one display screen while application tools or other input elements (e.g., control elements) are presented to the user in the other display screen. Material including the selected content and the control elements is easily swapped between the two display screens, either by apparent rotation or movement of the presented material, or by detecting physical rotation or movement of the display housings.
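- The rotation-driven behavior of FIGS. 7E-7G, in which only the active content is rotated by the same amount as the physical housing, can be sketched as follows. The data model and function name are hypothetical illustrations, not part of the disclosure.

```python
def apply_physical_rotation(screens, rotated_deg, rotate_controls=False):
    # After the housings are physically rotated, rotate the active content
    # by the same amount; control elements may remain in place unless
    # rotate_controls is set (hypothetical data model).
    for screen in screens:
        for item in screen["items"]:
            if item["role"] == "active_content" or rotate_controls:
                item["rotation"] = (item.get("rotation", 0) + rotated_deg) % 360

left = {"items": [{"role": "active_content", "rotation": 0}]}    # the drawing
right = {"items": [{"role": "control_elements", "rotation": 0}]}  # editing tools
apply_physical_rotation([left, right], 90)
# the drawing is rotated 90 degrees; the editing tools stay unrotated
```

Passing `rotate_controls=True` would model the alternative in which the tools rotate as well.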
-
FIG. 9 is an example flow diagram 900 in accordance with some of the aspects described above that may be performed by one or more processors, either alone or in conjunction with the user interface module, display interfaces and other components of the client device. In block 902, the process generates a first set of content items for presentation on a first display device of the multi-display computer system. In block 904, the process generates a second set of content items for presentation on a second display device of the multi-display computer system. Then in block 906, upon receiving input information, the process modifies at least one of the first and second sets of content items. And in block 908, the modified content items are presented on at least one of the first and second display devices, wherein presentation of the modified content includes swapping selected content between the first and second display devices. The first set of content, such as an image, video or other item of interest for a given application (e.g., selected content), may be swapped with tools for that application (e.g., control elements). This way, depending on what the user is doing, the process enables the dual-screen display system to provide the different types of content items (e.g., items of interest versus application tools) to a user on each display device in a readily usable manner. Swapping of the content can be initiated by the user by pressing a button or other actuator on either display housing, or by physically rotating the display housings.
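- The blocks of flow diagram 900 can be sketched end to end as a single function. This is a minimal sketch only; the function name and the input vocabulary (`"button_press"`, `"housing_rotation"`) are hypothetical, not from the disclosure.

```python
def present_content(first_items, second_items, input_info=None):
    """Sketch of flow diagram 900 (blocks 902-908) for a two-display system."""
    # Blocks 902/904: generate the content item sets for each display device.
    first, second = list(first_items), list(second_items)
    # Block 906: upon receiving input information, modify the sets; here the
    # modification is swapping selected content between the displays.
    if input_info in ("button_press", "housing_rotation"):
        first, second = second, first
    # Block 908: report what each display device should now present.
    return {"display_1": first, "display_2": second}

result = present_content(["image"], ["toolbar"], input_info="button_press")
# result: {'display_1': ['toolbar'], 'display_2': ['image']}
```

With no input the generated sets are presented unchanged, matching the case where the user takes no swap action.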
- Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/716,576 US20220230603A1 (en) | 2016-01-15 | 2022-04-08 | Adaptable user interface with dual screen device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/996,346 US11335302B2 (en) | 2016-01-15 | 2016-01-15 | Adaptable user interface with dual screen device |
US17/716,576 US20220230603A1 (en) | 2016-01-15 | 2022-04-08 | Adaptable user interface with dual screen device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/996,346 Continuation US11335302B2 (en) | 2016-01-15 | 2016-01-15 | Adaptable user interface with dual screen device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220230603A1 true US20220230603A1 (en) | 2022-07-21 |
Family
ID=59311377
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/996,346 Active 2039-03-19 US11335302B2 (en) | 2016-01-15 | 2016-01-15 | Adaptable user interface with dual screen device |
US17/716,576 Abandoned US20220230603A1 (en) | 2016-01-15 | 2022-04-08 | Adaptable user interface with dual screen device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/996,346 Active 2039-03-19 US11335302B2 (en) | 2016-01-15 | 2016-01-15 | Adaptable user interface with dual screen device |
Country Status (6)
Country | Link |
---|---|
US (2) | US11335302B2 (en) |
EP (1) | EP3403174A4 (en) |
CN (1) | CN108139875A (en) |
AU (3) | AU2016386036A1 (en) |
CA (1) | CA3001741A1 (en) |
WO (1) | WO2017123320A1 (en) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD762692S1 (en) * | 2014-09-02 | 2016-08-02 | Apple Inc. | Display screen or portion thereof with graphical user interface |
WO2019045144A1 (en) * | 2017-08-31 | 2019-03-07 | (주)레벨소프트 | Medical image processing apparatus and medical image processing method which are for medical navigation device |
CN108829304A (en) * | 2018-05-29 | 2018-11-16 | 维沃移动通信有限公司 | A kind of display control method and terminal |
US10860065B2 (en) * | 2018-11-15 | 2020-12-08 | Dell Products, L.P. | Multi-form factor information handling system (IHS) with automatically reconfigurable hardware keys |
CN109814825B (en) * | 2018-12-29 | 2021-01-08 | 维沃移动通信有限公司 | Display screen control method and mobile terminal |
CN109547603B (en) * | 2018-12-29 | 2021-09-21 | 维沃移动通信有限公司 | Terminal device |
US11347367B2 (en) | 2019-01-18 | 2022-05-31 | Dell Products L.P. | Information handling system see do user interface management |
US11169653B2 (en) | 2019-01-18 | 2021-11-09 | Dell Products L.P. | Asymmetric information handling system user interface management |
US11009907B2 (en) | 2019-01-18 | 2021-05-18 | Dell Products L.P. | Portable information handling system user interface selection based on keyboard configuration |
CN109739407B (en) * | 2019-01-25 | 2020-11-17 | 维沃移动通信有限公司 | Information processing method and terminal equipment |
CN110032411A (en) * | 2019-02-28 | 2019-07-19 | 努比亚技术有限公司 | Using intelligent display control method, terminal and computer readable storage medium |
CN110191301A (en) * | 2019-05-06 | 2019-08-30 | 珠海格力电器股份有限公司 | A kind of video calling control method, device, terminal and storage medium |
US11205286B2 (en) | 2019-07-16 | 2021-12-21 | Microsoft Technology Licensing, Llc | Techniques for optimizing creation of digital diagrams |
CN112786036B (en) * | 2019-11-04 | 2023-08-08 | 海信视像科技股份有限公司 | Display device and content display method |
CN113453080A (en) * | 2020-03-27 | 2021-09-28 | 海信视像科技股份有限公司 | Display device and display method of video chat window |
WO2021189400A1 (en) * | 2020-03-27 | 2021-09-30 | 海信视像科技股份有限公司 | Display device, and display method of video chat window |
CN112004049B (en) * | 2020-08-18 | 2022-06-28 | 北京字节跳动网络技术有限公司 | Double-screen different display method and device and electronic equipment |
CN114911308A (en) * | 2021-02-09 | 2022-08-16 | 英业达科技有限公司 | Notebook computer |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040207568A1 (en) * | 2003-03-31 | 2004-10-21 | Hitachi, Ltd. | Portable information processing apparatus and method for displaying image |
US20100141681A1 (en) * | 2008-12-04 | 2010-06-10 | Nintendo Co., Ltd. | Information processing device that can be held at least in two directions for use |
US20110294580A1 (en) * | 2010-05-28 | 2011-12-01 | Namco Bandai Games Inc. | Program, information storage medium, and computer system |
US20130321340A1 (en) * | 2011-02-10 | 2013-12-05 | Samsung Electronics Co., Ltd. | Portable device comprising a touch-screen display, and method for controlling same |
US20140101577A1 (en) * | 2012-10-10 | 2014-04-10 | Samsung Electronics Co., Ltd. | Multi display apparatus and method of controlling display operation |
US20140152576A1 (en) * | 2012-10-10 | 2014-06-05 | Samsung Electronics Co., Ltd | Multi display apparatus, input pen, multi display apparatus controlling method, and multi display system |
US20140195957A1 (en) * | 2013-01-07 | 2014-07-10 | Lg Electronics Inc. | Image display device and controlling method thereof |
US20140282059A1 (en) * | 2013-03-14 | 2014-09-18 | Samsung Electronics Co., Ltd. | Electronic device and operating method thereof |
US20150212647A1 (en) * | 2012-10-10 | 2015-07-30 | Samsung Electronics Co., Ltd. | Head mounted display apparatus and method for displaying a content |
US9323446B2 (en) * | 2011-06-20 | 2016-04-26 | Samsung Electronics Co., Ltd. | Apparatus including a touch screen and screen change method thereof |
US10845843B1 (en) * | 2019-10-30 | 2020-11-24 | Pioneer Square Brands, Inc. | Case for portable electronic computing device |
Family Cites Families (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8479122B2 (en) | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US20040021681A1 (en) * | 2002-07-30 | 2004-02-05 | Liao Chin-Hua Arthur | Dual-touch-screen mobile computer |
US9092190B2 (en) | 2010-10-01 | 2015-07-28 | Z124 | Smartpad split screen |
US10152190B2 (en) | 2003-12-15 | 2018-12-11 | Open Invention Network, Llc | Systems and methods for improved application sharing in a multimedia collaboration session |
US20070182663A1 (en) * | 2004-06-01 | 2007-08-09 | Biech Grant S | Portable, folding and separable multi-display computing system |
JP2006053678A (en) * | 2004-08-10 | 2006-02-23 | Toshiba Corp | Electronic equipment with universal human interface |
JP2006053629A (en) * | 2004-08-10 | 2006-02-23 | Toshiba Corp | Electronic equipment, control method and control program |
JP5550211B2 (en) | 2005-03-04 | 2014-07-16 | アップル インコーポレイテッド | Multi-function handheld device |
JP2006311224A (en) | 2005-04-28 | 2006-11-09 | Nec Saitama Ltd | Folding mobile phone |
US7844301B2 (en) | 2005-10-14 | 2010-11-30 | Lg Electronics Inc. | Method for displaying multimedia contents and mobile communications terminal capable of implementing the same |
US20090213081A1 (en) | 2007-01-10 | 2009-08-27 | Case Jr Charlie W | Portable Electronic Device Touchpad Input Controller |
JP5140867B2 (en) | 2007-06-21 | 2013-02-13 | Necカシオモバイルコミュニケーションズ株式会社 | Electronic device and program |
CN101470469A (en) | 2007-12-25 | 2009-07-01 | 诺伯·古斯奇 | Folding computer |
US20090322690A1 (en) | 2008-06-30 | 2009-12-31 | Nokia Corporation | Screen display |
CN101477442A (en) | 2009-02-16 | 2009-07-08 | 重庆大学 | Display status switch method and structure of portable computer |
US8355755B2 (en) | 2009-03-03 | 2013-01-15 | Lg Electronics Inc. | Mobile terminal |
US8446377B2 (en) | 2009-03-24 | 2013-05-21 | Microsoft Corporation | Dual screen portable touch sensitive computing system |
US9092115B2 (en) | 2009-09-23 | 2015-07-28 | Microsoft Technology Licensing, Llc | Computing system with visual clipboard |
JP5155287B2 (en) | 2009-12-02 | 2013-03-06 | シャープ株式会社 | Operating device, electronic device equipped with the operating device, image processing apparatus, and operating method |
US20110210922A1 (en) * | 2010-02-26 | 2011-09-01 | Research In Motion Limited | Dual-screen mobile device |
JP2011203873A (en) | 2010-03-24 | 2011-10-13 | Nec Corp | Information processing apparatus, display switching method, program and recording medium |
TWI410859B (en) | 2010-05-28 | 2013-10-01 | Quanta Comp Inc | Method for swapping display contents between multiple screens |
US9405444B2 (en) * | 2010-10-01 | 2016-08-02 | Z124 | User interface with independent drawer control |
US20130050265A1 (en) * | 2011-08-31 | 2013-02-28 | Z124 | Gravity drop |
CN103250115A (en) * | 2010-11-17 | 2013-08-14 | Flex Electronics ID Co.,Ltd. | Multi-screen email client |
US9335793B2 (en) * | 2011-01-31 | 2016-05-10 | Apple Inc. | Cover attachment with flexible display |
TWI435220B (en) | 2011-03-25 | 2014-04-21 | Wistron Corp | Dual-screen portable computer and a switching method of the same |
EP2565751A1 (en) | 2011-08-31 | 2013-03-06 | Z124 | Multi-screen display control |
US20130076654A1 (en) * | 2011-09-27 | 2013-03-28 | Imerj LLC | Handset states and state diagrams: open, closed transitional and easel |
CN103218109A (en) | 2011-11-28 | 2013-07-24 | 马维尔国际有限公司 | Dual-window solution for android operating system |
JP5998849B2 (en) * | 2012-01-18 | 2016-09-28 | 株式会社リコー | Electronic device, information processing system, information management apparatus, information processing method, and information processing program |
US9029179B2 (en) * | 2012-06-28 | 2015-05-12 | Analog Devices, Inc. | MEMS device with improved charge elimination and methods of producing same |
CN203054674U (en) | 2012-12-19 | 2013-07-10 | 王文韬 | Personal computer with screen type keyboard |
JPWO2014097505A1 (en) | 2012-12-19 | 2017-01-12 | 日本電気株式会社 | Mobile terminal, display control method, and program |
KR20140098384A (en) * | 2013-01-31 | 2014-08-08 | 삼성전자주식회사 | Portable apparatus having a plurality of touch screens and sound output method thereof |
US20150363069A1 (en) | 2013-03-15 | 2015-12-17 | Nec Corporation | Display control |
EP3872599A1 (en) * | 2014-05-23 | 2021-09-01 | Samsung Electronics Co., Ltd. | Foldable device and method of controlling the same |
CN105242869A (en) | 2015-09-23 | 2016-01-13 | 宇龙计算机通信科技(深圳)有限公司 | Dual-screen interaction method for user terminal and user terminal |
US10282090B2 (en) * | 2015-09-30 | 2019-05-07 | Apple Inc. | Systems and methods for disambiguating intended user input at an onscreen keyboard using dual strike zones |
-
2016
- 2016-01-15 US US14/996,346 patent/US11335302B2/en active Active
- 2016-11-14 EP EP16885398.4A patent/EP3403174A4/en active Pending
- 2016-11-14 CN CN201680059474.0A patent/CN108139875A/en active Pending
- 2016-11-14 CA CA3001741A patent/CA3001741A1/en active Pending
- 2016-11-14 WO PCT/US2016/061806 patent/WO2017123320A1/en unknown
- 2016-11-14 AU AU2016386036A patent/AU2016386036A1/en not_active Abandoned
-
2020
- 2020-02-18 AU AU2020201149A patent/AU2020201149B2/en active Active
-
2021
- 2021-05-18 AU AU2021203197A patent/AU2021203197A1/en not_active Abandoned
-
2022
- 2022-04-08 US US17/716,576 patent/US20220230603A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
WO2017123320A1 (en) | 2017-07-20 |
AU2020201149A1 (en) | 2020-03-05 |
AU2016386036A1 (en) | 2018-04-19 |
CN108139875A (en) | 2018-06-08 |
US20170206861A1 (en) | 2017-07-20 |
AU2021203197A1 (en) | 2021-06-10 |
CA3001741A1 (en) | 2017-07-20 |
EP3403174A4 (en) | 2019-12-04 |
AU2020201149B2 (en) | 2021-06-24 |
EP3403174A1 (en) | 2018-11-21 |
US11335302B2 (en) | 2022-05-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220230603A1 (en) | Adaptable user interface with dual screen device | |
US11340759B2 (en) | User terminal device with pen and controlling method thereof | |
US9865224B2 (en) | Transparent display apparatus and display method thereof | |
KR102061881B1 (en) | Multi display apparatus and method for controlling display operation | |
KR102083937B1 (en) | Multi display device and method for providing tool thereof | |
US20160098146A1 (en) | Operating a touch screen control system according to a plurality of rule sets | |
US9361130B2 (en) | Systems, methods, and computer program products providing an integrated user interface for reading content | |
EP2720141A1 (en) | Display apparatus and method of controlling display thereof | |
US10839572B2 (en) | Contextual virtual reality interaction | |
CN110377196A (en) | Electronic equipment and its control method | |
US20130132878A1 (en) | Touch enabled device drop zone | |
US20150067540A1 (en) | Display apparatus, portable device and screen display methods thereof | |
US11099731B1 (en) | Techniques for content management using a gesture sensitive element | |
KR20170066916A (en) | Electronic apparatus and controlling method of thereof | |
US20130314348A1 (en) | Electronic device | |
TWI430651B (en) | Intelligent input system and input device and electronic equipment thereof | |
Sun et al. | Controlling smart tvs using touch gestures on mobile devices | |
US20190354263A1 (en) | Method and system for displaying and navigating through digital content using virtual sphere | |
Hu et al. | The effects of screen size on rotating 3D contents using compound gestures on a mobile device | |
US20180129466A1 (en) | Display control device and display system | |
TWI432015B (en) | Intelligent input method | |
TWI432016B (en) | Intelligent input system and method | |
Nagata et al. | Implementation and evaluation of the gesture interface for object based e-learning system | |
Nishimoto | Multi-User Interface for Scalable Resolution Touch Walls | |
TWM496166U (en) | Mobile device interface control system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA
Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:059658/0050
Effective date: 20170929
Owner name: GOOGLE INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ROJAS, BERNARDO NUNEZ;REEL/FRAME:059548/0125
Effective date: 20160115
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |