CN114585996A - Mobile applications on multi-screen computing devices

Info

Publication number
CN114585996A
Authority
CN
China
Prior art keywords
application
display
touch input
computing device
instructions
Legal status
Pending
Application number
CN202080069791.7A
Other languages
Chinese (zh)
Inventor
E·松尼诺
S·D·舍纳
S·L·戴维斯
S·E·罗德日古兹维根
O·J·C·托米
T·罗德斯
Y·S·金
P·C·帕奈
T·C·诺亚
Current Assignee
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Application filed by Microsoft Technology Licensing LLC
Publication of CN114585996A

Classifications

    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F1/1616: Portable computers with several enclosures having relative motions, with folding flat displays, e.g., laptop computers or notebooks having a clamshell configuration
    • G06F1/1641: Details related to the display arrangement, the display being formed by a plurality of foldable display components
    • G06F1/1681: Details related solely to hinges
    • G06F3/04845: GUI interaction techniques for image manipulation, e.g., dragging, rotation, expansion or change of colour
    • G06F3/0486: Drag-and-drop
    • G06F3/0488: GUI interaction techniques using a touch-screen or digitiser, e.g., input of commands through traced gestures
    • G06F3/04883: Touch-screen or digitiser gesture input for inputting data by handwriting, e.g., gesture or text
    • G06F3/04886: Touch-screen or digitiser surface partitioned into independently controllable areas, e.g., virtual keyboards or menus
    • G06F3/1423: Digital output to display device; controlling a plurality of local displays
    • G06F9/451: Execution arrangements for user interfaces
    • G06F2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G09G2354/00: Aspects of interface with display user
    • G09G2356/00: Detection of the display position w.r.t. other display screens

Abstract

Examples relate to operating a user interface of a dual-screen computing device. One example provides a computing device comprising a first portion comprising a first display and a first touch sensor and a second portion comprising a second display and a second touch sensor, the second portion connected to the first portion via a hinge, the hinge defining a seam between the first display and the second display. The computing device is configured to receive, at a first display, a touch input moving an application currently displayed on the first display but not on a second display to the second display, detect the touch input releasing the application within a predetermined area, and expand the application across the first display and the second display.

Description

Mobile applications on multi-screen computing devices
Background
Some mobile electronic devices, such as smartphones and tablets, have a monolithic handheld form factor with a display occupying substantially the entire front face of the device. Other devices, such as laptop computers, include hinges that connect the display to other hardware, such as a keyboard and a cursor controller (e.g., a touchpad).
Disclosure of Invention
Examples are disclosed that relate to operating a user interface of a multi-screen computing device. One example provides a computing device comprising a first portion comprising a first display and a first touch sensor and a second portion comprising a second display and a second touch sensor, the second portion connected to the first portion via a hinge, the hinge defining a seam between the first display and the second display. The computing device is configured to receive, at a first display, a touch input to move an application currently displayed on the first display but not on a second display to the second display, detect that the touch input releases the application within a predetermined area, expand the application across the first display and the second display such that a portion of application content is hidden behind the seam, receive the touch input to move an expanded application, and move the expanded application in a direction of the touch input to reveal at least a portion of the application content that is hidden behind the seam.
Another example provides a computing device comprising a first portion comprising a first display and a first touch sensor, a second portion comprising a second display and a second touch sensor, the second portion connected to the first portion via a hinge, the hinge defining a seam between the first display and the second display, a logic device, and a storage device holding instructions executable by the logic device. The instructions are executable to receive a touch input at the first display moving an application from the first display to the second display, move the application to the second display when the touch input releases the application within a first predetermined area, and expand the application by displaying the application across the first display and the second display when the touch input releases the application within a second predetermined area.
Another example provides a method implemented on a computing device that includes a first portion including a first display and a first touch sensor and a second portion including a second display and a second touch sensor, the second portion connected to the first portion via a hinge. The method includes displaying a first application on a first display, displaying a second application on a second display, receiving a touch input at the first display moving the first application to the second display, detecting the touch input releasing the application within a predetermined area, and superimposing the first application over the second application based at least on the touch input releasing the application within the predetermined area.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Drawings
FIG. 1 illustrates an example multi-screen computing device.
FIGS. 2-5 illustrate example poses of a dual-screen computing device.
FIGS. 6A and 6B illustrate an example expansion of an application and an example revealing of expanded application content hidden by a hinge.
FIGS. 7-9 illustrate example zooming of an application as the application is dragged in response to user touch input.
FIG. 10 shows a plot of example vertical movement of an application as a function of touch position in the Y direction.
FIG. 11 shows a plot of an example application zoom as a function of touch position in the Y direction.
FIGS. 12 and 13 illustrate example hint images displayed based on a touch input moving an application from one display to another.
FIGS. 14-17 illustrate example hint images displayed as touch input moves an application across various threshold distances.
FIG. 18 shows an example hint image displayed in response to a touch input for overlaying an application.
FIG. 19 shows an example hint image displayed in response to a touch input for swapping applications.
FIG. 20 illustrates an example user interface when an application is overlaid and then removed from the overlay within a threshold time.
FIG. 21 shows an example user interface depicting launching a new task from a link in an application when another application is opened on another display.
FIG. 22 shows an example user interface that depicts expanding an application over another application and then contracting the expanded application, thereby restoring the application beneath it.
FIG. 23 shows an example user interface that depicts expanding an application and contracting the expanded application to either display when another application is open on the other display, where a swap occurs when the expanded application is contracted to the other display.
FIG. 24 shows an example user interface that depicts expanding an application and contracting the expanded application to either display when two other applications are overlaid on the other display, where a swap occurs if the expanded application is contracted to the other display.
FIG. 25 shows an example user interface depicting an application expanding and contracting when a second application is overlaid behind the application and a third application is open on the other display.
FIG. 26 shows an example user interface depicting expanding and contracting an application when a second application is overlaid behind the application and two other applications are overlaid on the other display.
FIG. 27 illustrates an example method of operating a dual-screen computing device.
FIGS. 28A-28E illustrate another example method of operating a dual-screen computing device.
FIGS. 29A-29B illustrate example user interfaces depicting interaction with an application folder, where pinned applications automatically move based on interaction with the application folder.
FIG. 30 illustrates another example method of operating a dual-screen computing device.
FIG. 31 illustrates a block diagram of an example computing system.
Detailed Description
The disclosed examples relate to a computing device having a multi-screen configuration. FIG. 1 illustrates an example multi-screen computing device in the form of a dual-screen computing device 100. Computing device 100 includes a first portion 102 and a second portion 104 that include a first display 106 and a second display 108, respectively. The hinge 110, which is arranged between the first and second portions 102 and 104, allows the relative pose between the portions and their displays to be adjusted. Computing device 100 may be configured to determine a relative pose between first and second portions 102 and 104 (e.g., via motion sensor data from one or more motion sensors in each portion) and adjust a function of the computing device based on the relative pose.
Each of the first display 106 and the second display 108 may be a touch sensitive display with touch sensors. The touch sensor(s) may be configured to sense multiple sources of touch input (such as a user's finger and a user-manipulated stylus) and may sense multiple concurrent touches. Computing device 100 may take any suitable form, including but not limited to various mobile devices (e.g., a foldable smartphone, tablet, or laptop computer).
The first portion 102 includes a first three-dimensional pose sensor system 114 configured to provide an output indicative of a three-dimensional pose of the first portion 102, and the second portion 104 includes a second three-dimensional pose sensor system 116 configured to provide an output indicative of a three-dimensional pose of the second portion 104. In some examples, the first and second pose sensor systems 114 and 116 each include an accelerometer and a gyroscope, and optionally a magnetometer. The outputs produced by the first and second pose sensor systems 114 and 116 may be used to determine the three-dimensional poses of the first and second portions 102 and 104, respectively. In other examples, any other suitable sensor or sensors (such as optical or mechanical encoders incorporated into the hinge) may be used to sense the relative orientation of the displays.
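For illustration only, one way such per-portion sensor outputs could feed a relative-pose estimate is sketched below. The patent does not specify an algorithm; the vector type, the gravity-based approach, and the calibration here are assumptions, and a real device would fuse gyroscope (and optionally magnetometer) data because gravity alone is ambiguous when aligned with the hinge axis.

```kotlin
import kotlin.math.acos
import kotlin.math.sqrt

// Hypothetical helper types for the sketch; not from the patent.
data class Vec3(val x: Double, val y: Double, val z: Double) {
    fun dot(o: Vec3) = x * o.x + y * o.y + z * o.z
    fun norm() = sqrt(dot(this))
}

// Estimate the hinge angle from the gravity vector each portion's
// accelerometer reports, as the angle between the two measured vectors.
fun hingeAngleDegrees(g1: Vec3, g2: Vec3): Double {
    val cos = (g1.dot(g2) / (g1.norm() * g2.norm())).coerceIn(-1.0, 1.0)
    return Math.toDegrees(acos(cos))
}

fun main() {
    // Nearly flat-open device: both portions report similar gravity directions.
    println(hingeAngleDegrees(Vec3(0.0, 9.8, 0.1), Vec3(0.0, 9.8, -0.1)))
}
```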
FIGS. 2-5 illustrate various example display poses for the dual-screen computing device 100. More specifically, FIG. 2 shows a single portrait pose (where either display 106 or 108 may face the user), FIG. 3 shows a dual portrait pose, FIG. 4 shows a single landscape pose (where either display 106 or 108 may face the user), and FIG. 5 shows a dual landscape pose. Depending on the pose of computing device 100, the user interface of computing device 100 may be configured to operate in a certain manner and/or respond differently to touch inputs. For example, in FIGS. 2 and 4, the second portion 104 may be folded behind the first portion 102 via the hinge 110, or vice versa. From the perspective of a user of computing device 100, the second display 108 may not be perceptible when it is folded behind the first portion. As such, the computing device 100 may render graphical content previously displayed across both the first display 106 and the second display 108 entirely on the first display 106, may cease displaying images on the second display 108, and/or may cease receiving touch input on the second display 108. Various examples of operating a user interface and responding to touch inputs based on the pose of a computing device are described further below.
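As a minimal sketch of this pose-dependent behavior, the fragment below disables rendering and touch on the folded-back display in a single pose. The enum and state type are invented for illustration, not taken from the patent.

```kotlin
// Illustrative names only; the patent does not define these types.
enum class Pose { SINGLE_PORTRAIT, DUAL_PORTRAIT, SINGLE_LANDSCAPE, DUAL_LANDSCAPE }

class DisplayState(var renderSecond: Boolean = true, var touchSecond: Boolean = true)

fun applyPose(pose: Pose, state: DisplayState) {
    val singlePose = pose == Pose.SINGLE_PORTRAIT || pose == Pose.SINGLE_LANDSCAPE
    state.renderSecond = !singlePose // render everything on the first display
    state.touchSecond = !singlePose  // ignore touch on the folded-back display
}

fun main() {
    val state = DisplayState()
    applyPose(Pose.SINGLE_PORTRAIT, state)
    println("${state.renderSecond} ${state.touchSecond}") // false false
}
```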
In some examples, touch input may expand an application across both the first display and the second display of the dual-screen device. FIGS. 6A and 6B illustrate example user interfaces displayed when an application 600 is expanded across the first display 106 and the second display 108 of the first portion 102 and the second portion 104 of the dual-screen device 100. First, the user interface at 602 shows the application 600 open on the second display 108 of the second portion 104 while the first display 106 of the first portion 102 is unoccupied by any application. At 604, the user interface shows that a touch input 606 has dragged the application 600 from the second display 108 toward the first display 106. The touch input 606 may originate at a location along an application navigation bar 608 displayed at the bottom of the application 600. The application navigation bar 608 may be configured to receive user input related to moving, dismissing, or otherwise manipulating the application 600. The touch input 606 may release the application 600 when the application 600 is in a predetermined area, e.g., within a threshold distance from a seam 610 between the two displays (as defined by the hinge 110 connecting the two device portions). Example predetermined areas for triggering certain user interface actions are discussed in more detail with reference to FIGS. 12-17.
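A minimal sketch of the "release within a predetermined area" test follows. The patent defines the area as within a threshold distance of the seam but gives no numeric value, so the 48 px threshold and the coordinate values below are invented placeholders.

```kotlin
import kotlin.math.abs

// Placeholder threshold; the patent does not specify a value.
const val EXPAND_THRESHOLD_PX = 48

fun releasedNearSeam(releaseCenterX: Int, seamX: Int): Boolean =
    abs(releaseCenterX - seamX) <= EXPAND_THRESHOLD_PX

fun main() {
    // Seam at x = 1350 on a hypothetical combined canvas.
    println(releasedNearSeam(releaseCenterX = 1380, seamX = 1350)) // true -> expand
}
```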
The user interface at 612 shows the application 600 expanded across the two displays in response to the touch input releasing the application 600. When the application is expanded, a mask may be applied to the rendering of the application in a location corresponding to the seam 610. Such a mask may help the expanded application (and application windows that move across the seam) appear more natural than splitting the complete image between the two displays. However, the mask also results in small areas of the displayed application not being viewable. Thus, the user may wish to reveal content obscured by the seam. FIG. 6B illustrates example interactions at 614 and 616, where a user moves the application 600 (e.g., by touching and dragging) to reveal content hidden by the seam 610. Any suitable touch input may trigger offsetting the application to reveal the content behind the seam 610. As examples, in the dual portrait pose, the touch input may slide left or right from the seam or from a seam corner, touch and drag, or perform a multi-finger gesture. The same touch inputs apply in the dual landscape pose, except in the up or down direction relative to the seam. At 614, the user interface shows the touch input moving the application to the left, and at 616 the user interface shows the touch input moving the application to the right. The expanded application can thereby be moved in the direction of the touch input to reveal at least a portion of the application content that is hidden behind the seam. In either example, the user may then move the application back to the original expanded mode shown at 612, such as by tapping the revealed arrow user interface control 618 or 620 (shown at 614 and 616) or by dragging back toward the seam and releasing the application. In other examples, the mask may not be applied, and the application may instead be split into two parts that together make up the entire image. In still other examples, such a seam between the displays may not be present, or the seam may itself be configured to display content (e.g., via a flexible display over a hinge).
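For illustration, the reveal interaction can be modeled as clamping a horizontal offset of the application's render buffer to at most the seam width. This is a sketch under assumptions (one continuous buffer, a seam-wide masked strip); the class and its clamping policy are not from the patent.

```kotlin
// A minimal sketch: the expanded application renders into one continuous
// buffer, and a strip of seamWidthPx pixels is masked under the hinge.
// Dragging offsets the buffer so the hidden strip becomes visible.
class ExpandedApp(private val seamWidthPx: Int) {
    var offsetPx = 0
        private set

    // Positive dx drags content to the right, revealing columns hidden on
    // the left side of the seam; negative dx reveals the right side.
    fun drag(dxPx: Int) {
        offsetPx = (offsetPx + dxPx).coerceIn(-seamWidthPx, seamWidthPx)
    }

    // Tapping the revealed arrow control (618/620) snaps back to expanded mode.
    fun resetToExpanded() {
        offsetPx = 0
    }
}

fun main() {
    val app = ExpandedApp(seamWidthPx = 24)
    app.drag(40)          // clamped to the seam width
    println(app.offsetPx) // 24: the full hidden strip is now visible
    app.resetToExpanded()
    println(app.offsetPx) // 0
}
```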
In some examples, touch input may move an application while also zooming the application. FIGS. 7-9 illustrate examples of zooming a displayed application as the application is dragged in response to user touch input. As shown, the origin of the application's zoom may be set dynamically by the horizontal touch position. For example, FIG. 7 shows a touch input 700 starting at the lower right corner of the application, FIG. 8 shows a touch input 800 starting at the bottom middle of the application, and FIG. 9 shows a touch input 900 starting at the lower left corner of the application. In these examples, the zoom decreases based on the vertical movement distance from the bottom of the display, and the origin of the movement is based on the initial touch location of the touch input. The dynamic positioning of the zoom origin can help create the effect that the application follows the user's finger or stylus. The automatic zooming that occurs as the application moves can help better indicate to the user that the application is currently being moved while keeping the content of the application visible. Further, once released, the expansion of the application back to its original size can help better indicate to the user that the application is no longer being moved by user input. Automatically zooming out and in when moving an application may further help provide a more seamless transition of the application to a recent applications list or a task switcher (as examples) for switching between applications. For example, a preview of the recent applications list may be triggered when the gesture speed while moving the application falls within a threshold speed near zero.
In some examples, in the portrait pose, the application may be moved in response to horizontal movement and zoomed in response to vertical movement, while in the landscape pose the application may be moved in response to vertical movement and zoomed in response to horizontal movement. Further, the ratio of movement of the touch input to movement of the application on the user interface may vary as a function of touch position. FIG. 10 shows a plot of example vertical movement of an application as a function of touch position in the Y direction when the display device is in a portrait pose. In this example, from a touch Y position of pixel 720 down to pixel 400, the application's position follows the user's touch at a 1:1 ratio. Then, as the touch Y position moves from pixel 400 to pixel 100, the application Y position moves from pixel 400 to pixel 200, a ratio of less than 1:1. From a touch Y position of pixel 100 to 0, the application Y position varies linearly at an even smaller ratio relative to the touch position. In other examples, any other suitable mapping of touch movement to application movement may be used.
Likewise, the application zoom may also vary as a function of touch position. FIG. 11 shows a plot of an example application zoom as a function of touch position in the Y direction. From a touch Y position of pixel 720 to pixel 400, the application may scale linearly from 100% to a 30% scale at a first rate; from a touch Y position of pixel 400 to pixel 300, the application may scale linearly, but at a lesser rate, from a 30% to a 20% scale; and at touch Y positions smaller than pixel 300, the scale of the application may be held at 20% and thus not scaled down further. In the depicted example, movement of the application in the horizontal direction while in portrait mode may follow the user's touch at all X positions at a 1:1 ratio, and horizontal movement may not result in application zooming (as an example). Application movement in the landscape mode may also exhibit such movement-rate and/or zoom adjustments.
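The two piecewise mappings just described can be written directly from the breakpoints given for FIGS. 10 and 11. In the sketch below, the breakpoints come from the description; the endpoint of the final movement segment (the application Y position at touch Y = 0) is not given, so the 180 px value is an assumption for illustration.

```kotlin
// Piecewise touch-to-application mappings for a 720 px-tall portrait display.
fun applicationY(touchY: Float): Float = when {
    touchY >= 400f -> touchY                                 // 720..400: 1:1 tracking
    touchY >= 100f -> 200f + (touchY - 100f) * (200f / 300f) // 400..100 -> 400..200
    else -> 180f + (touchY / 100f) * 20f                     // assumed shallow tail
}

fun applicationScale(touchY: Float): Float = when {
    touchY >= 400f -> 0.30f + (touchY - 400f) * (0.70f / 320f) // 720..400: 100%..30%
    touchY >= 300f -> 0.20f + (touchY - 300f) * (0.10f / 100f) // 400..300: 30%..20%
    else -> 0.20f                                              // clamp at 20%
}

// The zoom origin follows the finger horizontally (FIGS. 7-9).
fun zoomPivotX(touchX: Float): Float = touchX

fun main() {
    for (y in listOf(720f, 400f, 300f, 100f, 0f)) {
        println("touchY=$y -> appY=${applicationY(y)} scale=${applicationScale(y)}")
    }
}
```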
In the dual portrait and dual landscape poses, the movement of an application from one display to the other may be accompanied by hint images that are displayed to indicate how the application may move or behave based on the touch input. FIGS. 12 and 13 show example hint images 1200 and 1300, respectively, that may be displayed when moving an application in the dual portrait and dual landscape poses. The illustrated example hint images indicate that, if the touch input currently dragging the application is released at the current location, the application will be positioned or moved to the display on which the hint image is displayed.
FIGS. 14-17 illustrate example hint images displayed as touch input moves an application across various threshold distances. In these figures, a transition zone 14 occupies a portion of the first display and a portion of the second display. FIG. 14 shows an example in which a touch input drags an application from the second display to the first display. In this figure, a first portion 144 of the application, defined from the left edge of the application to example line A, has not yet passed a first boundary 145 of the transition zone 14, and no hint image is displayed, indicating that if the touch input releases the application at that location, the application will be displayed again at its original location without any movement.
FIG. 15 shows that the application has been moved far enough that the first portion 144 of the application has crossed the first boundary 145 of the transition zone 14. In response, a hint image is displayed on the first display indicating that the application will be moved to the first display upon release. FIG. 16 shows that the application has been moved such that a larger second portion 146 of the application, defined from the left edge of the application to example line B, has passed the first boundary 145 of the transition zone 14, and in response a hint image is displayed across the two displays indicating that the application will be expanded across the two displays upon release. FIG. 17 shows that the application has been moved such that the larger second portion 146 of the application has passed a second boundary 147 of the transition zone 14, and in response a hint image is displayed on the first display indicating that the application will be moved to the first display upon release. Thus, more than one predetermined area may trigger the same action.
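One way to read the hint selection in FIGS. 14-17 is as a comparison of the application's portion edges against the zone boundaries. In this hedged sketch, lineA and lineB are the screen X coordinates of the right edges of portions 144 and 146, b145/b147 are the boundary coordinates, and the convention (smaller X = further left, app dragged leftward toward the first display) is an assumption.

```kotlin
enum class Hint { NONE, MOVE_TO_FIRST, EXPAND_ACROSS_BOTH }

fun hintFor(lineA: Int, lineB: Int, b145: Int, b147: Int): Hint = when {
    lineB <= b147 -> Hint.MOVE_TO_FIRST      // FIG. 17: portion 146 past boundary 147
    lineB <= b145 -> Hint.EXPAND_ACROSS_BOTH // FIG. 16: portion 146 past boundary 145
    lineA <= b145 -> Hint.MOVE_TO_FIRST      // FIG. 15: portion 144 past boundary 145
    else -> Hint.NONE                        // FIG. 14: release snaps back
}

fun main() {
    // Dragged so portion 146 sits inside the zone (FIG. 16 geometry).
    println(hintFor(lineA = 500, lineB = 700, b145 = 800, b147 = 300))
}
```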
FIG. 18 illustrates an example hint image displayed in response to a touch input moving an application to a display currently occupied by another application. In this example, the touch input moves the application from the second display to the first display, but the first display is already occupied by another application. Upon dragging the application a threshold distance (such as the distances shown in FIGS. 15 and 17) or into a predetermined area, the other application on the left may be scaled down slightly and grayed out as a hint image, indicating that the dragged application will be overlaid on top of the other application upon release, which in some examples may cause the other application to close.
FIG. 19 shows an example hint image displayed in response to a touch input for swapping applications. Here, the application has been dragged such that the first portion 144 of the application has passed the example swap threshold line labeled line C. Alternatively, the application may be dragged such that a distal edge 1902 of the application passes the first boundary 145 into the transition zone 14. In response to either, a hint image is displayed that shows the application on the left at a smaller scale and grayed out on the second display, indicating that the application on the right will be swapped with the application on the left upon release. It will be appreciated that in some examples the hint image may not be displayed, and the same action may be applied upon release within the predetermined area whether or not the hint image is displayed. Moreover, other suitable hint images may be used in other examples.
As described above, in some examples, a first application is superimposed over a second application when the first application moves from a first display to a second display that is already occupied by the second application. In some such examples, the second application may remain behind the first application in a suspended state indefinitely, until the first application is removed or dismissed. In other such examples, the second application may be automatically dismissed. In still other examples, the second application may remain behind the first application for a predetermined threshold amount of time, after which the second application may be dismissed and may then be accessed from the recent applications list. Within the threshold amount of time, the user may undo the overlay to reveal and restore the second application. FIG. 20 illustrates an example user interface when an application is overlaid and then removed within a threshold time. At 2002, application 1 is on the left display and application 2 is on the right display. At 2004, the touch input moves application 2 to the left display onto application 1, and application 1 is shown smaller and transparent as an example hint image indicating the overlay and possible application dismissal. At 2006, application 2 has been superimposed over application 1. However, before the threshold amount of time has elapsed, the touch input moves application 2 back to the right display, as shown at 2008, thereby revealing application 1 before application 1 is dismissed. After the threshold amount of time, the dismissed application may be restored from, for example, the recent applications list. Any suitable threshold time may be used. In some examples, the threshold time may be between five and twenty seconds, and in a more specific example may be ten seconds. Further, in some examples, an application that has become overlaid under another application may instead be transferred automatically (immediately or after a predetermined threshold time) to the recent applications list for later retrieval by the user.
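A minimal sketch of this timed-overlay behavior follows, using the ten-second value from the more specific example. The class design and the use of a java.util.Timer scheduler are illustrative assumptions, not the patent's implementation.

```kotlin
import java.util.Timer
import java.util.TimerTask
import kotlin.concurrent.schedule

class OverlayTimer(private val dismissAfterMs: Long = 10_000) {
    private val timer = Timer(true)
    private var pending: TimerTask? = null

    // Called when a second application becomes covered; if the countdown
    // fires, the covered app moves to the recent-applications list.
    fun onCovered(dismiss: () -> Unit) {
        pending?.cancel()
        pending = timer.schedule(dismissAfterMs) { dismiss() }
    }

    // Called when the top app is moved away in time: the covered
    // application is revealed and restored instead of dismissed.
    fun onUncovered() {
        pending?.cancel()
        pending = null
    }
}
```

In use, onCovered would be invoked when the drag at 2004-2006 completes, and onUncovered when the top application is moved back, as at 2008.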
In some examples, the user may trigger the launch of a new application, or a new instance of the same application, from a currently open application. When in the dual portrait or dual landscape pose, a new task launched from a first application may be displayed automatically on the display opposite the first application if that display is unoccupied.
In some cases, the user may attempt to launch a new task from a currently open application when the opposite display is already occupied. FIG. 21 shows an example user interface depicting launching a new task from a link in an application when another application is open on the other display. At 2102, the left display is occupied by application 1, which contains the link, and the right display is occupied by application 2. At 2104, the user taps the link, which launches a new application, application 3. Because the adjacent display is occupied, the new application 3 opens on the same display as application 1, overlaying application 1 and suspending application 1, as shown at 2106. As application 3 launches and expands to cover the left display, application 1 may be displayed as if it is receding and/or shrinking. At 2108, application 3 has been superimposed over application 1. In this case, the overlay may not result in automatic dismissal of application 1. Application 1 may be restored, for example, when the user retrieves it from the recent applications list (as described above), or when user touch input dismisses application 3 or moves application 3 to the adjacent display.
Similarly, moving and expanding applications may behave differently depending on whether the display(s) are already occupied. FIGS. 22-26 illustrate various examples of expanding, overlaying, swapping, and dismissing applications in response to movement of one application when one or more displays are already occupied by one or more other applications. FIG. 22 shows an example user interface that depicts expanding an application over another application and then contracting the expanded application, thereby restoring the application beneath it. At 2202, application 1 is displayed on the first display and application 2 is displayed on the second display. At 2204, user touch input moves application 1 to the seam and releases it to expand application 1 across the two displays, superimposing application 1 over application 2 on the second display, as shown at 2206. At 2208, user touch input contracts application 1 and moves application 1 back to the first display. At 2210, application 2 is revealed and restored.
FIG. 23 shows the expansion and contraction of an application when another application is open on the other display, and also shows the different results of contracting application 1 to different screens. At 2302, application 1 is displayed on the first display and application 2 is displayed on the second display. At 2304, application 1 is shown expanded across the two displays, e.g., after touch input drags application 1 and releases application 1 near the seam, thereby superimposing at least a portion of application 1 over application 2 on the second display. The user may contract application 1 back to the first display, thereby restoring application 2 on the second display, as shown at 2306. Alternatively, the user may contract application 1 to the second display. Application 1 may be automatically swapped with application 2 such that application 2 is displayed on the first display, as shown at 2308, rather than application 1 being superimposed over application 2. In other examples, the swap may be triggered when the expanded application is contracted by moving it to a "swap threshold region" within the outer edge of the display, such as shown in FIG. 19. If the contracted application is released outside of the swap threshold region, this may instead result in a persistent overlay.
FIG. 24 illustrates expanding and contracting an application when two other applications are overlaid on the other display. At 2402, application 1 is displayed on the first display and application 2 is displayed on the second display, overlaid over application 3. At 2404, user touch input moves the application to the seam, and upon release application 1 expands across the two displays, overlaying application 2 and application 3 on the second display, as shown at 2406. At 2408, user input contracts the expanded application 1, but the result differs depending on which side application 1 is contracted to. At 2410, application 1 is shown contracting to the first display, thereby restoring the original arrangement. However, as shown at 2412, when the expanded application is contracted to the second display, where the overlaid applications were initially located, the overlaid application exchanges locations with the expanded application. As described above, in other examples, a swap may be triggered when the expanded application is contracted to a "swap threshold region" within the outer edge of the display, and an overlay may result when the expanded application is released outside of the swap threshold region.
FIG. 25 illustrates expanding and contracting an application when a second application is overlaid behind the application and a third application is open on the other display. At 2502, application 1 is superimposed over application 2 on the first display, and application 3 is displayed on the second display. At 2504, user touch input moves application 1 to the seam. Application 1 thus expands across the two displays, overlaying application 2 on the first display and application 3 on the second display, as shown at 2506. When the expanded application is contracted, as shown at 2508, the application can be overlaid on top of the second application or the third application, depending on which side the application is contracted to. At 2510, application 1 is contracted to the first display, thereby restoring application 3 on the second display. At 2512, application 1 is contracted to the second display, revealing application 2 on the first display while application 1 is superimposed over application 3 on the second display.
FIG. 26 illustrates expanding and contracting an application when a second application is overlaid behind the application and two other applications are overlaid on the other display. At 2602, application 1 is superimposed over application 2 on the first display, and application 3 is superimposed over application 4 on the second display. At 2604, user touch input moves application 1 toward the seam. When application 1 is expanded, application 1 remains superimposed over application 2 on the left display, and is also superimposed over application 3 and application 4 on the right display, as shown at 2606. At 2608, user touch input then contracts application 1. When the user contracts application 1 to the left display, application 1 is overlaid back over application 2 on the left display at 2610, while application 3 and application 4 remain overlaid on the right display. In contrast, when the user contracts application 1 to the right display, application 1 is superimposed over application 3, as shown at 2612, thereby dismissing application 4. This may occur, for example, where settings specify that only two applications can be superimposed on a display at any time (other than while an application is expanded). In other examples, more than two applications may be overlaid on each display at a time.
FIG. 27 shows an example method 2700 of operating a dual-screen computing device. The method 2700 includes, at 2702, displaying a first application on a first display of the dual-screen computing device and, at 2704, displaying a second application on a second display of the dual-screen computing device. The method 2700 further includes, at 2706, receiving a touch input at the first display that opens a third application while the first application is displayed on the first display, e.g., from a link within the first application. At 2708, the method 2700 includes displaying the third application superimposed over the first application. At 2710, the method 2700 includes receiving a touch input to close the third application and, in response, closing the third application and displaying the first application on the first display. In this manner, how the computing device opens a new application from a currently running application may be based on whether any applications are already open on the first display or the second display.
FIGS. 28A-28E illustrate another example method 2800 of operating a dual-screen computing device, and more particularly, various manners of displaying an application based on the application being moved to different predetermined areas of a display. Method 2800 includes, at 2802, receiving, at a first display, a touch input dragging a first application from the first display to a second display. In some examples, this may include receiving the touch input at an application navigation bar (e.g., displayed at the bottom, side, or other suitable location on the display, as shown in FIG. 6A). At 2804, a size of the first application may be scaled based at least in part on a direction of movement of the touch input as the first application is dragged. At 2806, method 2800 includes displaying a hint indicating where the application will move based at least in part on the touch input dragging the first application into a predetermined area. For example, a hint image may be displayed indicating that the first application is to be expanded, moved, swapped, or overlaid, i.e., indicating the result of the movement if the touch input releases the first application at its current location.
From 2802, the method 2800 may continue to 2808 of FIG. 28A, or to FIG. 28C, 28D, or 28E, which illustrate different results depending on where the first application is released.
For example, method 2800 may include, at 2808, detecting that the touch input releases the first application within a first predetermined area. This may include, at 2810, detecting that the touch input releases the first application within a threshold distance of a device seam in a hinge region of the computing device. In other examples, such a seam may not be present. At 2812, the method 2800 includes expanding the application across the first display and the second display such that a portion of the application content is hidden behind the seam. Expanding may include, at 2814, applying a mask to the rendering of the first application in a location corresponding to the seam between the first display and the second display. In other examples, the image of the first application may be divided such that the full image is split between the two displays.
Further, expanding can include, at 2816, superimposing a portion of the expanded first application over a second application on a second display, wherein the second application is open on the second display.
Continuing with FIG. 28B, the method 2800 further includes, at 2818, receiving a touch input to move the expanded first application. The method 2800 further includes, at 2820, moving the expanded application in a direction of the touch input to reveal at least a portion of the application content that is hidden behind the seam. Further, the method 2800 includes, at 2822, receiving a touch input to move and contract the expanded application to either of the first display and the second display, and displaying the first application on one of the first display and the second display based on the touch input. For example, a drag-and-release touch input or a swipe (fling) gesture can contract the first application to the first display, and in response the first application can be moved to the first display. Where the first application was overlaid over a second application, moving the first application may reveal the second application, as shown at 2824. As another example, the touch input may contract the first application onto the second display such that the first application remains superimposed over the second application, as shown at 2826, which in some examples may cause the second application to close, e.g., immediately or after a threshold amount of time. As yet another example, the touch input may contract the first application onto the second display such that the first application exchanges locations with the second application, rather than being overlaid over the second application, as shown at 2828.
Where the touch input releases the first application in an area other than the first predetermined area, a result other than expanding may occur. For example, FIG. 28C shows, at 2830, detecting that the touch input releases the first application within a second predetermined area. In response, the first application is moved to the second display, as shown at 2832, rather than being expanded across both displays. This action may also result from a fling gesture in which the touch input moves from one of the first and second displays to the other at a speed greater than a threshold speed and/or an acceleration greater than a threshold acceleration, also shown at 2830. Moving the first application to the second display may include, at 2834, overlaying the first application over a second application on the second display. In some examples, the computing device may close the second application after a threshold amount of time of the first application being overlaid over the second application, as shown at 2836. This may allow the user to move the overlaid first application back to the first display to reveal the second application on the second display before the second application is closed. Thus, the method 2800 further includes, at 2838, receiving a touch input moving the first application back to the first display and, in response, displaying the first application on the first display. Where the first application was superimposed over the second application and is moved within the threshold amount of time, this may also include displaying the second application on the second display.
In another example response to the touch input received at 2802, FIG. 28D shows, at 2840, detecting that the touch input releases the first application within a third predetermined area and, in response, swapping the first application on the first display with the second application on the second display. As yet another example response to the touch input of 2802, FIG. 28E shows, at 2842, detecting that the touch input releases the first application within a fourth predetermined area and, in response, closing the first application. The touch input may drag and release the first application in any other suitable predetermined area to trigger other corresponding actions, such as moving the application to a recent applications list, and so on.
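Taken together, the branches of method 2800 amount to a dispatch on the release area. The sketch below abstracts the area geometry into an enum; all names are illustrative, and the step numbers in the comments follow the (partly inferred) numbering above.

```kotlin
enum class ReleaseArea { NEAR_SEAM, OPPOSITE_DISPLAY, SWAP_ZONE, DISMISS_ZONE, NONE }
enum class Action { EXPAND, MOVE, SWAP, CLOSE, SNAP_BACK }

fun dispatch(area: ReleaseArea, isFling: Boolean): Action = when {
    isFling -> Action.MOVE                               // fast swipe moves (2830)
    area == ReleaseArea.NEAR_SEAM -> Action.EXPAND       // first area (2808)
    area == ReleaseArea.OPPOSITE_DISPLAY -> Action.MOVE  // second area (2830)
    area == ReleaseArea.SWAP_ZONE -> Action.SWAP         // third area (2840)
    area == ReleaseArea.DISMISS_ZONE -> Action.CLOSE     // fourth area (2842)
    else -> Action.SNAP_BACK                             // no area: restore position
}

fun main() {
    println(dispatch(ReleaseArea.NEAR_SEAM, isFling = false)) // EXPAND
}
```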
FIGS. 29A-29B illustrate example user interfaces depicting interaction with an application folder in the dual portrait pose of the dual-screen device 100. At 2902, the user interface shows a pinned application bar 2904 on the first display 106 of the first portion 102 and a pinned application bar 2906 on the second display 108 of the second portion 104. The user interface at 2902 also shows a touch input 2908 tapping an application folder 2910 on the first display 106. At 2912, the application folder 2910 is opened on the first display 106. The open application folder 2910 may be at least partially transparent so that the content behind it remains visible. In addition, the pinned applications on the pinned application bar 2904 of the first display 106 are shifted to the pinned application bar 2906 of the second display 108. In some examples, the shift may be animated such that the user sees the pinned applications transition to the other display. It will be appreciated that shifting of pinned applications can occur whenever an application or system component (e.g., a folder) occupies either screen in a dual-display pose. At 2914, a touch input 2916 performed on the second display 108 drags from left to right, which moves the items displayed behind the application folder 2910 on the first display 106 to the second display 108. At any time, the user may close the application folder 2910, for example, by tapping the "X" icon shown in the folder.
Continuing with FIG. 29B, at 2918, the application folder 2910 is opened on the second display 108, such as after a touch input moves the application folder from the first display 106 to the second display 108, or after touch input has closed the application folder 2910 on the first display 106 and reopened the application folder 2910 on the second display 108. The pinned applications previously on the pinned application bar 2906 of the second display 108 are shifted to the pinned application bar 2904 of the first display 106. Similarly, a touch input 2920 may be performed from right to left to shift the content behind the application folder 2910 on the second display 108 to the first display 106, similar to that shown at 2914. At 2924, while the application folder 2910 is open, a touch input 2926 may move an application 2928 that is not currently in the application folder 2910 into the folder 2910. It will be understood that pinned applications may also be shifted in response to opening an application on one display, and shifting is not limited to the opening of an application folder. This behavior may allow pinned applications to remain visible to the user.
In some examples, when no application or system component (e.g., a folder) is open on the first display 106 or the second display 108, the user may also shift pinned applications from the pinned application bar 2904 on the first display 106 to the pinned application bar 2906 on the second display 108 via a rightward swipe/fling gesture within a threshold area of the pinned application bar 2904 or the pinned application bar 2906. Similarly, the user may perform a leftward swipe gesture on the pinned application bar 2906 or the pinned application bar 2904 to shift pinned applications previously on the pinned application bar 2906 of the second display 108 to the pinned application bar 2904 of the first display 106. This gives the user control over which display a pinned application is launched on. It will be appreciated that these gestures, like the other example windowing gestures described above, are also applicable in the dual landscape pose, and that the gesture directions described are relative.
FIG. 30 illustrates an example method 3000 of operating a dual-screen computing device. Method 3000 includes, at 3002, displaying one or more applications in a pinned application bar on each of the first display and the second display. Method 3000 further includes, at 3004, receiving a touch input to open an application or folder on one of the first display and the second display. In response, method 3000 includes, at 3006, displaying the opened application or folder on one of the first display and the second display and shifting the pinned applications on that one of the first display and the second display to the other of the first display and the second display. For example, if a touch input opens an application on the first display, pinned applications in the pinned application bar on the first display may be shifted to the pinned application bar of the second display. Likewise, if the touch input opens an application on the second display, pinned applications in the pinned application bar on the second display may be shifted to the pinned application bar of the first display. It will be appreciated that this may also apply when opening an application folder, as described with reference to FIGS. 29A-29B.
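An illustrative sketch of this pinned-bar shift follows. The data model (two mutable lists of app names) is an assumption made for the example, not the patent's design.

```kotlin
// Hypothetical pinned-bar model: one list per display.
data class PinnedBars(
    val first: MutableList<String> = mutableListOf(),
    val second: MutableList<String> = mutableListOf(),
)

// Opening an application or folder on one display pushes that display's
// pinned applications onto the other display's bar (method 3000, step 3006).
fun onOpened(openedOnFirstDisplay: Boolean, bars: PinnedBars) {
    if (openedOnFirstDisplay) {
        bars.second.addAll(bars.first)
        bars.first.clear()
    } else {
        bars.first.addAll(bars.second)
        bars.second.clear()
    }
}

fun main() {
    val bars = PinnedBars(mutableListOf("Mail", "Maps"), mutableListOf("Camera"))
    onOpened(openedOnFirstDisplay = true, bars)
    println(bars) // first=[], second=[Camera, Mail, Maps]
}
```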
It will be understood that the various user interface examples described herein may be applicable to any suitable multi-display system, including display systems other than mobile phones, such as multi-monitor display systems for desktop computers, surface-mounted multi-displays, virtual and/or augmented reality display systems, heads-up displays, projection display systems, and the like. Further, although described above in the context of touch inputs, it will be understood that any of the above-described touch inputs and gestures may also be input via a suitable input device (e.g., a mouse controlling a cursor).
In some embodiments, the methods and processes described herein may be bound to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as computer applications or services, Application Programming Interfaces (APIs), libraries, and/or other computer program products.
FIG. 31 schematically illustrates a non-limiting embodiment of a computing system 3100 that can perform one or more of the methods and processes described above. The computing system 3100 is shown in simplified form. The computing system 3100 may include the computing device 100 described above and illustrated in fig. 1. The computing system 3100 may take the form of: one or more personal computers, server computers, tablet computers, home entertainment computers, network computing devices, gaming devices, mobile computing devices, mobile communication devices (e.g., smartphones), and/or other computing devices, as well as wearable computing devices such as smart watches and head-mounted augmented reality devices.
The computing system 3100 includes a logic device 3102 and a non-volatile storage device 3104. Computing system 3100 may optionally include a display subsystem 3106, an input subsystem 3108, a communication subsystem 3110, and/or other components not shown in fig. 31.
Logic device 3102 includes one or more physical devices configured to execute instructions. For example, the logic device 3102 may be configured to execute instructions that are part of one or more applications, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.
The logic device 3102 may include one or more physical processors (hardware) configured to execute software instructions. Additionally or alternatively, the logic device 3102 may include one or more hardware logic circuits or firmware devices configured to execute hardware-implemented logic or firmware instructions. The processors of logic device 3102 may be single-core or multi-core, and the instructions executed thereon may be configured for serial, parallel, and/or distributed processing. Individual components of logic device 3102 may optionally be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic device 3102 may be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration. It will be appreciated that in such a scenario, these virtualized aspects may run on different physical logic processors of a variety of different machines.
The storage device 3104 may be a non-volatile storage device. The non-volatile storage device 3104 includes one or more physical devices configured to hold instructions executable by the logic device 3102 to implement the methods and processes described herein. When such methods and processes are implemented, the state of the non-volatile storage device 3104 may be transformed, e.g., to hold different data.
The non-volatile storage device 3104 may include removable and/or built-in devices. The non-volatile storage device 3104 may include optical memory (e.g., CD, DVD, HD-DVD, Blu-ray disc, etc.), semiconductor memory (e.g., ROM, EPROM, EEPROM, flash memory, etc.), and/or magnetic memory (e.g., hard disk drive, floppy disk drive, tape drive, MRAM, etc.), or other mass storage device technology. The non-volatile storage device 3104 may include non-volatile, dynamic, static, read/write, read-only, sequential-access, location-addressable, file-addressable, and/or content-addressable devices. It will be appreciated that the non-volatile storage device 3104 is configured to hold instructions even when power to the non-volatile storage device 3104 is cut off.
In other examples, the computing system 3100 may include volatile memory, which may comprise physical devices that include random access memory. Volatile memory is typically used by the logic device 3102 to temporarily store information during processing of software instructions. It will be appreciated that volatile memory typically does not continue to store instructions when power is removed from the volatile memory.
Aspects of the logic device 3102 and/or the storage device 3104 may be integrated together into one or more hardware logic components. Such hardware logic components may include, for example, Field Programmable Gate Arrays (FPGAs), program and application specific integrated circuits (PASIC/ASIC), program and application specific standard products (PSSP/ASSP), systems on a chip (SOC), and Complex Programmable Logic Devices (CPLDs).
When included, display subsystem 3106 may be used to present a visual representation of data held by storage device 3104. The visual representation may take the form of a Graphical User Interface (GUI). Because the methods and processes described herein change the data held by the non-volatile storage device, and thus transform the state of the non-volatile storage device, the state of display subsystem 3106 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 3106 may include one or more display devices utilizing virtually any type of technology. Such display devices may be incorporated in a shared enclosure with the logic device 3102, volatile memory and/or non-volatile storage device 3104, or such display devices may be peripheral display devices.
When included, input subsystem 3108 may include or interface with one or more user input devices such as a keyboard, mouse, touch screen, or game controller. In some embodiments, the input subsystem may include or interface with selected Natural User Input (NUI) components. Such components may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on-board or off-board. Example NUI components may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; an electric field sensing component for assessing brain activity; and/or any other suitable sensor.
When included, the communication subsystem 3110 may be configured to communicatively couple various computing devices described herein with each other and with other devices. Communication subsystem 3110 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network or a wired or wireless local or wide area network (such as HDMI over a Wi-Fi connection). In some embodiments, the communication subsystem may allow computing system 3100 to send and/or receive messages to and/or from other devices via a network such as the internet.
Another example provides a computing device comprising:
a first portion including a first display and a first touch sensor; and
a second portion including a second display and a second touch sensor, the second portion connected to the first portion via a hinge, the hinge defining a seam between the first display and the second display.
The computing device further includes a logic device and a storage device holding instructions executable by the logic device to:
receiving, at the first display, a touch input moving an application currently displayed on the first display but not the second display to the second display;
detecting that the touch input releases the application within a predetermined area;
expanding the application across the first display and the second display such that a portion of application content is hidden behind the seam;
receiving a touch input moving the expanded application; and
moving the expanded application in a direction in which the touch input moves the expanded application to reveal at least a portion of the application content that is hidden behind the seam.
The instructions are additionally or alternatively executable to expand the application across the first display and the second display by applying a mask to a rendering of the displayed application in a location corresponding to the seam.
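One way to picture the seam mask is as a band of content that is laid out but never rendered; panning the expanded application changes which band falls behind the seam. The plain-Kotlin sketch below computes that hidden band under an assumed geometry (all widths and names are illustrative, not taken from the patent).

    data class SpannedLayout(val firstWidth: Int, val seamWidth: Int, val secondWidth: Int)

    // Horizontal range of application content currently hidden behind the seam,
    // given how far the content has been panned to the right.
    fun hiddenRange(layout: SpannedLayout, contentOffsetX: Int): IntRange {
        val seamStart = layout.firstWidth - contentOffsetX
        return seamStart until (seamStart + layout.seamWidth)
    }

    fun main() {
        val layout = SpannedLayout(firstWidth = 720, seamWidth = 40, secondWidth = 720)
        println(hiddenRange(layout, 0))   // 720..759: the band under the seam at rest
        println(hiddenRange(layout, 40))  // 680..719: panning right 40 px reveals the original band
    }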
The instructions are additionally or alternatively executable to:
receiving a touch input to move the expanded application to one of the first display and the second display; and
displaying the expanded application on one of the first display and the second display.
The instructions are additionally or alternatively executable to:
displaying one or more applications in a pinned application bar on each of the first display and the second display;
receiving a touch input to open an application folder on one of the first display and the second display, and in response, displaying the application folder on the one of the first display and the second display; and
shifting a pinned application on the one of the first display and the second display to the other of the first display and the second display.
The predefined area may be a first predefined area, and the instructions may additionally or alternatively be executable to:
detecting a touch input moving the application to the second display and releasing the application within a second predefined area; and
moving the application to the second display.
The application may be a first application, and the instructions may additionally or alternatively be executable to:
displaying a second application on the second display;
upon detecting that the touch input releases the first application within the second predefined area, overlaying the first application over the second application on the second display.
The instructions are additionally or alternatively executable to close the second application after the first application has been overlaid over the second application for a threshold amount of time.
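Such a threshold-time close could be implemented as a simple cancellable delayed task. The sketch below assumes the kotlinx.coroutines library; the 5-second default and all function names are invented for illustration.

    import kotlinx.coroutines.CoroutineScope
    import kotlinx.coroutines.Job
    import kotlinx.coroutines.delay
    import kotlinx.coroutines.launch

    // Schedules the occluded (second) application to close once the overlay has
    // persisted for thresholdMs. Cancelling the returned Job aborts the close,
    // e.g., if the user drags the first application away again.
    fun CoroutineScope.scheduleOccludedClose(
        isStillOverlaid: () -> Boolean,
        close: () -> Unit,
        thresholdMs: Long = 5_000,
    ): Job = launch {
        delay(thresholdMs)
        if (isStillOverlaid()) close()
    }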
The application may be a first application, and the instructions may additionally or alternatively be executable to:
displaying a second application on the second display;
detecting that a touch input releases the application within a second predefined area; and
swapping the first application on the first display with the second application on the second display.
The application may be a first application, and the instructions may additionally or alternatively be executable to:
displaying a second application on the second display, wherein expanding the first application across the first display and the second display comprises overlaying a portion of the first application over the second application on the second display;
receiving a touch input that shrinks the first application; and
displaying the first application on the first display and the second application on the second display.
The predefined area may be a first predefined area, and the instructions may additionally or alternatively be executable to:
detecting that a touch input releases the application within a second predefined area; and
closing the application.
The instructions are additionally or alternatively executable to:
detecting a touch input comprising a swipe gesture; and
moving the application to the second display.
The instructions are additionally or alternatively executable to:
scaling a size of the application based at least in part on a direction of movement of the touch input.
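One plausible, purely illustrative mapping of such scaling: grow the dragged application's preview as the drag direction carries it toward the seam, hinting that release will expand it.

    // Preview scale grows from 1.0x to 1.25x as the drag progresses toward the
    // seam. The 300 px travel distance and the 1.25x ceiling are invented values.
    fun previewScale(deltaTowardSeamPx: Float, maxTravelPx: Float = 300f): Float {
        val t = (deltaTowardSeamPx / maxTravelPx).coerceIn(0f, 1f)
        return 1.0f + 0.25f * t
    }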
The instructions may additionally or alternatively be executable to:
prior to detecting that the touch input releases the application, displaying a prompt indicating that the application is to be expanded, based at least in part on the touch input moving the application into the predetermined area.
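The predefined areas and the pre-release prompt can be pictured as a hit test over screen rectangles. In the sketch below, the rectangles, the action set, and the prompt strings are all assumptions; the description leaves their geometry and wording open.

    data class AreaPx(val left: Int, val top: Int, val right: Int, val bottom: Int) {
        operator fun contains(p: Pair<Int, Int>): Boolean =
            p.first in left until right && p.second in top until bottom
    }

    enum class DropAction { MOVE_TO_OTHER_DISPLAY, EXPAND_ACROSS_BOTH, CLOSE, NONE }

    // Classifies the current drag position into one of the predefined areas.
    fun classifyDrop(point: Pair<Int, Int>, areas: Map<DropAction, AreaPx>): DropAction =
        areas.entries.firstOrNull { point in it.value }?.key ?: DropAction.NONE

    // While the application is still held, a prompt can preview the pending action.
    fun promptFor(action: DropAction): String? = when (action) {
        DropAction.EXPAND_ACROSS_BOTH -> "Release to expand across both displays"
        DropAction.MOVE_TO_OTHER_DISPLAY -> "Release to move to the other display"
        DropAction.CLOSE -> "Release to close"
        DropAction.NONE -> null
    }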
Another example provides a computing device comprising:
a first portion including a first display and a first touch sensor;
a second portion comprising a second display and a second touch sensor, the second portion connected to the first portion via a hinge, the hinge defining a seam between the first display and the second display;
a logic device; and
a storage device holding instructions executable by the logic device to:
receiving, at the first display, a touch input moving an application from the first display to the second display;
moving the application to the second display when the touch input releases the application within a first predefined area; and
when the touch input releases the application within a second predefined area, expand the application by displaying the application across the first display and the second display.
The application may be a first application, and the instructions may additionally or alternatively be executable to:
displaying a second application on the second display; and
swapping the first application on the first display with the second application on the second display by displaying the first application on the second display and the second application on the first display when the touch input releases the first application in a third predefined area.
The application may be a first application, and the instructions may additionally or alternatively be executable to:
displaying a second application on the second display;
receiving, at the first display, a touch input to open a third application from the first application while the first application is displayed on the first display; and
displaying the third application as superimposed over the first application.
The instructions are additionally or alternatively executable to receive a touch input to close the third application and display the first application on the first display.
Another example provides a method implemented on a computing device that includes a first portion including a first display and a first touch sensor and a second portion including a second display and a second touch sensor, the second portion connected to the first portion via a hinge, the hinge defining a seam between the first display and the second display, the method comprising:
displaying a first application on the first display;
displaying a second application on the second display;
receiving, at the first display, a touch input moving the first application to the second display;
detecting that the touch input releases the first application within a predetermined area; and
based at least on the touch input releasing the first application within the predetermined area, overlaying the first application over the second application on the second display.
The method may additionally or alternatively comprise:
receiving a touch input moving the first application to the first display; and
in response, displaying the first application on the first display and the second application on the second display.
The method may additionally or alternatively include closing the second application after the first application has been overlaid over the second application for a threshold amount of time.
It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Also, the order of the processes described above may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims (15)

1. A computing device, comprising:
a first portion including a first display and a first touch sensor;
a second portion comprising a second display and a second touch sensor, the second portion connected to the first portion via a hinge, the hinge defining a seam between the first display and the second display;
a logic device; and
a storage device holding instructions executable by the logic device to:
receiving, at the first display, a touch input moving an application currently displayed on the first display but not the second display to the second display;
detecting that the touch input releases the application within a predetermined area;
expanding the application across the first display and the second display such that a portion of application content is hidden behind the seam;
receiving a touch input moving the expanded application; and
moving the expanded application in a direction in which the touch input moves the expanded application to reveal at least a portion of the application content that is hidden behind the seam.
2. The computing device of claim 1, wherein the instructions are executable to expand the application across the first display and the second display by applying a mask to a rendering of the displayed application in a location corresponding to the seam.
3. The computing device of claim 1, wherein the instructions are further executable to receive a touch input moving the expanded application to one of the first display and the second display, and display the expanded application on the one of the first display and the second display.
4. The computing device of claim 1, wherein the instructions are executable to:
displaying one or more pinned applications in a pinned application bar on each of the first display and the second display;
receiving a touch input to open an application on one of the first display and the second display, and in response, displaying the opened application on the one of the first display and the second display; and
shifting a pinned application on the one of the first display and the second display to the other of the first display and the second display.
5. The computing device of claim 1, wherein the predefined area is a first predefined area, and wherein the instructions are further executable to detect that a touch input moves the application to the second display and releases the application within a second predefined area, and move the application to the second display.
6. The computing device of claim 5, wherein the application is a first application, and wherein the instructions are further executable to:
displaying a second application on the second display;
upon detecting that the touch input releases the first application within the second predefined area, overlaying the first application over the second application on the second display.
7. The computing device of claim 6, wherein the instructions are further executable to close the second application after the first application has been overlaid over the second application for a threshold amount of time.
8. The computing device of claim 1, wherein the application is a first application, and wherein the instructions are further executable to:
displaying a second application on the second display;
detecting that a touch input releases the application within a second predefined area; and
swapping the first application on the first display with the second application on the second display.
9. The computing device of claim 1, wherein the application is a first application, and wherein the instructions are further executable to:
displaying a second application on the second display, wherein expanding the first application across the first display and the second display comprises overlaying a portion of the first application over the second application on the second display;
receiving a touch input that shrinks the first application; and
displaying the first application on the first display and the second application on the second display.
10. The computing device of claim 1, wherein the predefined area is a first predefined area, and wherein the instructions are further executable to detect that a touch input releases the application within a second predefined area and close the application.
11. The computing device of claim 1, wherein the instructions are executable to detect a touch input comprising a swipe gesture and move the application to the second display.
12. The computing device of claim 1, wherein the instructions are executable to scale a size of the application based at least in part on a direction of movement of the touch input.
13. The computing device of claim 1, wherein the instructions are executable to display a prompt prior to detecting that the touch input releases the application, the prompt indicating that the application is to be expanded based at least in part on the touch input moving the application into the predetermined area.
14. A method implemented on a computing device comprising a first portion including a first display and a first touch sensor and a second portion including a second display and a second touch sensor, the second portion connected to the first portion via a hinge, the hinge defining a seam between the first display and the second display, the method comprising:
displaying a first application on the first display;
displaying a second application on the second display;
receiving, at the first display, a touch input moving the first application to the second display;
detecting that the touch input releases the first application within a predetermined area; and
based at least on the touch input releasing the first application within the predetermined area, overlaying the first application over the second application on the second display.
15. The method of claim 14, further comprising:
receiving a touch input moving the first application to the first display; and
in response, displaying the first application on the first display and the second application on the second display.
CN202080069791.7A 2019-10-01 2020-09-02 Mobile applications on multi-screen computing devices Pending CN114585996A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201962909191P 2019-10-01 2019-10-01
US62/909,191 2019-10-01
US16/717,988 US11416130B2 (en) 2019-10-01 2019-12-17 Moving applications on multi-screen computing device
US16/717,988 2019-12-17
PCT/US2020/048965 WO2021066988A1 (en) 2019-10-01 2020-09-02 Moving applications on multi-screen computing device

Publications (1)

Publication Number Publication Date
CN114585996A true CN114585996A (en) 2022-06-03

Family

ID=75161828

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080069791.7A Pending CN114585996A (en) 2019-10-01 2020-09-02 Mobile applications on multi-screen computing devices

Country Status (4)

Country Link
US (2) US11416130B2 (en)
EP (1) EP4038472A1 (en)
CN (1) CN114585996A (en)
WO (1) WO2021066988A1 (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3451123B8 (en) * 2010-09-24 2020-06-17 BlackBerry Limited Method for conserving power on a portable electronic device and a portable electronic device configured for the same
US11561587B2 (en) 2019-10-01 2023-01-24 Microsoft Technology Licensing, Llc Camera and flashlight operation in hinged device
US11201962B2 (en) 2019-10-01 2021-12-14 Microsoft Technology Licensing, Llc Calling on a multi-display device
US11893177B2 (en) 2020-11-08 2024-02-06 Lepton Computing Llc Flexible display device haptics
US20220212096A1 (en) * 2020-11-30 2022-07-07 Lepton Computing Llc Gaming Motion Control Interface Using Foldable Device Mechanics
US11817065B2 (en) * 2021-05-19 2023-11-14 Apple Inc. Methods for color or luminance compensation based on view location in foldable displays
USD1003932S1 (en) * 2021-07-23 2023-11-07 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD1001155S1 (en) 2021-07-23 2023-10-10 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD1000473S1 (en) * 2021-07-23 2023-10-03 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD1012947S1 (en) * 2021-07-23 2024-01-30 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
US11635817B1 (en) 2021-10-07 2023-04-25 Microsoft Technology Licensing, Llc Stylus haptic component
CN116088734A (en) * 2021-11-05 2023-05-09 北京小米移动软件有限公司 Window adjustment method, device, electronic equipment and computer readable storage medium
US11907516B2 (en) * 2022-04-16 2024-02-20 Samsung Electronics Co., Ltd. Electronic device, method, and non-transitory computer readable storage medium for identifying set of information according to change in size of display area of flexible display
US11947860B2 (en) * 2022-05-20 2024-04-02 Microsoft Technology Licensing, Llc Mapping incompatible windowing topographies across operating systems

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3516328B2 (en) 1997-08-22 2004-04-05 株式会社日立製作所 Information communication terminal equipment
US6326946B1 (en) * 1998-09-17 2001-12-04 Xerox Corporation Operator icons for information collages
US7536650B1 (en) * 2003-02-25 2009-05-19 Robertson George G System and method that facilitates computer desktop use via scaling of displayed objects with shifts to the periphery
US8504936B2 (en) 2010-10-01 2013-08-06 Z124 Changing stack when swapping
US9207717B2 (en) 2010-10-01 2015-12-08 Z124 Dragging an application to a screen using the application manager
KR101217554B1 (en) 2006-05-09 2013-01-02 삼성전자주식회사 seamless foldable display device
US8890802B2 (en) 2008-06-10 2014-11-18 Intel Corporation Device with display position input
US8289287B2 (en) * 2008-12-30 2012-10-16 Nokia Corporation Method, apparatus and computer program product for providing a personalizable user interface
US8194001B2 (en) 2009-03-27 2012-06-05 Microsoft Corporation Mobile computer device display postures
US20100321275A1 (en) 2009-06-18 2010-12-23 Microsoft Corporation Multiple display computing device with position-based operating modes
US8548523B2 (en) 2009-07-01 2013-10-01 At&T Intellectual Property I, L.P. Methods, apparatus, and computer program products for changing ring method based on type of connected device
US20110143769A1 (en) 2009-12-16 2011-06-16 Microsoft Corporation Dual display mobile communication device
US9046992B2 (en) * 2010-10-01 2015-06-02 Z124 Gesture controls for multi-screen user interface
US9495012B2 (en) 2011-09-27 2016-11-15 Z124 Secondary single screen mode activation through user interface activation
EP2966854B1 (en) 2013-03-06 2020-08-26 Nec Corporation Imaging device, imaging method and program
US9524030B2 (en) 2013-04-26 2016-12-20 Immersion Corporation Haptic feedback for interactions with foldable-bendable displays
US20140351722A1 (en) * 2013-05-23 2014-11-27 Microsoft User interface elements for multiple displays
KR20150026403A (en) 2013-09-03 2015-03-11 삼성전자주식회사 Dual-monitoring system and method
US20150100914A1 (en) * 2013-10-04 2015-04-09 Samsung Electronics Co., Ltd. Gestures for multiple window operation
US9478124B2 (en) 2013-10-21 2016-10-25 I-Interactive Llc Remote control with enhanced touch surface input
US10021247B2 (en) 2013-11-14 2018-07-10 Wells Fargo Bank, N.A. Call center interface
KR102561200B1 (en) 2014-02-10 2023-07-28 삼성전자주식회사 User terminal device and method for displaying thereof
CN106250137A (en) * 2014-03-31 2016-12-21 青岛海信移动通信技术股份有限公司 Method and apparatus for processing events on the Android platform
CN104239094B (en) * 2014-08-29 2017-12-08 小米科技有限责任公司 Control method, device, and terminal device for background applications
US10291873B2 (en) 2015-11-20 2019-05-14 Hattar Tanin, LLC Dual-screen electronic devices
KR102480462B1 (en) * 2016-02-05 2022-12-23 삼성전자주식회사 Electronic device comprising multiple displays and method for controlling thereof
KR102558164B1 (en) 2016-08-30 2023-07-21 삼성전자 주식회사 Method for providing notification service related to the call back and an electronic device
US10346117B2 (en) * 2016-11-09 2019-07-09 Microsoft Technology Licensing, Llc Device having a screen region on a hinge coupled between other screen regions
CN106681641A (en) 2016-12-23 2017-05-17 珠海市魅族科技有限公司 Split screen display method and device
CN107040719A (en) 2017-03-21 2017-08-11 宇龙计算机通信科技(深圳)有限公司 Filming control method and imaging control device based on double screen terminal
US10567630B2 (en) 2017-05-12 2020-02-18 Microsoft Technology Licensing, Llc Image capture using a hinged device with multiple cameras
US10204592B1 (en) * 2017-11-30 2019-02-12 Dell Products L.P. Configuring multiple displays of a computing device to have a similar perceived appearance
US11169577B2 (en) 2018-04-04 2021-11-09 Microsoft Technology Licensing, Llc Sensing relative orientation of computing device portions
DK180316B1 (en) * 2018-06-03 2020-11-06 Apple Inc Devices and methods for interacting with an application switching user interface
KR102638783B1 (en) 2018-10-17 2024-02-22 삼성전자주식회사 Electronic device for controlling application according to folding angle and method thereof
US11201962B2 (en) 2019-10-01 2021-12-14 Microsoft Technology Licensing, Llc Calling on a multi-display device
US11561587B2 (en) 2019-10-01 2023-01-24 Microsoft Technology Licensing, Llc Camera and flashlight operation in hinged device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060107226A1 (en) * 2004-11-16 2006-05-18 Microsoft Corporation Sidebar autohide to desktop
WO2010028406A1 (en) * 2008-09-08 2010-03-11 Qualcomm Incorporated Method for indicating location and direction of a graphical user interface element
US20120084720A1 (en) * 2010-10-01 2012-04-05 Imerj LLC Managing expose views in dual display communication devices
US20120084710A1 (en) * 2010-10-01 2012-04-05 Imerj, Llc Repositioning windows in the pop-up window
EP2674834A2 (en) * 2011-02-10 2013-12-18 Samsung Electronics Co., Ltd. Portable device comprising a touch-screen display, and method for controlling same
CN102937876A (en) * 2011-11-23 2013-02-20 微软公司 Dynamic scaling of a touch sensor
US20150370322A1 (en) * 2014-06-18 2015-12-24 Advanced Micro Devices, Inc. Method and apparatus for bezel mitigation with head tracking
US20160349974A1 (en) * 2015-06-01 2016-12-01 Apple Inc. Linking Multiple Windows in a User Interface Display

Also Published As

Publication number Publication date
US20220391078A1 (en) 2022-12-08
EP4038472A1 (en) 2022-08-10
US11416130B2 (en) 2022-08-16
WO2021066988A1 (en) 2021-04-08
US20210096732A1 (en) 2021-04-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination