WO2024207090A1 - Foldable device hinge interaction

Foldable device hinge interaction

Info

Publication number
WO2024207090A1
Authority
WO
WIPO (PCT)
Prior art keywords
foldable
fold region
touchscreen
overlays
shortcut
Application number
PCT/CA2023/050465
Other languages
French (fr)
Inventor
Graeme Bradford ZINCK
Daniel John Vogel
Che YAN
Roya Cody
Wei Li
Original Assignee
Huawei Technologies Canada Co., Ltd.
Application filed by Huawei Technologies Canada Co., Ltd.
Priority to PCT/CA2023/050465
Publication of WO2024207090A1


Classifications

    • All classifications fall under G (PHYSICS) > G06 (COMPUTING; CALCULATING OR COUNTING) > G06F (ELECTRIC DIGITAL DATA PROCESSING):
    • G06F 1/1652 Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being flexible, e.g. mimicking a sheet of paper, or rollable
    • G06F 1/1616 Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function, with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • G06F 1/1677 Miscellaneous details related to the relative movement between the different enclosures or enclosure parts, for detecting open or closed state or particular intermediate positions assumed by movable parts of the enclosure, e.g. detection of display lid position with respect to main body in a laptop, detection of opening of the cover of battery compartment
    • G06F 3/0483 Interaction with page-structured environments, e.g. book metaphor
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0486 Drag-and-drop
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas (indexing scheme relating to G06F 3/048)
    • G06F 2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen (indexing scheme relating to G06F 3/048)

Definitions

  • This disclosure relates generally to foldable touchscreen devices, and more specifically to systems and methods for providing functionality at the fold region of foldable, capacitive touchscreen devices.
  • Foldable touchscreen devices can be complex to use for multitasking, requiring the use of multiple controls and gestures to switch between applications (apps) and windows. Many foldable touchscreen devices struggle with windows management, making it difficult to switch between screens or to make windows full-screen.
  • a foldable touchscreen device which may be a laptop, touchpad, e-reader, or a foldable smartphone, for example, may include a flexible display.
  • a flexible display is a display that may be rolled without the displayed image or text being distorted.
  • a foldable touchscreen device may include a flexible display that spans the fold region, or “hinge”, as well as the touchscreen partitions on adjacent sides of the fold region. As a result, the fold region may receive touch input, and may allow the touchscreen device to be adaptable to different usage scenarios. While it is often overlooked by manufacturers, who may minimize the hinge radius to prioritize other features such as the size of the touchscreen partitions and/or the overall design of the device, the hinge is an important element that can enhance user experience.
  • the hinge can be useful for multitasking, as the adjacent touchscreen partitions (on either side of the hinge) can be used to display different applications or tasks side-by-side.
  • the hinge can also be used as a bridge between the two touchscreen partitions, allowing for seamless transitions between different applications or tasks.
  • the hinge has unique characteristics that can be leveraged for accessibility purposes. For example, because the hinge is a physical feature that can be easily navigated by touch, it can be used to provide users with visual impairments with a more intuitive way to interact with a foldable touchscreen device.
  • Embodiments of the present disclosure describe intuitive and novel interaction techniques for foldable touchscreen devices.
  • these techniques provide a dynamic portal of shortcuts that can adapt to user input and to system/application events, enabling the execution of frequently used tasks and commands without the need for navigating multiple menus.
  • Embodiments of the present disclosure may be used for various purposes, including windows management, shortcuts, and multitasking. In some examples, these functions may be facilitated more efficiently and effectively than current approaches.
  • a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions.
  • the touchscreen element is foldable at the fold region.
  • the foldable touchscreen device further comprises a processor, and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: display a first object in the first partition; detect, at a location of the first partition corresponding to the first object, an initialization of a drag gesture; detect a path of the drag gesture passing near or through a first one of one or more shortcut overlays displayed in the fold region; and trigger a first action corresponding to the first one of the one or more shortcut overlays.
  • the foldable touchscreen device is further caused to, prior to detecting a path of the drag gesture passing near or through a first one of one or more shortcut overlays: detect, at a location of the first partition within a proximity of the fold region, the drag gesture; and display, at the fold region, the one or more shortcut overlays.
  • the one or more shortcut overlays is a windows management overlay.
  • the foldable touchscreen device is further caused to: detect a cessation of the drag gesture; and in response to detecting the cessation of the drag gesture, remove the one or more shortcut overlays from display in the fold region.
  • the one or more shortcut overlays is an app multiplier overlay.
  • the foldable touchscreen device is further caused to: detect the drag gesture crossing the fold region at a second one of the one or more shortcut overlays; and trigger a second action corresponding to the second one of the one or more shortcut overlays.
  • a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions.
  • the touchscreen element is foldable at the fold region.
  • the foldable touchscreen element further comprises a processor and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to, during execution of an e-reader application: detect a drag gesture in a first direction along the fold region, the first direction being substantially parallel to a longitudinal axis of the fold region; and insert a virtual bookmark at a currently displayed location of an electronic document.
  • the drag gesture begins at or near a first edge of the fold region.
  • the foldable touchscreen device is further caused to: detect a drag gesture in a second direction along the fold region substantially opposite to the first direction; and remove the virtual bookmark.
  • a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions.
  • the touchscreen element is foldable at the fold region.
  • the foldable touchscreen element further comprises a processor and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: display content using full screen mode; detect a drag gesture in a first direction along the fold region, the first direction being substantially parallel to a longitudinal axis of the fold region; and display the content using dual screen mode.
  • a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions.
  • the touchscreen element is foldable at the fold region.
  • the foldable touchscreen element further comprises a processor and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: detect a first touch gesture at the fold region; and display a defined region containing application shortcuts.
  • a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions.
  • the touchscreen element is foldable at the fold region.
  • the foldable touchscreen element further comprises a processor and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to, during execution of a particular application: display, at the fold region, one or more application control overlays defined by the particular application; detect a touch gesture at a location of the fold region corresponding to one of the one or more application control overlays; and trigger a first action corresponding to the one of the one or more application control overlays.
  • a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions.
  • the touchscreen element is foldable at the fold region.
  • the foldable touchscreen element further comprises a processor and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: detect a grip posture of the foldable touchscreen device; in response to detecting the grip posture, display, at the fold region, a thumb input overlay; detect a touch gesture at a location of the fold region corresponding to the thumb input overlay; and trigger a first action corresponding to the touch gesture and the location corresponding to the thumb input overlay.
  • the foldable touchscreen device is executing a video game application and the first action is a video game control action.
  • the foldable touchscreen device is executing a camera application and the first action is a camera control action.
  • a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions.
  • the touchscreen element is foldable at the fold region.
  • the foldable touchscreen element further comprises a processor and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: display one or more user modifiable objects; receive a selection of one of the one or more user modifiable objects; display, at the fold region, one or more shortcut overlays corresponding to an action for modifying the one of the one or more user modifiable objects; and in response to selection of one of the one or more shortcut overlays, modify the one of the one or more user modifiable objects according to the action corresponding to the selected one of the one or more shortcut overlays.
  • a computer-implemented method comprising: displaying a first object in a first partition of a flexible touchscreen element; detecting, at a location of the first partition corresponding to the first object, an initialization of a drag gesture; detecting a path of the drag gesture passing near or through a first one of one or more shortcut overlays displayed in a fold region of the flexible touchscreen element; and triggering a first action corresponding to the first one of the one or more shortcut overlays.
  • the method further comprises: prior to detecting a path of the drag gesture passing near or through a first one of one or more shortcut overlays, detecting, at a location of the first partition within a proximity of the fold region, the drag gesture; and displaying, at the fold region, the one or more shortcut overlays.
  • the one or more shortcut overlays is a windows management overlay.
  • the method further comprises: detecting a cessation of the drag gesture; and in response to detecting the cessation of the drag gesture, removing the one or more shortcut overlays from display in the fold region.
  • the one or more shortcut overlays is an app multiplier overlay.
  • the method further comprises: detecting the drag gesture crossing the fold region at a second one of the one or more shortcut overlays; and triggering a second action corresponding to the second one of the one or more shortcut overlays.
  • a computer-implemented method comprising: detecting a drag gesture in a first direction along a fold region of a flexible touchscreen element, the first direction being substantially parallel to a longitudinal axis of the fold region; and inserting a virtual bookmark at a currently displayed location of an electronic document.
  • the drag gesture begins at or near a first edge of the fold region.
  • the method further comprises: detecting a drag gesture in a second direction along the fold region substantially opposite to the first direction; and removing the virtual bookmark.
  • a computer-implemented method comprising: displaying content on a flexible touchscreen element using full screen mode; detecting a drag gesture in a first direction along a fold region of the flexible touchscreen element, the first direction being substantially parallel to a longitudinal axis of the fold region; and displaying the content using dual screen mode.
  • a computer-implemented method comprising: detecting a first touch gesture at a fold region of a flexible touchscreen element; and displaying a defined region containing application shortcuts.
  • a computer-implemented method comprising: displaying, at a fold region of a flexible touchscreen element, one or more application control overlays defined by an application being executed; detecting a touch gesture at a location of the fold region corresponding to one of the one or more application control overlays; and triggering a first action corresponding to the one of the one or more application control overlays.
  • a computer-implemented method comprising: detecting a grip posture of a foldable touchscreen device, the foldable touchscreen device having a flexible touchscreen element; in response to detecting the grip posture, displaying, at a fold region of the flexible touchscreen element, a thumb input overlay; detecting a touch gesture at a location of the fold region corresponding to the thumb input overlay; and triggering a first action corresponding to the touch gesture and the location corresponding to the thumb input overlay.
  • the foldable touchscreen device is executing a video game application and the first action is a video game control action.
  • the foldable touchscreen device is executing a camera application and the first action is a camera control action.
  • a computer-implemented method comprising: displaying one or more user modifiable objects on a flexible touchscreen element; receiving a selection of one of the one or more user modifiable objects; displaying, at a fold region of the flexible touchscreen element, one or more shortcut overlays corresponding to an action for modifying the one of the one or more user modifiable objects; and in response to selection of one of the one or more shortcut overlays, modifying the one of the one or more user modifiable objects according to the action corresponding to the selected one of the one or more shortcut overlays.
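  • Several of the aspects summarized above turn on classifying a drag gesture whose path stays within the fold region and runs substantially parallel to its longitudinal axis (e.g., to insert or remove a virtual bookmark, or to switch between full screen and dual screen mode). The following is a minimal, hypothetical sketch of one way such a gesture could be classified; the names (Touch, FoldGesture, classifyFoldGesture) and thresholds are assumptions introduced for illustration, not the claimed implementation.

```kotlin
import kotlin.math.abs

// Hypothetical sketch: classify a drag that stays inside the fold region and travels
// substantially parallel to the fold's longitudinal axis. Names and thresholds are assumed.
data class Touch(val alongAxis: Float, val acrossAxis: Float)

enum class FoldGesture { FORWARD_SWIPE, BACKWARD_SWIPE, NONE }

fun classifyFoldGesture(
    path: List<Touch>,
    foldHalfWidth: Float,   // half the width of the fold region, measured across its longitudinal axis
    minTravel: Float = 200f // minimum travel along the axis to count as a swipe
): FoldGesture {
    // The entire path must remain inside the fold region.
    if (path.any { abs(it.acrossAxis) > foldHalfWidth }) return FoldGesture.NONE
    val travel = path.last().alongAxis - path.first().alongAxis
    return when {
        travel >= minTravel -> FoldGesture.FORWARD_SWIPE   // e.g., insert bookmark / switch to dual screen
        travel <= -minTravel -> FoldGesture.BACKWARD_SWIPE // e.g., remove bookmark
        else -> FoldGesture.NONE
    }
}

fun main() {
    val swipe = listOf(Touch(0f, 10f), Touch(150f, -5f), Touch(320f, 8f))
    println(classifyFoldGesture(swipe, foldHalfWidth = 60f)) // FORWARD_SWIPE
}
```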
  • FIG. 1 illustrates a front view of a first example foldable touchscreen device, in accordance with examples of the present disclosure
  • FIG. 2 is a high-level operation diagram of an example computing system, in accordance with examples of the present disclosure
  • FIG. 3 depicts a simplified organization of software components that may be stored in memory of the example computing system of FIG. 2, in accordance with examples of the present disclosure
  • FIG. 4 is a flowchart of an example method for triggering a first action, in accordance with examples of the present disclosure
  • FIG. 5 illustrates three instances in connection with the example method for triggering a first action, in accordance with examples of the present disclosure
  • FIG. 6 is a flowchart of an example method for triggering a second action, in accordance with examples of the present disclosure.
  • FIG. 7 illustrates a first four instances in connection with the example method for triggering a second action, in accordance with examples of the present disclosure;
  • FIG. 8 illustrates a second four instances in connection with the example method for triggering a first action, in accordance with examples of the present disclosure
  • FIGs. 9A-9E illustrate a first five instances in connection with the example method for triggering a first action, in accordance with examples of the present disclosure
  • FIGs. 10A-10E illustrate a second five instances in connection with the example method for triggering a first action, in accordance with examples of the present disclosure
  • FIG. 11 is a flowchart of an example method for inserting and removing a virtual bookmark at a currently displayed location of an electronic document, in accordance with examples of the present disclosure
  • FIG. 12 illustrates a foldable touchscreen device providing a first display of an electronic document, in accordance with examples of the present disclosure
  • FIG. 13 illustrates a foldable touchscreen device providing a second display of an electronic document, in accordance with examples of the present disclosure
  • FIG. 14 is a flowchart of an example method of displaying content using dual screen mode, in accordance with examples of the present disclosure
  • FIG. 15 illustrates a foldable touchscreen device displaying content using fullscreen mode, in accordance with examples of the present disclosure
  • FIG. 16 illustrates a foldable touchscreen device displaying content using dual screen mode, according to examples of the present disclosure
  • FIG. 17 is a flowchart of an example method for displaying a defined region containing application shortcuts, in accordance with examples of the present disclosure.
  • FIG. 18 is a flowchart of an example method for triggering a first action corresponding to one or more shortcut overlays, in accordance with examples of the present disclosure;
  • FIG. 19 illustrates a foldable touchscreen device providing a first display of a video application, in accordance with examples of the present disclosure
  • FIG. 20 illustrates a foldable touchscreen device providing a second display of a video application, in accordance with examples of the present disclosure
  • FIG. 21 is a flowchart of an example method for triggering a first action corresponding to a touch gesture and a location corresponding to a thumb input overlay, in accordance with examples of the present disclosure
  • FIG. 22 illustrates a foldable touchscreen device displaying a video game application, in accordance with examples of the present disclosure
  • FIG. 23 illustrates a foldable touchscreen device displaying a camera application, in accordance with examples of the present disclosure
  • FIG. 24 is a flowchart of an example method of modifying one of one or more user modifiable objects, in accordance with examples of the present disclosure
  • FIG. 25 illustrates a foldable touchscreen device providing a first display of a word processing application, in accordance with examples of the present disclosure
  • FIG. 26 illustrates a foldable touchscreen device providing a second display of a word processing application, in accordance with examples of the present disclosure
  • FIG. 27 illustrates a foldable touchscreen device providing a first display of a design application, in accordance with examples of the present disclosure.
  • FIG. 28 illustrates a foldable touchscreen device providing a second display of a design application, in accordance with examples of the present disclosure.
  • Similar reference numerals may have been used in different figures to denote similar components.
  • Embodiments described herein may operate on a variety of foldable touchscreen devices, such as dual screen laptops, foldable laptops, standard laptops, tablets, smart phones, and the like.
  • computing system refers to an electronic device having computing capabilities.
  • Examples of computing systems include but are not limited to: personal computers, laptop computers, tablet computers (“tablets”), smartphones, surface computers, augmented reality gear, and the like.
  • touchscreen element and “touchscreen” refer to a combination of a display together with a touch sensing system that is capable of acting as an input device by receiving a touch input.
  • touchscreen displays are: surface capacitive touchscreens and projected capacitive touchscreens.
  • touchscreen device refers to a computer system having a touchscreen element.
  • the term “application” refers to a software program comprising a set of instructions that can be executed by a processing device of an electronic device.
  • FIG. 1 illustrates an example foldable touchscreen device 100, which is an example operating environment of an example embodiment.
  • the example foldable touchscreen device 100 includes a flexible touchscreen element 140.
  • the flexible touchscreen element 140 may be a foldable touchscreen element that is foldable at a fold region 135.
  • the touchscreen element 140 may be operable to render content and to sense touch thereupon.
  • the touchscreen element 140 may also be described as a touchscreen 140.
  • the touchscreen element 140 may implement one or more touchscreen technologies.
  • the touchscreen element 140 may be a capacitive touchscreen (e.g., surface capacitive, projected capacitive, mutual capacitive, self-capacitive), a resistive touchscreen, a surface acoustic wave (SAW) touchscreen, etc.
  • the example foldable touchscreen device 100 includes a body 105, which may be formed of plastic, glass, fiber, stainless steel, aluminum, other suitable materials, or a combination of any of these materials.
  • the body 105 encloses multiple components of the example foldable touchscreen device 100 including a processor and a memory.
  • the body 105 is configured to house the flexible touchscreen element 140.
  • the body may be described as comprising a first portion 110 and a second portion 115 connected through a fold edge 120.
  • the flexible touchscreen element 140 provides for the folding of the example foldable touchscreen device 100 into various physical configurations including a flat configuration.
  • the flexible touchscreen element 140 can fold about the fold edge 120 so that the first portion 110 and the second portion 115 of the example foldable touchscreen device 100 move towards and/or away from each other.
  • the fold edge 120 may be manufactured from a bendable, flexible material such as flexible polymer.
  • the flexible touchscreen element 140 of the example foldable touchscreen device 100 has three touchscreen partitions: a first partition 125, a second partition 130, and a fold region 135 separating the first and second partitions 125, 130.
  • the fold region 135 may be also known as the hinge.
  • the fold region 135 may be defined by a first border 150 along an edge of the first partition 125 and a second border 155 along an edge of the second partition 130.
  • the first and second borders 150, 155 may not be physical borders, but rather may be logically defined and may be dynamically shifted (or omitted). It should be appreciated that the terms “first” and “second” are not intended to be limiting.
  • a touchscreen partition as “first” or “second” may be arbitrary; a touchscreen partition that is designated as “first” in one instance may be designated as “second” in another instance, and vice versa.
  • the designation of “first” and “second” borders may be arbitrary.
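  • Because the first and second borders 150, 155 may be logically defined rather than physical, one way to reason about touch input at the hinge is to hit-test coordinates against a fold-region band whose borders can be shifted at runtime. The sketch below is illustrative only; FoldRegion, contains and shiftBorders are assumed names, not elements of the disclosure.

```kotlin
// Illustrative sketch of a logically defined fold region with dynamically shiftable borders.
data class TouchPoint(val x: Float, val y: Float)

class FoldRegion(var firstBorderY: Float, var secondBorderY: Float) {
    // Returns true if a touch falls between the two logical borders of the fold region.
    fun contains(p: TouchPoint): Boolean = p.y in firstBorderY..secondBorderY

    // Borders are logical, not physical, so they can be widened or narrowed at runtime,
    // e.g. to make the fold region easier to target during a drag gesture.
    fun shiftBorders(deltaFirst: Float, deltaSecond: Float) {
        firstBorderY += deltaFirst
        secondBorderY += deltaSecond
    }
}

fun main() {
    val fold = FoldRegion(firstBorderY = 980f, secondBorderY = 1100f)
    println(fold.contains(TouchPoint(400f, 1020f))) // true: inside the fold region
    fold.shiftBorders(-40f, +40f)                   // widen the region dynamically
    println(fold.contains(TouchPoint(400f, 960f)))  // now true as well
}
```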
  • FIG. 1 illustrates a laptop computer
  • the example foldable touchscreen device 100 may be a smartphone, a tablet, and/or other similar electronic device.
  • the example foldable touchscreen device 100 may be a type of computing system within the scope of the present disclosure.
  • FIG. 2 is a high-level operation diagram of an example computing system 200, in accordance with examples of the present disclosure.
  • the example computing system 200 may be exemplary of the example touchscreen device 100 (FIG. 1) and is not intended to be limiting.
  • the example computing system 200 includes a variety of components.
  • the example computing system 200 may include a processor 302, an input/output (I/O) interface 304, a network interface 306, a storage unit 378 and a memory 380.
  • the foregoing example components of the computing system 200 are in communication over a bus 308.
  • the bus 308 is shown providing communication among the components of the computing system 200.
  • the bus 308 may be any suitable bus architecture including, for example, a memory bus, a peripheral bus or a video bus.
  • the processor 302 may include one or more processors, such as a microprocessor, a microcontroller, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), dedicated logic circuitry, or combinations thereof.
  • the network interface 306 may include one or more network interfaces for wired or wireless communication with a network (e.g., an intranet, the Internet, a peer-to-peer (P2P) network, a wide area network (WAN), and/or a local area network (LAN)) or other node.
  • the network interface 306 may include wired links (e.g., Ethernet cable) and/or wireless links (e.g., one or more antennas) for intra-network and/or inter-network communications.
  • the storage unit 378 may be one or more storage units, and may include a mass storage unit such as a solid state drive, a hard disk drive, a magnetic disk drive and/or an optical disk drive.
  • the I/O interface 304 may be one or more I/O interfaces, and may enable interfacing with one or more appropriate input devices, such as the touch panel 344, and/or one or more appropriate output devices, such as the touchscreen display 342.
  • the touch panel 344 and the touchscreen display 342 form part of the touchscreen element 140.
  • the touch panel 344 may include a variety of touch sensors for sensing touch input, which may depend on the touch sensing modality used by the touchscreen element 140 (e.g., capacitive sensors).
  • the computing system 200 may include one or more memories 380, which may include volatile memory (e.g., random access memory (RAM)) and non-volatile or non-transitory memory (e.g., a flash memory, magnetic storage, and/or a read-only memory (ROM)).
  • the non-transitory memory(ies) of memories 380 store programs that include software instructions for execution by the processor 302, such as to carry out examples described in the present disclosure.
  • the programs include software instructions for implementing an operating system (OS) and software applications.
  • the memory 380 may include software instructions of the computing system 200 for execution by the processor 302 to carry out the operations described in this disclosure.
  • one or more data sets and/or modules may be provided by an external memory (e.g., an external drive in wired or wireless communication with the computing system 200) or may be provided by a transitory or non-transitory computer-readable medium.
  • Examples of non-transitory computer readable media include a RAM, a ROM, an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a flash memory, a CD-ROM, or other portable memory storage.
  • FIG. 3 depicts a simplified organization of software components that may be stored in memory 380 of the example computing system 200 (FIG. 2). As illustrated, these software components include application software 350 and an operating system (OS) 310.
  • the OS 310 is software.
  • the OS 310 allows the application software 350 to access the processor 302, the memory 380, the I/O interface 304, the network interface 306, and the storage unit 378 (FIG. 2).
  • the OS 310 may be, for example, Apple™ iOS™, Android™, Microsoft™ Windows™, Google™ ChromeOS™, or the like.
  • the application software 350 adapts the example computing system 200 (FIG. 2), in combination with the OS 310, to operate as a device performing a particular function.
  • the application software 350 may adapt the example computing system 200 (FIG. 2) to perform fold angle determination.
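  • As one purely illustrative reading of fold angle determination, a raw hinge angle could be mapped to a coarse device posture that interaction logic consults, e.g. to decide whether the fold region runs vertically ("book mode") or horizontally. The thresholds and names below (Posture, determinePosture) are assumptions for illustration, not the method of the disclosure.

```kotlin
// Hypothetical sketch: map a hinge-sensor angle to a coarse posture.
enum class Posture { CLOSED, BOOK_MODE, LAPTOP_MODE, FLAT }

fun determinePosture(foldAngleDegrees: Float, foldIsVertical: Boolean): Posture = when {
    foldAngleDegrees < 10f -> Posture.CLOSED      // panels folded together
    foldAngleDegrees > 170f -> Posture.FLAT       // device opened flat
    foldIsVertical -> Posture.BOOK_MODE           // partially folded, vertical fold region
    else -> Posture.LAPTOP_MODE                   // partially folded, horizontal fold region
}

fun main() {
    println(determinePosture(95f, foldIsVertical = true))   // BOOK_MODE
    println(determinePosture(178f, foldIsVertical = false)) // FLAT
}
```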
  • FIG. 4 is a flowchart of a method 400 for triggering a first action, in accordance with examples of the present disclosure.
  • the method 400 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1).
  • the operations 410 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
  • the computing system displays a first object in the first partition of a flexible touchscreen element.
  • the system detects, at a location of the first partition corresponding to the first object, an initialization of a drag gesture.
  • the system detects a path of the drag gesture passing near or through the fold region at a first one of the one or more shortcut overlays.
  • the system triggers a first action corresponding to the first one of the one or more shortcut overlays.
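  • One possible reading of operations 410 through 440 is a hit-test of the drag path against the shortcut overlays displayed in the fold region, dispatching the action of whichever overlay the path passes near or through. The sketch below is a hedged illustration; ShortcutOverlay, FoldRegionBand, overlayAt and the slack margin are assumed names and parameters.

```kotlin
// Illustrative sketch of the method-400 flow: track a drag path and trigger the action of the
// shortcut overlay whose span the path passes near or through in the fold region band.
data class Point(val x: Float, val y: Float)

data class ShortcutOverlay(val label: String, val xStart: Float, val xEnd: Float, val action: () -> Unit)

class FoldRegionBand(val yStart: Float, val yEnd: Float, val overlays: List<ShortcutOverlay>) {
    // Returns the overlay whose horizontal span the point falls into, if the point is inside
    // (or within `slack` of) the fold region band; `slack` models "near" the overlay.
    fun overlayAt(p: Point, slack: Float = 24f): ShortcutOverlay? {
        if (p.y < yStart - slack || p.y > yEnd + slack) return null
        return overlays.firstOrNull { p.x in it.xStart..it.xEnd }
    }
}

fun main() {
    val copy = ShortcutOverlay("copy", 0f, 300f) { println("copy action triggered") }
    val recolor = ShortcutOverlay("recolor", 300f, 600f) { println("recolor action triggered") }
    val fold = FoldRegionBand(yStart = 980f, yEnd = 1100f, overlays = listOf(copy, recolor))

    // Simulated drag path from the first partition toward the second partition.
    val dragPath = listOf(Point(120f, 700f), Point(150f, 900f), Point(180f, 1010f), Point(200f, 1200f))
    dragPath.firstNotNullOfOrNull { fold.overlayAt(it) }?.action?.invoke() // prints "copy action triggered"
}
```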
  • FIG. 5 illustrates three instances 502, 504, 506 of a method for triggering a first action, in accordance with examples of the present disclosure.
  • the three instances 502, 504, 506 may illustrate a sequence of operations of a foldable touchscreen device 100 in an example implementation of the method 400 of FIG. 4.
  • Each of the three instances 502, 504, and 506 shows a foldable touchscreen device 100 having a flexible touchscreen element 140 comprising a first partition 125, a second partition 130, and a fold region 135.
  • Four shortcut overlays 508 are displayed at the fold region 135 of each foldable touchscreen device 100 illustrated at instances 502, 504, and 506. It may be appreciated that the first partition 125 and second partition 130 indicated in FIG. 5 may correspond to the second partition 130 and first partition 125 shown in FIG. 1; the designation of “first” and “second” is not intended to be limiting.
  • a first circle 510 is shown displayed at the first partition 125 of the foldable touchscreen device 100.
  • the first circle 510 is a first object.
  • the first circle 510 has been selected and dragged along the first partition 125 and approaches the fold region 135. Specifically, the first circle 510 is shown approaching the first shortcut overlay of the four shortcut overlays 508 at the fold region 135.
  • the four shortcut overlays 508 may be described as a dynamic hinge shortcut overlay.
  • the first shortcut overlay of the dynamic hinge shortcut overlay displays the word “copy” and represents a copy function shortcut.
  • the fold region 135 may act as a natural bridge region between the first and second partitions 125, 130 of a flexible touchscreen element 140 of a foldable touchscreen device 100.
  • Providing a contextual shortcut overlay, such as a dynamic hinge shortcut overlay, as a portal at the hinge may allow users to transform an object (such as the first circle 510) between the first and second partitions 125, 130.
  • the foldable touchscreen device 100 may detect an initialization of a drag gesture at a location of the first partition 125 corresponding to the first circle 510.
  • the first circle 510 has been dragged through the fold region 135 at the first shortcut overlay and is now displayed at the second partition 130.
  • the first shortcut overlay displays the word “copy” and represents a copy function shortcut.
  • the system detects a path of the drag gesture passing near or through a first one of one or more shortcut overlays. As a result, the system triggers a first action corresponding to the first one of the one or more shortcut overlays (in this example, a “copy” function corresponding to the first shortcut overlay).
  • a second circle 512 now appears at the first partition 125.
  • the second circle 512 is a second object.
  • the second circle 512 is a copy of the first circle 510.
  • a “copy” function may be completed in a single step by dragging an object towards the fold region 135 (hinge) to access a dynamic hinge shortcut overlay.
  • this example illustrates the path of the drag gesture crossing the fold region 135 from the first partition 125 to the second partition 130, this is not intended to be limiting.
  • a path for a drag gesture that starts in the first partition 125, approaches or touches the desired shortcut overlay at the fold region 135, and returns to the first partition 125 (i.e., without crossing into the second partition 130) may be sufficient.
  • the “copy” function may be completed using a Microsoft™ Office application or a whiteboard application, among other possibilities.
  • FIG. 5 may be particularly useful when a user is working with an immersive or dual-screen view that utilizes both first and second partitions 125, 130 and needs to frequently copy and paste content between both first and second partitions 125, 130.
  • Different actions may be performed, corresponding to different ones of the one or more shortcut overlays, depending on which shortcut overlay has been crossed or approached by the path of the drag gesture.
  • Other actions may include various actions that may be performed on a selectable object, for example resize action, delete action, etc. and may depend on the software application being executed.
  • the one or more shortcut overlays may be displayed at the fold region prior to the detection of the initialization of the drag gesture at the operation 420. In some embodiments, the one or more shortcut overlays may be displayed as a result of the system detecting the initialization of the drag gesture, or as a result of the system detecting, at a location of the first partition within a proximity of the fold region, the drag gesture.
  • the system may detect a cessation of the drag gesture. In some embodiments, in response to detecting the cessation of the drag gesture, the system may remove the one or more shortcut overlays from display in the fold region.
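  • The display and removal of the shortcut overlays described above could be driven by two events: the drag coming within a proximity threshold of the fold region, and the cessation of the drag. The following sketch illustrates that lifecycle under assumed names (OverlayController, onDragMove, onDragEnd); it is not the claimed implementation.

```kotlin
// Illustrative overlay lifecycle: show overlays when a drag approaches the fold region,
// remove them when the drag gesture ceases.
class OverlayController(
    private val foldYStart: Float,
    private val foldYEnd: Float,
    private val proximity: Float = 120f
) {
    var overlaysVisible = false
        private set

    fun onDragMove(y: Float) {
        // Show the shortcut overlays once the drag comes within `proximity` of the fold region.
        if (!overlaysVisible && y >= foldYStart - proximity && y <= foldYEnd + proximity) {
            overlaysVisible = true
            println("display shortcut overlays at the fold region")
        }
    }

    fun onDragEnd() {
        // Cessation of the drag gesture removes the overlays from the fold region.
        if (overlaysVisible) {
            overlaysVisible = false
            println("remove shortcut overlays from the fold region")
        }
    }
}

fun main() {
    val controller = OverlayController(foldYStart = 980f, foldYEnd = 1100f)
    listOf(600f, 800f, 900f, 1000f).forEach(controller::onDragMove)
    controller.onDragEnd()
}
```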
  • FIG. 6 is a flowchart of an example method 600 for triggering a second action, in accordance with examples of the present disclosure.
  • the method 600 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1).
  • the operations 410 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
  • the method 600 of FIG. 6 comprises six operations 410, 420, 430, 440, 610, 620.
  • the first four operations 410, 420, 430, 440 of the method 600 of FIG. 6 are the same as the operations 410, 420, 430, 440 of the method 400 of FIG. 4.
  • the computing system displays a first object in the first partition of a flexible touchscreen element.
  • the system detects, at a location of the first partition corresponding to the first object, an initialization of a drag gesture.
  • the system detects a path of the drag gesture passing near or through the fold region at a first one of the one or more shortcut overlays.
  • the system triggers a first action corresponding to the first one of the one or more shortcut overlays.
  • the system detects the drag gesture crossing the fold region at a second one of the one or more shortcut overlays.
  • the system triggers a second action corresponding to the second one of the one or more shortcut overlays.
  • FIG. 7 illustrates a first four instances 502, 504, 506, 702 in connection with a method for triggering a second action, in accordance with examples of the present disclosure.
  • the four instances 502, 504, 506, 702 may illustrate a sequence of operations of a foldable touchscreen device 100 in an example implementation of the method 600 of FIG. 6.
  • the first three instances 502, 504, 506 of FIG. 7 are the same as the three instances 502, 504, 506 of FIG. 5.
  • the first partition 125 and second partition 130 indicated in FIG. 7 may correspond to the second partition 130 and first partition 125 shown in FIG. 1; as previously mentioned, the designation of “first” and “second” may be arbitrary and are not intended to be limiting.
  • a first circle 510 is shown displayed at the first partition 125 of the foldable touchscreen device 100.
  • the first circle 510 is a first object.
  • the first circle 510 has been selected and dragged along the first partition 125 and approaches the fold region 135. Specifically, the first circle 510 is shown approaching the first shortcut overlay of the four shortcut overlays 508 at the fold region 135.
  • the four shortcut overlays 508 may be described as a dynamic hinge shortcut overlay.
  • the first shortcut overlay of the dynamic hinge shortcut overlay displays the word “copy” and represents a copy function shortcut.
  • the fold region 135 may act as a natural bridge region between the first and second partitions 125, 130 of a flexible touchscreen element 140 of a foldable touchscreen device 100.
  • Providing a contextual shortcut overlay, such as a dynamic hinge shortcut overlay, as a portal at the hinge may allow users to transform an object (such as the first circle 510) between the first and second partitions 125, 130.
  • the foldable touchscreen device 100 may detect an initialization of a drag gesture at a location of the first partition 125 corresponding to the first circle 510.
  • the first circle 510 has been dragged through the fold region 135 at the first shortcut overlay and is now displayed at the second partition 130.
  • the first shortcut overlay displays the word “copy” and represents a copy function shortcut.
  • the system detects a path of the drag gesture passing near or through a first one of one or more shortcut overlays. As a result, the system triggers a first action corresponding to the first one of the one or more shortcut overlays (in this example, a “copy” function corresponding to the first shortcut overlay).
  • a second circle 512 now appears at the first partition 125.
  • the second circle 512 is a second object.
  • the second circle 512 is a copy of the first circle 510.
  • the first circle 510 has passed through the fold region 135 from the second partition 130 back to the first partition 125 and is now displayed at the first partition 125.
  • the second circle 512 remains displayed at the first partition 125.
  • the first circle 510 has passed through the fold region 135 at a second one of the four shortcut overlays 508.
  • the second shortcut overlay corresponds to a changing color action.
  • a second action has been triggered, in this example the color of the first circle 510 has been changed.
  • the path of the drag gesture may pass through or near any number (e.g., more than two) of shortcut overlays displayed in the fold region 135.
  • operations 610-620 of the method 600 of FIG. 6 may be repeated multiple times.
  • the various commands may be used in combination by dragging an object, such as the first circle 510 and/or the second circle 512, through or near the fold region 135 multiple times (corresponding to multiple shortcut overlays) within the same drag gesture.
  • a repetitive hinge crossing gesture may execute a combination of commands to trigger actions. Examples of commands which may be executed in this manner may include “save” and “full screen”.
  • a repetitive hinge crossing gesture may benefit a user working in a Microsoft Office application, a design application, a whiteboard application or other such applications on a foldable touchscreen device.
  • a user may want to quickly duplicate content and assign new attributes such as color and position.
  • a user may drag a desired object through or near the fold region and perform a specific gesture to duplicate the object and assign new attributes in a single step, streamlining the process of creating objects (e.g., charts) with repeated elements, and improving the efficiency and convenience of the task.
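  • A repetitive hinge crossing gesture of this kind can be thought of as accumulating one command per entry of the drag path into the fold region. The sketch below illustrates that accumulation; collectCrossings and the overlay span map are assumptions introduced here for illustration only.

```kotlin
// Illustrative sketch: within a single drag gesture, every new entry into the fold region at a
// shortcut overlay appends that overlay's command, so e.g. "copy" then "recolor" run in one gesture.
data class Pt(val x: Float, val y: Float)

fun collectCrossings(
    path: List<Pt>,
    foldYStart: Float,
    foldYEnd: Float,
    overlaySpans: Map<String, ClosedFloatingPointRange<Float>>
): List<String> {
    val commands = mutableListOf<String>()
    var wasInsideFold = false
    for (p in path) {
        val insideFold = p.y in foldYStart..foldYEnd
        if (insideFold && !wasInsideFold) {
            // A new entry into the fold region: record the overlay (if any) being crossed.
            overlaySpans.entries.firstOrNull { p.x in it.value }?.let { commands += it.key }
        }
        wasInsideFold = insideFold
    }
    return commands
}

fun main() {
    val spans = mapOf("copy" to (0f..300f), "recolor" to (300f..600f))
    // Drag down through "copy", then back up through "recolor", all in one gesture.
    val path = listOf(Pt(100f, 900f), Pt(120f, 1020f), Pt(140f, 1200f), Pt(420f, 1050f), Pt(430f, 900f))
    println(collectCrossings(path, 980f, 1100f, spans)) // [copy, recolor]
}
```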
  • FIG. 8 illustrates a second four instances 802, 804, 806, 808 in connection with a method for triggering a first action, in accordance with examples of the present disclosure.
  • the four instances 802, 804, 806, 808 may illustrate different operations of a foldable touchscreen device 100 in some example implementations of the method 400 of FIG. 4 or the method 600 of FIG. 6.
  • Each of the four instances 802, 804, 806, 808 shows a foldable touchscreen device 100 having a flexible touchscreen element 140 comprising a first partition 125, a second partition 130, and a fold region 135.
  • the triggered action may be related to windows management.
  • a windows management overlay 810 is displayed at the fold region 135 of each foldable touchscreen device 100 illustrated at the four instances 802, 804, 806, 808.
  • the windows management overlay 810 comprises four shortcut overlays.
  • the four instances 802, 804, 806, 808 of FIG. 8 relate to use of the fold region 135 as a portal for system windows management assistance on a foldable touchscreen device 100.
  • a first window 812 (i.e., the first object) is shown displayed at the first partition 125 of the flexible touchscreen element 140 of the foldable touchscreen device 100.
  • the first window 812 has been dragged along the first partition 125 and through the first windows management shortcut overlay of the windows management overlay 810 in the fold region 135.
  • the windows management overlay 810 may be described as a dynamic hinge shortcut overlay.
  • Each of the four shortcut overlays of the windows management overlay 810 may correspond to a different windows layout action.
  • the first shortcut overlay of the dynamic hinge shortcut overlay represents an “expand” function shortcut.
  • the first window 812 is shown expanded such that it occupies the entirety of the second partition 130.
  • an expand action may be completed in a single step by dragging an object towards the fold region 135 (hinge) to access a windows management shortcut overlay.
  • the third instance 806 illustrates the first window 812 after being dragged along the first partition 125 and through the second shortcut overlay of the windows management overlay 810 in the fold region 135.
  • the second shortcut overlay of the dynamic hinge shortcut overlay represents a “semi-expand” action shortcut.
  • the first window 812 is shown semi-expanded such that it occupies the leftmost half of the second partition 130.
  • a semi-expand action may be completed in a single step by dragging an object (such as the first window 812) towards the fold region 135 (hinge) to access the windows management overlay 810.
  • the fourth instance 808 illustrates the first window 812 after being dragged along the first partition 125 and through the third shortcut overlay of the windows management overlay 810 in the fold region 135.
  • the third shortcut overlay of the dynamic hinge shortcut overlay represents a “close” action shortcut.
  • the first window 812 is shown having closed.
  • a “close” action may be completed in a single step by dragging an object towards the fold region (hinge) to access a corresponding windows management shortcut overlay.
  • the fold region 135 may be used as a portal for system window management assistance on a foldable touchscreen device 100.
  • a user may access a window management overlay.
  • the windows management overlay may allow the user to easily rearrange the layout of windows on the device by dragging one or more windows across various layout commands of the windows management overlay.
  • the hinge portal provides a natural transition for managing windows across screens, addressing a common issue with foldable personal computers (PCs), which may encounter difficulty in moving windows across the hinge.
  • the interactions described with respect to FIG. 8 may streamline the process of organizing windows and improve the overall efficiency of windows management on foldable touchscreen devices. Further, it should be understood that although FIG. 8 illustrates examples where a first action is triggered by dragging the window, in some examples the drag gesture may further pass through or near one or more additional shortcut overlays in the fold region, to cause triggering of one or more additional actions to be performed.
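  • Conceptually, the windows management overlay 810 maps each of its zones to a layout action applied to the dragged window. The sketch below shows one hedged way to express that mapping; LayoutAction, OverlayZone and applyLayoutAction are illustrative names only, not the disclosed implementation.

```kotlin
// Illustrative mapping from windows-management overlay zones to layout actions.
enum class LayoutAction { EXPAND, SEMI_EXPAND, CLOSE, MOVE }

data class OverlayZone(val xStart: Float, val xEnd: Float, val action: LayoutAction)

class Window(var title: String, var state: String = "floating")

fun applyLayoutAction(window: Window, dropX: Float, zones: List<OverlayZone>) {
    when (zones.firstOrNull { dropX in it.xStart..it.xEnd }?.action) {
        LayoutAction.EXPAND -> window.state = "expanded to fill the second partition"
        LayoutAction.SEMI_EXPAND -> window.state = "occupying half of the second partition"
        LayoutAction.CLOSE -> window.state = "closed"
        LayoutAction.MOVE -> window.state = "moved to the second partition"
        null -> Unit // dropped outside any overlay zone: no layout change
    }
}

fun main() {
    val zones = listOf(
        OverlayZone(0f, 150f, LayoutAction.EXPAND),
        OverlayZone(150f, 300f, LayoutAction.SEMI_EXPAND),
        OverlayZone(300f, 450f, LayoutAction.CLOSE),
        OverlayZone(450f, 600f, LayoutAction.MOVE)
    )
    val window = Window("Browser")
    applyLayoutAction(window, dropX = 200f, zones = zones)
    println("${window.title}: ${window.state}") // Browser: occupying half of the second partition
}
```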
  • FIGs. 9A-9E illustrate a first five instances 902, 904, 906, 908, 910 in connection with an example method for triggering a first action, in accordance with examples of the present disclosure.
  • the five instances 902, 904, 906, 908, 910 may illustrate operations of a foldable touchscreen device 100 in some example implementations of the method 400 of FIG. 4 or the method 600 of FIG. 6.
  • Each of the five instances 902, 904, 906, 908, 910 shows a foldable touchscreen device 100 having a flexible touchscreen element 140 comprising a first partition 125, a second partition 130, and a fold region 135.
  • the foldable touchscreen device 100 is shown in a “book mode” orientation in which the fold region 135 is oriented vertically instead of horizontally.
  • the flexible touchscreen element 140 is shown displaying online content via a web browser application being executed on the foldable touchscreen device 100.
  • the triggered action may be related to browsing an online website.
  • FIG. 9A illustrates instance 902, which shows the display of a list view of objects at the first and second partitions 125, 130.
  • Each of the objects is represented by a displayed image and may represent a hyperlink.
  • FIG. 9B illustrates instance 904, which shows the display of a list view of objects at the first and second partitions 125, 130.
  • Instance 904 additionally illustrates the finger of a user hand 912 initiating a drag operation beginning at a first one of the objects at the first partition 125.
  • Three shortcut overlays 914 are displayed at the fold region 135.
  • the three shortcut overlays 914 may correspond to shortcuts for an “App Launcher” feature and/or a “Parallel View” feature.
  • An “App Launcher” feature and/or a “Parallel View” feature may allow a user to browse a list view of objects at the second partition 130, while opening detailed views of one or more of the objects at the first partition 125, among other functions.
  • the three shortcut overlays 914 may correspond to actions for opening detailed views of one or more selected objects.
  • the display of the three shortcut overlays 914 may result from a detection of a drag gesture at a location of the first partition 125 within a proximity of the fold region 135.
  • the shortcut overlays 914 may be displayed prior to the detection of the drag gesture, or may be displayed after the drag gesture has been initiated but prior to the drag gesture being in proximity of the fold region 135.
  • the path of the drag operation is shown extending from the first one of the objects at the first partition 125 toward a first shortcut overlay of the three shortcut overlays 914 at the fold region 135.
  • FIG. 9C illustrates instance 906, which shows the foldable touchscreen device 100 after the execution of the action corresponding to the first shortcut overlay.
  • the second partition 130 displays a list view of objects. Each of the objects is represented by a displayed image and may represent a hyperlink.
  • the first partition 125 displays a detailed view corresponding to the first one of the objects.
  • FIG. 9D illustrates instance 908, which shows the display of a list view of objects at the second partition 130, and illustrates the display of a detailed view of three objects at the first partition 125.
  • Instance 908 additionally illustrates the finger of the user hand 912 initiating a drag operation beginning at a first one of the objects at the first partition 125.
  • the fold region 135 displays four shortcut overlays 916.
  • the four shortcut overlays 916 may correspond to browser actions such as “unlink”, “close”, “flip”, and/or “go back”.
  • the display of the four shortcut overlays 916 may result from a detection of a drag gesture at a location of the first partition 125 within a proximity of the fold region 135.
  • the shortcut overlays 916 may be displayed prior to the detection of the drag gesture, or may be displayed after the drag gesture has been initiated but prior to the drag gesture being in proximity of the fold region 135.
  • the path of the drag operation is shown extending from the first one of the objects at the first partition 125 toward the fourth shortcut overlay of the four shortcut overlays 916 at the fold region 135.
  • the fourth shortcut overlay of the four shortcut overlays 916 corresponds to a “close” action.
  • FIG. 9E illustrates instance 910, which shows the foldable touchscreen device 100 after the execution of the “close” action.
  • the first partition 125 displays a detailed view of two objects, the detailed view of the first one of the objects having been removed as a result of the “close” action.
  • the second partition 130 displays a list view of objects.
  • the list view of objects includes the first one of the objects.
  • the four shortcut overlays 916 have been removed from the fold region 135, for example in response to detecting cessation of the drag gesture. In other examples, the shortcut overlays 916 may be displayed in the fold region 135 even in absence of a detected drag gesture.
  • the hinge may be used as a portal to optimize a “Parallel View” and/or “app launcher” feature of a browser application.
  • “Parallel View” and/or “app Launcher” features may be extended to foldable touchscreen devices, allowing users to easily manage multiple sub-pages at the same time.
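  • a minimal sketch of the context-dependent overlay sets described with respect to FIGs. 9A-9E is given below, assuming (for illustration only) that the overlay set is chosen from the kind of object being dragged; the enum and the labels “Open left” and “Open right” are hypothetical, while the remaining labels follow the description above.

      // Illustrative only: choose which shortcut overlays to display in the fold
      // region based on what the drag gesture started on.
      enum class DraggedItem { LIST_OBJECT, DETAIL_VIEW }

      fun overlaysFor(item: DraggedItem): List<String> = when (item) {
          // Dragging a hyperlink from the list view (FIGs. 9A-9C): three overlays.
          DraggedItem.LIST_OBJECT -> listOf("Open left", "Open right", "Parallel View")
          // Dragging an open detail view (FIGs. 9D-9E): four browser-action overlays.
          DraggedItem.DETAIL_VIEW -> listOf("Unlink", "Close", "Flip", "Go back")
      }

      fun main() {
          println(overlaysFor(DraggedItem.LIST_OBJECT))
          println(overlaysFor(DraggedItem.DETAIL_VIEW))
      }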
  • FIGs. 10A-10E illustrate a second five instances 1002, 1004, 1006, 1008, 1010 in connection with a method for triggering a first action, in accordance with examples of the present disclosure.
  • the five instances 1002, 1004, 1006, 1008, 1010 may illustrate a sequence of operations of a foldable touchscreen device 100 in an example implementation of the method 400 of FIG. 4 or the method 600 of FIG. 6.
  • Each of the five instances 1002, 1004, 1006, 1008, 1010 shows a foldable touchscreen device 100 having a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135.
  • the triggered action may be related to management of layers in, for example, a design application.
  • a first ordering of layers 1012 is displayed at the first partition 125 of the flexible touchscreen element 140 of the foldable touchscreen device 100.
  • a finger of a user hand 912 is shown touching a first layer of the ordering of layers 1012.
  • the first layer is the front most layer of the ordering of layers 1012.
  • the first layer is an object.
  • the finger of the user hand 912 is shown initializing a drag gesture of the first layer towards the fold region 135.
  • one or more shortcut overlays 1014 are displayed at the fold region 135.
  • the finger of the user hand 912 is shown touching a dragged copy of the first layer at the location of one of the one or more shortcut overlays 1014.
  • the one of the one or more shortcut overlays 1014 corresponds to a “Layer” action shortcut.
  • the finger of the user hand 912 is shown continuing the drag gesture along the fold region 135 at the “Layer” action shortcut to choose and effect an adjustment of the ordering of layers 1012.
  • the first layer is shown displayed at the fold region 135 at the location of the finger of the user hand 912 performing the drag gesture.
  • the first layer is shown being dragged towards the remaining layers of the first ordering of layers 1012.
  • the first layer of the first ordering of layers 1012 is shown being dragged to the backmost position of the remaining layers of the first ordering of layers 1012.
  • the finger of the user hand 912 is shown at the location of the first layer.
  • examples of the present disclosure may provide the ability to adjust the position/ordering of layers and other objects in a design application (e.g., a Powerpoint™ application).
  • Examples of the present disclosure may avoid a requirement to use a combination of keyboard shortcut keys, right-click menus, and/or ribbon menus to perform such actions, which may be time-consuming and inconvenient.
  • Examples of the present disclosure may allow a user to simply drag an object to a layer command at the fold region and then slide to adjust a layer order, then relocate the object.
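  • the slide-to-adjust behaviour described above might be modelled, for illustration only, by mapping the slide distance along the fold region to an offset in the layer list; the step size and function names in the following sketch are assumptions, not part of the disclosure.

      // Illustrative sketch: map the slide distance along the fold region to a new
      // index in the layer ordering.
      fun <T> reorderLayer(layers: MutableList<T>, fromIndex: Int, slidePx: Float,
                           pxPerStep: Float = 80f): MutableList<T> {
          val steps = (slidePx / pxPerStep).toInt()
          val toIndex = (fromIndex + steps).coerceIn(0, layers.lastIndex)
          val layer = layers.removeAt(fromIndex)
          layers.add(toIndex, layer)
          return layers
      }

      fun main() {
          val layers = mutableListOf("front", "middle", "back")
          // Sliding roughly two "steps" along the fold region moves the front layer
          // to the backmost position, as in FIGs. 10C-10E.
          println(reorderLayer(layers, fromIndex = 0, slidePx = 160f))   // [middle, back, front]
      }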
  • FIG. 11 is a flowchart of an example method 1100 for inserting and removing a virtual bookmark at a currently displayed location of an electronic document, in accordance with examples of the present disclosure.
  • the method 1100 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1).
  • the operations 1110 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
  • the system detects a drag gesture in a first direction along the fold region.
  • the first direction may be substantially parallel to a longitudinal axis of the fold region.
  • the system inserts a virtual bookmark at a currently displayed location of the electronic document.
  • the system removes the virtual bookmark from the currently displayed location of the electronic document.
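  • a minimal sketch of method 1100 is given below, assuming a “book mode” orientation in which the fold axis is vertical; the data types, the direction test, and the bookmark store are assumptions made for the sketch.

      // Illustrative sketch of method 1100: a drag parallel to the fold region's
      // longitudinal axis inserts a bookmark; a drag in the opposite direction
      // removes it.
      import kotlin.math.abs

      data class DragVector(val dx: Float, val dy: Float)

      class BookmarkController {
          private val bookmarks = mutableSetOf<Int>()   // bookmarked page numbers

          fun onFoldRegionDrag(drag: DragVector, currentPage: Int) {
              // Assume "book mode", so the fold axis is vertical: only mostly
              // vertical drags along the fold region qualify.
              if (abs(drag.dy) < 2 * abs(drag.dx)) return
              if (drag.dy > 0) bookmarks.add(currentPage)    // downward drag: insert
              else bookmarks.remove(currentPage)             // upward drag: remove
              println("Bookmarked pages: $bookmarks")
          }
      }

      fun main() {
          val controller = BookmarkController()
          controller.onFoldRegionDrag(DragVector(dx = 3f, dy = 120f), currentPage = 42)    // insert
          controller.onFoldRegionDrag(DragVector(dx = -2f, dy = -110f), currentPage = 42)  // remove
      }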
  • FIG. 12 illustrates a foldable touchscreen device 100 providing a first display of an electronic document 1204, in accordance with examples of the present disclosure.
  • FIG. 12 may illustrate an example partial implementation of the method 1100 of FIG. 11.
  • the foldable touchscreen device has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135.
  • the foldable touchscreen device 100 is shown in “book mode” orientation.
  • a longitudinal axis 1202 of the fold region 135 is shown extending through the illustration of the foldable touchscreen device 100.
  • the flexible touchscreen element 140 is shown displaying an electronic document 1204 (e.g., an electronic book).
  • a first page of the electronic document 1204 is shown displayed at the first partition 125 and a second page of the electronic document 1204 is shown displayed at the second partition 130.
  • a finger of a user hand 912 is shown initiating a drag gesture at the fold region 135.
  • the finger of the user hand 912 is shown touching the fold region 135 near the top of the flexible touchscreen element 140, and a perforated downward facing arrow 1206 is displayed to indicate a downward direction of the drag gesture.
  • a drag gesture is shown as being initiated in a first direction along the fold region 135.
  • the first direction is substantially parallel to the longitudinal axis 1202 of the fold region 135.
  • a virtual bookmark may be inserted at the currently displayed position of the electronic document 1204.
  • the use of a drag gesture to insert a bookmark in an electronic document may be intuitive and natural for a user, and may provide a seamless and cohesive experience.
  • a bookmark may be removed from the electronic document.
  • the system may, in response to detecting a drag gesture in a second direction along the fold region substantially opposite the first direction, remove the virtual bookmark.
  • FIG. 13 illustrates a foldable touchscreen device 100 providing a second display of an electronic document 1204, in accordance with examples of the present disclosure.
  • FIG. 13 may illustrate an example partial implementation of the method 1100 of FIG. 11.
  • the foldable electronic device has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135.
  • the foldable touchscreen device 100 is shown in a “book mode” orientation.
  • a longitudinal axis 1202 of the fold region 135 is shown extending through the illustration of the foldable touchscreen device 100.
  • the flexible touchscreen element 140 is shown displaying an electronic document 1204.
  • a first page of the electronic document 1204 is shown displayed at the first partition 125 and a second page of the electronic document 1204 is shown displayed at the second partition 130.
  • a virtual bookmark 1302 is displayed along an upper portion of the fold region 135.
  • the virtual bookmark 1302 may have been inserted as a result of the operations illustrated in FIG. 12.
  • a finger of a user hand 912 is shown initiating a drag gesture at the fold region 135.
  • the finger of the user hand 912 is shown touching the fold region 135 near the bottom of the virtual bookmark 1302, and a perforated upward facing arrow 1304 is displayed to indicate an upward direction of the drag gesture.
  • a drag gesture is shown as being initiated in a second direction, substantially opposite to the first direction, along the fold region 135. As shown, the second direction is substantially parallel to the longitudinal axis 1202 of the fold region 135.
  • the system may remove the virtual bookmark 1302.
  • FIG. 14 is a flowchart of an example method 1400 of displaying content using dual screen mode, in accordance with examples of the present disclosure.
  • the method 1400 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1).
  • the operations 1410 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
  • the system displays content using full screen mode.
  • the system detects a drag gesture in a first direction along the fold region.
  • the system displays the content using dual screen mode.
  • FIG. 15 illustrates a foldable touchscreen device 100 displaying content using immersive fullscreen mode, in accordance with examples of the present disclosure.
  • the foldable touchscreen device 100 has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135.
  • the flexible touchscreen element 140 is shown displaying content in full screen mode.
  • Two fingers of a user hand 912 are shown initiating a drag gesture at the fold region 135.
  • a perforated line 1502 having an arrow is displayed at the fold region 135 indicating the direction of the drag gesture.
  • FIG. 16 shows that, as a result of the detected drag gesture performed in FIG. 15, content is now displayed on the flexible touchscreen element 140 in dual screen mode.
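  • the full-screen-to-dual-screen transition of method 1400 and FIGs. 15-16 could be sketched, for illustration only, as a small display-mode controller invoked when a drag along the fold region is detected; the enum and class names below are assumptions.

      // Illustrative sketch of method 1400: a drag along the fold region switches
      // the display from full screen mode to dual screen mode.
      enum class DisplayMode { FULL_SCREEN, DUAL_SCREEN }

      class DisplayModeController(var mode: DisplayMode = DisplayMode.FULL_SCREEN) {
          // Called when a drag substantially parallel to the fold axis is detected.
          fun onFoldRegionDrag() {
              if (mode == DisplayMode.FULL_SCREEN) {
                  mode = DisplayMode.DUAL_SCREEN
                  println("Content re-laid out across the first and second partitions")
              }
          }
      }

      fun main() {
          val controller = DisplayModeController()
          controller.onFoldRegionDrag()
          println(controller.mode)   // DUAL_SCREEN
      }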
  • FIG. 17 is a flowchart of an example method 1700 for displaying a defined region containing application shortcuts, in accordance with examples of the present disclosure.
  • the method 1700 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1).
  • the operations 1710 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
  • the system detects a first touch gesture at the fold region.
  • the first touch gesture may be a multi-finger gesture, such as a two-finger tap.
  • the system displays a defined region containing application shortcuts.
  • a defined region containing application shortcuts such as an “app dock” may be triggered to appear at the fold region, allowing the user to easily open applications on either screen.
  • a touch gesture may provide a convenient way for users to access and launch applications without having to navigate through multiple menus or screens.
  • This example may be particularly useful in foldable touchscreen devices that offer multitasking capabilities, such as foldable laptops or tablets.
  • By providing a simple and intuitive gesture for activating an app dock, users may easily and quickly access the applications they need, improving their productivity and efficiency.
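  • purely as an illustration of method 1700, the sketch below classifies a touch event as an app-dock trigger when it is a multi-finger tap inside the fold region; the event fields and the two-finger threshold are assumptions.

      // Illustrative sketch of method 1700: a multi-finger tap inside the fold
      // region reveals an "app dock" of application shortcuts.
      data class TouchEvent(val pointerCount: Int, val insideFoldRegion: Boolean, val isTap: Boolean)

      fun shouldShowAppDock(event: TouchEvent): Boolean =
          event.isTap && event.pointerCount >= 2 && event.insideFoldRegion

      fun main() {
          val twoFingerTap = TouchEvent(pointerCount = 2, insideFoldRegion = true, isTap = true)
          if (shouldShowAppDock(twoFingerTap)) println("Show app dock at the fold region")
      }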
  • FIG. 18 is a flowchart of an example method 1800 for triggering a first action corresponding to one or more application control overlays, in accordance with examples of the present disclosure.
  • the method 1800 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1).
  • the operations 1810 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
  • the system displays, at the fold region, one or more application control overlays.
  • the application control overlay(s) may provide for control of a currently executed application.
  • the system detects a touch gesture at a location of the fold region corresponding to the one or more application control overlays.
  • the touch gesture may be a multi-finger gesture.
  • the system triggers a first action corresponding to the one or more application control overlays.
  • FIG. 19 illustrates a foldable touchscreen device providing a first display of a video application, in accordance with examples of the present disclosure.
  • FIG. 19 illustrates an example implementation of the method 1800 of FIG. 18.
  • the foldable touchscreen device 100 has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135.
  • the flexible touchscreen element 140 is shown displaying content in dual screen mode, and the first partition 125 is shown displaying a currently executed video player application.
  • First and second application control overlays 1902, 1904 are displayed at the fold region 135.
  • the first application control overlay 1902 is a brightness control overlay and the second application control overlay 1904 is a volume control overlay.
  • a finger of a user hand 912 is shown touching the fold region 135 at a location of the display of the second application control overlay 1904.
  • the application may increase, or decrease, the volume of the video player application.
  • the application may increase, or decrease, the brightness of the video player application.
  • FIG. 20 illustrates a foldable touchscreen device 100 providing a second display of a video application, in accordance with examples of the present disclosure.
  • the foldable touchscreen device 100 has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135.
  • the foldable touchscreen device 100 is shown in tent mode orientation.
  • the first partition 125 and the fold region 135 of the flexible touchscreen element are visible, and the foldable touchscreen device 100 is shown executing a video player application.
  • First and second application control overlays 2002, 2004 are displayed at the fold region 135, which, due to the tent mode orientation, is at the top of the foldable touchscreen device 100.
  • the first application control overlay 2002 is a brightness control overlay and the second application control overlay 2004 is a volume control overlay.
  • a finger of a user hand 912 is shown touching the fold region 135 at a location of the display of the second application control overlay 2004. Touch gestures may be detected in the fold region 135 to trigger application control actions corresponding to the application control overlays 2002, 2004 in a similar manner to that described above with respect to FIG. 19.
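  • the application control overlays of FIGs. 19-20 could be sketched, for illustration only, as a handler that adjusts a volume or brightness value when a slide is detected on the corresponding overlay in the fold region; the class, step semantics, and 0..100 ranges below are assumptions.

      // Illustrative sketch of method 1800 applied to a video player: a slide on a
      // control overlay in the fold region adjusts the corresponding value.
      class VideoPlayerControls(var volume: Int = 50, var brightness: Int = 50) {
          fun onOverlaySlide(overlay: String, deltaSteps: Int) {
              when (overlay) {
                  "volume" -> volume = (volume + deltaSteps).coerceIn(0, 100)
                  "brightness" -> brightness = (brightness + deltaSteps).coerceIn(0, 100)
              }
          }
      }

      fun main() {
          val controls = VideoPlayerControls()
          controls.onOverlaySlide("volume", +10)       // slide toward one end of the overlay
          controls.onOverlaySlide("brightness", -5)    // slide toward the other end
          println("volume=${controls.volume}, brightness=${controls.brightness}")   // 60, 45
      }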
  • FIG. 21 is a flowchart of an example method 2100 for triggering a first action corresponding to a touch gesture and a location corresponding to a thumb input overlay, in accordance with examples of the present disclosure.
  • the method 2100 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1).
  • the operations 2110 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
  • the system detects a grip posture of the foldable touchscreen device.
  • a grip posture may refer to a position of a user’s hands with respect to a partially folded touchscreen device 100 such that the user’s thumbs are positioned near the fold region 135 and one or more of the remaining fingers are beneath the device 100, as shown for example, in FIGs. 22 and 23.
  • a grip posture may be detected using one or more of a variety of methods, including capacitive touchscreen sensing with or without use of internal inertial measurement unit (IMU) sensors and/or a camera.
  • capacitive touchscreen sensing may detect a user’s thumbs near the fold region 135, and/or the palms of two hands at corresponding positions of the first partition 125 of the flexible touchscreen element 140.
  • an internal IMU may detect both a partially folded posture of the touchscreen device 100, and the initialization and maintenance of the second partition 130 of the flexible touchscreen element 140 in a raised position.
  • the system displays, at the fold region, a thumb input overlay.
  • the system detects a touch gesture at a location of the fold region corresponding to the thumb input overlay.
  • at the operation 2140, the system triggers a first action corresponding to the touch gesture and the location corresponding to the thumb input overlay.
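  • a minimal sketch of the grip posture check of method 2100 is shown below, combining touch contacts and a fold-angle reading as suggested above; the signal names and thresholds are assumptions, and a real implementation may weigh these signals differently.

      // Illustrative sketch of the grip posture check in method 2100: thumbs near
      // the fold region, palms on the first partition, and a fold angle indicating
      // a partially folded device.
      data class PostureSignals(
          val thumbsNearFoldRegion: Int,    // thumb contacts sensed near the fold region
          val palmsOnFirstPartition: Int,   // palm contacts sensed on the first partition
          val foldAngleDegrees: Float       // from an internal IMU and/or hinge sensor
      )

      fun isGripPosture(s: PostureSignals): Boolean =
          s.thumbsNearFoldRegion >= 2 &&
          s.palmsOnFirstPartition >= 1 &&
          s.foldAngleDegrees in 60f..170f   // partially folded: neither flat nor closed

      fun main() {
          val signals = PostureSignals(thumbsNearFoldRegion = 2, palmsOnFirstPartition = 2,
                                       foldAngleDegrees = 120f)
          if (isGripPosture(signals)) println("Display thumb input overlay at the fold region")
      }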
  • FIG. 22 illustrates a foldable touchscreen device 100 executing a video game application, in accordance with examples of the present disclosure.
  • FIG. 22 illustrates an example implementation of the method 2100 of FIG. 21.
  • the foldable touchscreen device has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135.
  • the foldable touchscreen device 100 is shown in laptop orientation, in which the fold region 135 is oriented horizontally and the foldable touchscreen device 100 is semi-folded.
  • the foldable touchscreen device 100 is shown providing a display of a currently executed video game application in the second partition 130.
  • a thumb input overlay 2202 comprising virtual buttons is shown displayed near the bottom of the flexible touchscreen element 140, in the first partition 125.
  • a user’s left hand 2204 and the user’s right hand 2206 are shown gripping the bottom of the foldable touchscreen device 100, such that the user’s left thumb 2208 and the user’s right thumb 2210 are positioned in proximity to the virtual buttons of the thumb input overlay 2202 displayed in the first partition 125.
  • when a foldable touchscreen device (e.g., a smartphone) is held in such a grip posture, the fold region may be easily reachable by a user’s thumbs, and may thus provide a natural input space for tapping and/or sliding.
  • by using dynamic hinge shortcuts at the fold region, additional dimensions may be added to the game control in an intuitive way.
  • a slide gesture at the fold region may be used to zoom in, to zoom out, or to change weapons.
  • Using dynamic hinge shortcuts at the fold region may provide a more immersive and intuitive gaming experience on a foldable touchscreen device, such as on a foldable smartphone.
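  • as a non-limiting illustration of game control via dynamic hinge shortcuts, the sketch below dispatches fold-region gestures to zoom and weapon-change actions; the gesture classes, zoom range, and weapon count are assumptions.

      // Illustrative sketch: dispatch fold-region gestures to game actions, e.g.
      // slide to zoom and tap to change weapons.
      sealed class FoldGesture {
          data class Slide(val delta: Float) : FoldGesture()
          object Tap : FoldGesture()
      }

      class GameControls(var zoom: Float = 1.0f, var weaponIndex: Int = 0) {
          fun onFoldGesture(gesture: FoldGesture) {
              when (gesture) {
                  is FoldGesture.Slide -> zoom = (zoom + gesture.delta / 100f).coerceIn(1.0f, 4.0f)
                  is FoldGesture.Tap -> weaponIndex = (weaponIndex + 1) % 3
              }
          }
      }

      fun main() {
          val game = GameControls()
          game.onFoldGesture(FoldGesture.Slide(150f))   // zoom in
          game.onFoldGesture(FoldGesture.Tap)           // cycle to the next weapon
          println("zoom=${game.zoom}, weapon=${game.weaponIndex}")   // zoom=2.5, weapon=1
      }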
  • the method 2100 of FIG. 21 may be performed when the foldable touchscreen device is executing a camera application.
  • FIG. 23 illustrates a foldable touchscreen device 100, executing a camera application, in accordance with examples of the present disclosure.
  • the foldable touchscreen device 100 is shown having a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135.
  • the foldable touchscreen device 100 is shown in laptop orientation and displaying a viewport of a currently executed camera application in the first partition 125.
  • a thumb input overlay 2302 comprising first and second application control overlays 2304, 2306 is shown displayed at the fold region 135 of the flexible touchscreen element 140.
  • the first application control overlay 2304 is an exposure control overlay and the second application control overlay 2306 is a zoom control overlay.
  • Other user interface elements of the camera application may be displayed in the second partition 130.
  • a user’s left hand 2204 and the user’s right hand 2206 are shown gripping the bottom of the foldable touchscreen device 100, such that the user’s left thumb 2208 and the user’s right thumb 2210 are positioned in proximity to the first and second application control overlays, respectively.
  • dynamic sliding control may be provided in the fold region 135 for camera application actions such as “zoom” and “exposure”, for example.
  • FIGs. 22 and 23 illustrate examples of how application-specific control overlays may be displayed in the fold region 135, in a manner that can be intuitively and conveniently accessible to a user who is holding the foldable touchscreen device 100 in a semi-folded position (e.g., a laptop orientation). It should be understood that other implementations of the method 2100 when executing other applications may be possible, without being limited to gaming or camera applications.
  • FIG. 24 is a flowchart of an example method 2400 of modifying one of one or more user modifiable objects, in accordance with examples of the present disclosure.
  • the method 2400 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1).
  • the operations 2410 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
  • the system displays one or more user modifiable objects.
  • the system receives a selection of one of the one or more user modifiable objects.
  • the system displays, at the fold region, one or more shortcut overlays corresponding to the one of the one or more user modifiable objects.
  • in response to selection of one of the one or more shortcut overlays, the system modifies the one of the one or more user modifiable objects according to the action corresponding to the selected one of the one or more shortcut overlays.
  • FIGs. 25-26 illustrate an example implementation of the method 2400 of FIG. 24.
  • FIG. 25 illustrates a foldable touchscreen device 100 providing a first display of a word processing application, in accordance with examples of the present disclosure.
  • the foldable touchscreen device 100 has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135.
  • the foldable touchscreen device 100 is shown in full screen mode and in laptop orientation.
  • the foldable touchscreen device 100 is shown executing a word processing application and displaying text.
  • the text may include user modifiable objects such as a selected portion of text.
  • a first section of text 2502 is shown as selected text, indicated by highlighting over the first section of text 2502.
  • a finger of a user hand 912 is shown touching the flexible touchscreen element 140 at a location of the first section of text 2502.
  • FIG. 26 illustrates that, following selection of the first section of text 2502, one or more shortcut overlays 2602 are shown displayed at the fold region 135.
  • the one or more shortcut overlays 2602 correspond to actions that may be applied to the selection of text 2502.
  • a finger of a user hand 912 is shown touching the flexible touchscreen element 140 at a location of the one or more shortcut overlays 2602.
  • the provision of the one or more shortcut overlays at the fold region may provide the user with easy access to common text editing actions such as copy, paste, and formatting options (such as bold, italic, and underline) by simply selecting a section of text to activate the one or more shortcut overlays at the fold region. Once the overlay is activated, the user may simply choose one of the one or more shortcut overlays representing a desired action in order to adjust the text as needed. In this way, a user may quickly and efficiently edit text without having to navigate through multiple menus or use keyboard shortcuts.
  • the one or more shortcut overlays may comprise the most relevant and useful shortcuts, making the editing process even more efficient and intuitive.
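  • purely as an illustration of method 2400 applied to text editing, the sketch below populates the fold region with shortcut labels when text is selected and applies a chosen shortcut to the selection; the labels and the markup applied are assumptions.

      // Illustrative sketch of method 2400 for text editing: selecting text
      // populates the fold region with shortcut labels, and choosing one modifies
      // the selection.
      fun applyShortcut(selected: String, shortcut: String): String = when (shortcut) {
          "bold" -> "**$selected**"
          "italic" -> "*$selected*"
          "underline" -> "_${selected}_"
          else -> selected
      }

      fun main() {
          val selection = "fold region"
          val overlays = listOf("copy", "bold", "italic", "underline")   // shown at the fold region
          println("Overlays: $overlays")
          println(applyShortcut(selection, "bold"))   // **fold region**
      }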
  • FIGs. 27-28 illustrate another example implementation of the method 2400 of FIG. 24.
  • FIG. 27 illustrates a foldable touchscreen device 100 providing a first display of a design application, in accordance with examples of the present disclosure.
  • the foldable touchscreen device 100 has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135.
  • the foldable touchscreen device 100 is shown executing a design application.
  • a value input box 2702 is shown displayed at an upper right portion of the flexible touchscreen element 140.
  • a finger of a user hand 912 is shown touching an element of the value input box 2702.
  • a dynamic slider may be displayed in the fold region 135, as will be described with reference to FIG. 28.
  • FIG. 28 illustrates that, following activation of the value input box 2702, a dynamic slider 2802 is displayed at the fold region 135. A finger of a user hand 912 is shown touching the dynamic slider 2802.
  • the dynamic slider 2802 at the fold region 135 may only be activated when the value input box 2702 is selected, ensuring that the dynamic slider 2802 is only displayed in the fold region 135 when needed by a user.
  • Other context-specific action shortcuts may be similarly displayed in the fold region 135, depending on current user selections and/or currently activated functions of the application. As a result, user interaction with a design application may be efficient and intuitive.
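  • a minimal sketch of the dynamic slider of FIGs. 27-28 is given below, assuming a simple linear mapping from the normalized finger position along the fold region to the value of the selected input box; the value range shown is hypothetical.

      // Illustrative sketch: while a numeric value input box is selected, a dynamic
      // slider in the fold region maps the normalized finger position along the
      // fold to the box's value.
      class DynamicSlider(private val min: Float, private val max: Float) {
          // `t` is the normalized finger position along the fold region, 0.0..1.0.
          fun valueAt(t: Float): Float = min + (max - min) * t.coerceIn(0f, 1f)
      }

      fun main() {
          val slider = DynamicSlider(min = 0f, max = 360f)   // e.g. a hypothetical rotation field
          println(slider.valueAt(0.25f))   // 90.0
          println(slider.valueAt(1.2f))    // clamped to 360.0
      }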
  • the fold region, or “hinge” of a foldable touchscreen device may be exploited to provide for additional user utility and enjoyment.
  • Embodiments disclosed herein describe the provision, at the hinge, of a dynamic portal of shortcuts which can adapt to a user’s input and to system and/or application events. These shortcuts may reduce or eliminate the need to click through multiple steps within multiple menus in order to perform commonly executed tasks.
  • Embodiments described herein may facilitate Windows™ management and multitasking, and may be used in a variety of application settings, including presentation applications (such as Powerpoint™), e-reading applications, video applications, camera applications, video game applications, and text editing applications.
  • the embodiments described herein may be implemented in any combination.
  • a single device may be configured to implement any one or more or all embodiments in any combination.
  • Although the present disclosure is described, at least in part, in terms of methods, a person of ordinary skill in the art will understand that the present disclosure is also directed to the various components for performing at least some of the aspects and features of the described methods, be it by way of hardware components, software, or any combination of the two. Accordingly, the technical solution of the present disclosure may be embodied in the form of a software product.
  • a suitable software product may be stored in a pre-recorded storage device or other similar non-volatile or non-transitory computer readable medium, including DVDs, CD-ROMs, USB flash disk, a removable hard disk, or other storage media, for example.
  • the software product includes instructions tangibly stored thereon that enable a processing device (e.g., a personal computer, a server, or a network device) to execute examples of the methods disclosed herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Systems and methods are disclosed for triggering a first action based on user interactions with a fold region of a foldable touchscreen device. A foldable touchscreen device includes a flexible touchscreen element having a fold region separating first and second partitions of the flexible touchscreen element. The foldable touchscreen device includes a processor and a memory. The foldable touchscreen device may be caused to: display a first object in the first partition; detect, at a location of the first partition corresponding to the first object, an initialization of a drag gesture; detect a path of the drag gesture passing near or through a first one of one or more shortcut overlays displayed in the fold region; and trigger a first action corresponding to the first one of the one or more shortcut overlays.

Description

FOLDABLE DEVICE HINGE INTERACTION
FIELD
[0001] This disclosure relates generally to foldable touchscreen devices, and more specifically to systems and methods for providing functionality at the fold region of foldable, capacitive touchscreen devices.
BACKGROUND
[0002] Many foldable touchscreen devices are currently available, including smartphones, tablets, and laptops. As use of these devices becomes more prevalent, the demand for more functional space on these devices is increasing.
[0003] Although many of these devices now feature one continuous flexible capacitive touchscreen, the potential functionality of this feature has not been exploited. For example, the range of touch gestures on contemporary devices typically is limited to touch gestures on a single flat portion of the screen. Touch gestures utilizing the fold region (also known as the hinge) are not widely available and have been minimally explored.
[0004] Foldable touchscreen devices can be complex to use for multitasking, requiring the use of multiple controls and gestures to switch between applications (apps) and windows. Many foldable touchscreen devices struggle with windows management, making it difficult to switch between screens or to make windows full-screen.
[0005] With respect to smartphone devices in particular, touchscreen interaction at the fold region (also known as the hinge) has not been realized.
[0006] Improvements to the field are desired.
SUMMARY
[0007] A foldable touchscreen device, which may be a laptop, touchpad, e-reader, or a foldable smartphone, for example, may include a flexible display. A flexible display is a display that may be rolled without the displayed image or text being distorted. A foldable touchscreen device may include a flexible display that spans the fold region, or “hinge”, as well as the touchscreen partitions on adjacent sides of the fold region. As a result, the fold region may receive touch input, and may allow the touchscreen device to be adaptable to different usage scenarios. While it is often overlooked by manufacturers, who may minimize the hinge radius to prioritize other features such as the size of the touchscreen partitions and/or the overall design of the device, the hinge is an important element that can enhance user experience.
[0008] One potential use for the hinge is as a natural divider between the touchscreen partitions on a foldable device. In this capacity, the hinge can be useful for multitasking, as the adjacent touchscreen partitions (on either side of the hinge) can be used to display different applications or tasks side-by-side. The hinge can also be used as a bridge between the two touchscreen partitions, allowing for seamless transitions between different applications or tasks.
[0009] In addition to its practical uses, the hinge has unique characteristics that can be leveraged for accessibility purposes. For example, because the hinge is a physical feature that can be easily navigated by touch, it can be used to provide users with visual impairments with a more intuitive way to interact with a foldable touchscreen device.
[0010] Despite the potential benefits of the hinge, this region has been largely ignored in the design and interaction of foldable touchscreen devices. The placement of user interface (UI) elements at or near the hinge area is often avoided due to the possibility of distortion.
[0011] Embodiments of the present disclosure describe intuitive and novel interaction techniques for foldable touchscreen devices. In some examples, these techniques provide a dynamic portal of shortcuts that can adapt to user input and to system/application events, enabling the execution of frequently used tasks and commands without the need for navigating multiple menus.
[0012] Embodiments of the present disclosure may be used for various purposes, including windows management, shortcuts, and multitasking. In some examples, these functions may be facilitated more efficiently and effectively than current approaches.
[0013] In accordance with an aspect of the present disclosure, there is provided a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions. The touchscreen element is foldable at the fold region. The foldable touchscreen device further comprises a processor, and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: display a first object in the first partition; detect, at a location of the first partition corresponding to the first object, an initialization of a drag gesture; detect a path of the drag gesture passing near or through a first one of one or more shortcut overlays displayed in the fold region; and trigger a first action corresponding to the first one of the one or more shortcut overlays.
[0014] In some implementations, the foldable touchscreen device is further caused to, prior to detecting a path of the drag gesture passing near or through a first one of one or more shortcut overlays: detect, at a location of the first partition within a proximity of the fold region, the drag gesture; and display, at the fold region, the one or more shortcut overlays.
[0015] In some implementations, the one or more shortcut overlays is a windows management overlay.
[0016] In some implementations, the foldable touchscreen device is further caused to: detect a cessation of the drag gesture; and in response to detecting the cessation of the drag gesture, remove the one or more shortcut overlays from display in the fold region.
[0017] In some implementations, the one or more shortcut overlays is an app multiplier overlay.
[0018] In some implementations, the foldable touchscreen device is further caused to: detect the drag gesture crossing the fold region at a second one of the one or more shortcut overlays; and trigger a second action corresponding to the second one of the one or more shortcut overlays.
[0019] In accordance with another aspect of the present application, there is provided a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions. The touchscreen element is foldable at the fold region. The foldable touchscreen element further comprises a processor and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to, during execution of an e-reader application: detect a drag gesture in a first direction along the fold region, the first direction being substantially parallel to a longitudinal axis of the fold region; and insert a virtual bookmark at a currently displayed location of an electronic document.
[0020] In some implementations, the drag gesture begins at or near a first edge of the fold region.
[0021] In some implementations, the foldable touchscreen device is further caused to: detect a drag gesture in a second direction along the fold region substantially opposite to the first direction; and remove the virtual bookmark.
[0022] In accordance with yet another aspect of the present disclosure, there is provided a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions. The touchscreen element is foldable at the fold region. The foldable touchscreen element further comprises a processor and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: display content using full screen mode; detect a drag gesture in a first direction along the fold region, the first direction being substantially parallel to a longitudinal axis of the fold region; and display the content using dual screen mode.
[0023] In accordance with still yet another aspect of the present disclosure, there is provided a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions. The touchscreen element is foldable at the fold region. The foldable touchscreen element further comprises a processor and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: detect a first touch gesture at the fold region; and display a defined region containing application shortcuts.
[0024] In accordance with still yet another aspect of the present disclosure, there is provided a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions. The touchscreen element is foldable at the fold region. The foldable touchscreen element further comprises a processor and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to, during execution of a particular application: display, at the fold region, one or more application control overlays defined by the particular application; detect a touch gesture at a location of the fold region corresponding to one of the one or more application control overlays; and trigger a first action corresponding to the one of the one or more application control overlays.
[0025] In accordance with still yet another aspect of the present disclosure, there is provided a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions. The touchscreen element is foldable at the fold region. The foldable touchscreen element further comprises a processor and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: detect a grip posture of the foldable touchscreen device; in response to detecting the grip posture, display, at the fold region, a thumb input overlay; detect a touch gesture at a location of the fold region corresponding to the thumb input overlay; and trigger a first action corresponding to the touch gesture and the location corresponding to the thumb input overlay.
[0026] In some implementations, the foldable touchscreen device is executing a video game application and the first action is a video game control action.
[0027] In some implementations, the foldable touchscreen device is executing a camera application and the first action is a camera control action.
[0028] In accordance with still yet another aspect of the present disclosure, there is provided a foldable touchscreen device comprising a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions. The touchscreen element is foldable at the fold region. The foldable touchscreen element further comprises a processor and a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: display one or more user modifiable objects; receive a selection of one of the one or more user modifiable objects; display, at the fold region, one or more shortcut overlays corresponding to an action for modifying the one of the one or more user modifiable objects; and in response to selection of one of the one or more shortcut overlays, modify the one of the one or more user modifiable objects according to the action corresponding to the selected one of the one or more shortcut overlays.
[0029] In accordance with still yet another aspect of the present disclosure, there is provided a computer-implemented method comprising: displaying a first object in a first partition of a flexible touchscreen element; detecting, at a location of the first partition corresponding to the first object, an initialization of a drag gesture; detecting a path of the drag gesture passing near or through a first one of one or more shortcut overlays displayed in a fold region of the flexible touchscreen element; and triggering a first action corresponding to the first one of the one or more shortcut overlays.
[0030] In some implementations, the method further comprises: prior to detecting a path of the drag gesture passing near or through a first one of one or more shortcut overlays, detecting, at a location of the first partition within a proximity of the fold region, the drag gesture; and displaying, at the fold region, the one or more shortcut overlays.
[0031] In some implementations, the one or more shortcut overlays is a windows management overlay.
[0032] In some implementations, the method further comprises: detecting a cessation of the drag gesture; and in response to detecting the cessation of the drag gesture, removing the one or more shortcut overlays from display in the fold region.
[0033] In some implementations, the one or more shortcut overlays is an app multiplier overlay.
[0034] In some implementations, the method further comprises: detecting the drag gesture crossing the fold region at a second one of the one or more shortcut overlays; and triggering a second action corresponding to the second one of the one or more shortcut overlays.
[0035] In accordance with still yet another aspect of the present disclosure, there is provided a computer-implemented method comprising: detecting a drag gesture in a first direction along a fold region of a flexible touchscreen element, the first direction being substantially parallel to a longitudinal axis of the fold region; and inserting a virtual bookmark at a currently displayed location of an electronic document.
[0036] In some implementations, the drag gesture begins at or near a first edge of the fold region.
[0037] In some implementations, the method further comprises: detecting a drag gesture in a second direction along the fold region substantially opposite to the first direction; and removing the virtual bookmark.
[0038] In accordance with still yet another aspect of the present disclosure, there is provided a computer-implemented method comprising: displaying content on a flexible touchscreen element using full screen mode; detecting a drag gesture in a first direction along a fold region of the flexible touchscreen element, the first direction being substantially parallel to a longitudinal axis of the fold region; and displaying the content using dual screen mode.
[0039] In accordance with still yet another aspect of the present disclosure, there is provided a computer-implemented method comprising: detecting a first touch gesture at a fold region of a flexible touchscreen element; and displaying a defined region containing application shortcuts.
[0040] In accordance with still yet another aspect of the present disclosure, there is provided a computer-implemented method comprising: displaying, at a fold region of a flexible touchscreen element, one or more application control overlays defined by an application being executed; detecting a touch gesture at a location of the fold region corresponding to one of the one or more application control overlays; and triggering a first action corresponding to the one of the one or more application control overlays.
[0041] In accordance with still yet another aspect of the present disclosure, there is provided a computer-implemented method comprising: detecting a grip posture of a foldable touchscreen device, the foldable touchscreen device having a flexible touchscreen element; in response to detecting the grip posture, displaying, at a fold region of the flexible touchscreen element, a thumb input overlay; detecting a touch gesture at a location of the fold region corresponding to the thumb input overlay; and triggering a first action corresponding to the touch gesture and the location corresponding to the thumb input overlay.
[0042] In some implementations, the foldable touchscreen device is executing a video game application and the first action is a video game control action.
[0043] In some implementations, the foldable touchscreen device is executing a camera application and the first action is a camera control action.
[0044] In accordance with still yet another aspect of the present disclosure, there is provided a computer-implemented method comprising: displaying one or more user modifiable objects on a flexible touchscreen element; receiving a selection of one of the one or more user modifiable objects; displaying, at a fold region of the flexible touchscreen element, one or more shortcut overlays corresponding to an action for modifying the one of the one or more user modifiable objects; and in response to selection of one of the one or more shortcut overlays, modifying the one of the one or more user modifiable objects according to the action corresponding to the selected one of the one or more shortcut overlays.
BRIEF DESCRIPTION OF THE DRAWINGS
[0045] Reference will now be made, by way of example, to the accompanying drawings which show example embodiments of the present disclosure, and in which:
[0046] FIG. 1 illustrates a front view of a first example foldable touchscreen device, in accordance with examples of the present disclosure;
[0047] FIG. 2 is a high-level operation diagram of an example computing system, in accordance with examples of the present disclosure;
[0048] FIG. 3 depicts a simplified organization of software components that may be stored in memory of the example computing system of FIG. 2, in accordance with examples of the present disclosure;
[0049] FIG. 4 is a flowchart of an example method for triggering a first action, in accordance with examples of the present disclosure;
[0050] FIG. 5 illustrates three instances in connection with the example method for triggering a first action, in accordance with examples of the present disclosure;
[0051] FIG. 6 is a flowchart of an example method for triggering a second action, in accordance with examples of the present disclosure;
[0052] FIG. 7 illustrates a first four instances in connection with the example method for triggering a second action, in accordance with examples of the present disclosure;
[0053] FIG. 8 illustrates a second four instances in connection with the example method for triggering a first action, in accordance with examples of the present disclosure;
[0054] FIGs. 9A-9E illustrate a first five instances in connection with the example method for triggering a first action, in accordance with examples of the present disclosure;
[0055] FIGs. 10A-10E illustrate a second five instances in connection with the example method for triggering a first action, in accordance with examples of the present disclosure;
[0056] FIG. 11 is a flowchart of an example method for inserting and removing a virtual bookmark at a currently displayed location of an electronic document, in accordance with examples of the present disclosure;
[0057] FIG. 12 illustrates a foldable touchscreen device providing a first display of an electronic document, in accordance with examples of the present disclosure;
[0058] FIG. 13 illustrates a foldable touchscreen device providing a second display of an electronic document, in accordance with examples of the present disclosure;
[0059] FIG. 14 is a flowchart of an example method of displaying content using dual screen mode, in accordance with examples of the present disclosure;
[0060] FIG. 15 illustrates a foldable touchscreen device displaying content using fullscreen mode, in accordance with examples of the present disclosure;
[0061] FIG. 16 illustrates a foldable touchscreen device displaying content using dual screen mode, according to examples of the present disclosure;
[0062] FIG. 17 is a flowchart of an example method for displaying a defined region containing application shortcuts, in accordance with examples of the present disclosure;
[0063] FIG. 18 is a flowchart of an example method for triggering a first action corresponding to one or more shortcut overlays, in accordance with examples of the present disclosure;
[0064] FIG. 19 illustrates a foldable touchscreen device providing a first display of a video application, in accordance with examples of the present disclosure;
[0065] FIG. 20 illustrates a foldable touchscreen device providing a second display of a video application, in accordance with examples of the present disclosure;
[0066] FIG. 21 is a flowchart of an example method for triggering a first action corresponding to a touch gesture and a location corresponding to a thumb input overlay, in accordance with examples of the present disclosure;
[0067] FIG. 22 illustrates a foldable touchscreen device displaying a video game application, in accordance with examples of the present disclosure;
[0068] FIG. 23 illustrates a foldable touchscreen device displaying a camera application, in accordance with examples of the present disclosure;
[0069] FIG. 24 is a flowchart of an example method of modifying one of one or more user modifiable objects, in accordance with examples of the present disclosure;
[0070] FIG. 25 illustrates a foldable touchscreen device providing a first display of a word processing application, in accordance with examples of the present disclosure;
[0071] FIG. 26 illustrates a foldable touchscreen device providing a second display of a word processing application, in accordance with examples of the present disclosure;
[0072] FIG. 27 illustrates a foldable touchscreen device providing a first display of a design application, in accordance with examples of the present disclosure; and
[0073] FIG. 28 illustrates a foldable touchscreen device providing a second display of a design application, in accordance with examples of the present disclosure.
Similar reference numerals may have been used in different figures to denote similar components.
DETAILED DESCRIPTION
[0074] Embodiments described herein may operate on a variety of foldable touchscreen devices, such as dual screen laptops, foldable laptops, standard laptops, tablets, smart phones, and the like.
[0075] In this disclosure the term “computing system” refers to an electronic device having computing capabilities. Examples of computing systems include but are not limited to: personal computers, laptop computers, tablet computers (“tablets”), smartphones, surface computers, augmented reality gear, and the like.
[0076] In this disclosure, the terms “touchscreen element” and “touchscreen” refer to a combination of a display together with a touch sensing system that is capable of acting as an input device by receiving a touch input. Non-limiting examples of touchscreen displays are: surface capacitive touchscreens and projected capacitive touchscreens.
[0077] In this disclosure, the term “touchscreen device” refers to a computer system having a touchscreen element.
[0078] In this disclosure, the term “application” refers to a software program comprising a set of instructions that can be executed by a processing device of an electronic device.
[0079] FIG. 1 illustrates an example foldable touchscreen device 100, which is an example operating environment of an example embodiment. As shown, the example foldable touchscreen device 100 includes a flexible touchscreen element 140. The flexible touchscreen element 140 may be a foldable touchscreen element that is foldable at a fold region 135. The touchscreen element 140 may be operable to render content and to sense touch thereupon. As noted, the touchscreen element 140 may also be described as a touchscreen 140. The touchscreen element 140 may implement one or more touchscreen technologies. For example, the touchscreen element 140 may be a capacitive touchscreen, (e.g., surface capacitive, projected capacitive, mutual capacitive, self-capacitive), a resistive touchscreen, a surface acoustic wave (SAW) touchscreen, etc.
[0080] The example foldable touchscreen device 100 includes a body 105, which may be formed of plastic, glass, fiber, stainless steel, aluminum, other suitable materials, or a combination of any of these materials. The body 105 encloses multiple components of the example foldable touchscreen device 100 including a processor and a memory. The body 105 is configured to house the flexible touchscreen element 140. The body may be described as comprising a first portion 110 and a second portion 115 connected through a fold edge 120.
[0081] The flexible touchscreen element 140 provides for the folding of the example foldable touchscreen device 100 into various physical configurations including a flat configuration. The flexible touchscreen element 140 can fold about the fold edge 120 so that the first portion 110 and the second portion 115 of the example foldable touchscreen device 100 move towards and/or away from each other. The fold edge 120 may be manufactured from a bendable, flexible material such as flexible polymer.
[0082] The flexible touchscreen element 140 of the example foldable touchscreen device 100 has three touchscreen partitions: a first partition 125, a second partition 130, and a fold region 135 separating the first and second partitions 125, 130. The fold region 135 may be also known as the hinge. As shown, the fold region 135 may be defined by a first border 150 along an edge of the first partition 125 and a second border 155 along an edge of the second partition 130. The first and second borders 150, 155 may not be physical borders, but rather may be logically defined and may be dynamically shifted (or omitted). It should be appreciated that the terms “first” and “second” are not intended to be limiting. Further, the designation of a touchscreen partition as “first” or “second” may be arbitrary; a touchscreen partition that is designated as “first” in one instance may be designated as “second” in another instance, and vice versa. Similarly, the designation of “first” and “second” borders may be arbitrary.
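As a non-limiting illustration of the logically defined fold region described in paragraph [0082], the Kotlin sketch below stores the first and second borders as coordinates that can be shifted at runtime and hit-tests a touch coordinate against them. The field names, coordinate convention, and expansion behaviour are assumptions made for the sketch, not part of the disclosed device.

      // Illustrative sketch only: the fold region borders are stored as logical
      // coordinates that can be shifted at runtime, and touches are hit-tested
      // against them.
      class FoldRegion(var firstBorder: Float, var secondBorder: Float) {
          fun contains(coordinate: Float): Boolean = coordinate in firstBorder..secondBorder

          // Because the borders are logical rather than physical, the region can be
          // widened, for example while a drag gesture is in progress.
          fun expand(by: Float) { firstBorder -= by; secondBorder += by }
      }

      fun main() {
          val fold = FoldRegion(firstBorder = 380f, secondBorder = 420f)
          println(fold.contains(410f))   // true
          fold.expand(20f)
          println(fold.contains(430f))   // true after widening the logical borders
      }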
[0083] It will be understood that when the example foldable touchscreen device 100 is in an unfolded state, which means that the example foldable touchscreen device 100 is a flat device with the fold angle 145 being 180 degrees, the first partition 125, the second partition 130, and the fold region 135 are not distinguishable and form a continuous display.
[0084] Although FIG. 1 illustrates a laptop computer, the example foldable touchscreen device 100 may be a smartphone, a tablet, and/or other similar electronic device. The example foldable touchscreen device 100 may be a type of computing system within the scope of the present disclosure.
[0085] FIG. 2 is a high-level operation diagram of an example computing system 200, in accordance with examples of the present disclosure. In at least some embodiments, the example computing system 200 may be exemplary of the example touchscreen device 100 (FIG. 1) and is not intended to be limiting.
[0086] The example computing system 200 includes a variety of components. For example, as illustrated, the example computing system 200 may include a processor 302, an input/output (I/O) interface 304, a network interface 306, a storage unit 378 and a memory 380. As illustrated, the foregoing example components of the computing system 200 are in communication over a bus 308. The bus 308 is shown providing communication among the components of the computing system 200. The bus 308 may be any suitable bus architecture including, for example, a memory bus, a peripheral bus or a video bus.
[0087] The processor 302 may include one or more processors, such as a microprocessor, a microcontroller, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), dedicated logic circuitry, or combinations thereof.
[0088] The network interface 306 may include one or more network interfaces for wired or wireless communication with a network (e.g., an intranet, the Internet, a peer-to-peer (P2P) network, a wide area network (WAN) and/or a local area network (LAN)) or another node. The network interface 306 may include wired links (e.g., Ethernet cable) and/or wireless links (e.g., one or more antennas) for intra-network and/or inter-network communications.
[0089] The storage unit 378 may be one or more storage units, and may include a mass storage unit such as a solid state drive, a hard disk drive, a magnetic disk drive and/or an optical disk drive.
[0090] The I/O interface 304 may be one or more I/O interfaces, and may enable interfacing with one or more appropriate input devices, such as the touch panel 344, and/or one or more appropriate output devices, such as the touchscreen display 342. The touch panel 344 and the touchscreen display 342 form part of the touchscreen element 140.
[0091] The touch panel 344 may include a variety of touch sensors for sensing touch input, which may depend on the touch sensing modality used by the touchscreen element 140 (e.g., capacitive sensors).
[0092] The computing system 200 may include one or more memories 380, which may include volatile memory (e.g., random access memory (RAM)) and non-volatile or non-transitory memory (e.g., a flash memory, magnetic storage, and/or a read-only memory (ROM)). The non-transitory memory(ies) of the memories 380 store programs that include software instructions for execution by the processor 302, such as to carry out examples described in the present disclosure. In example embodiments, the programs include software instructions for implementing an operating system (OS) and software applications.
[0093] In some examples, the memory 380 may include software instructions of the computing system 200 for execution by the processor 302 to carry out the operations described in this disclosure. In some other examples, one or more data sets and/or modules may be provided by an external memory (e.g., an external drive in wired or wireless communication with the computing system 200) or may be provided by a transitory or non-transitory computer-readable medium. Examples of non-transitory computer readable media include a RAM, a ROM, an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a flash memory, a CD-ROM, or other portable memory storage.
[0094] FIG. 3 depicts a simplified organization of software components that may be stored in memory 380 of the example computing system 200 (FIG. 2). As illustrated, these software components include application software 350 and an operating system (OS) 310.
[0095] The OS 310 is software. The OS 310 allows the application software 350 to access the processor 302, the memory 380, the I/O interface 304, the network interface 306, and the storage unit 378 (FIG. 2). The OS 310 may be, for example, Apple™ iOS™, Android™, Microsoft™ Windows™, Google™ ChromeOS™, or the like.
[0096] The application software 350 adapts the example computing system 200 (FIG. 2), in combination with the OS 310, to operate as a device performing a particular function. For example, the application software 350 may adapt the example computing system 200 (FIG. 2) to perform fold angle determination.
[0097] FIG. 4 is a flowchart of a method 400 for triggering a first action, in accordance with examples of the present disclosure. The method 400 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1). Specifically, the operations 410 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
[0098] At the operation 410, the computing system displays a first object in the first partition of a flexible touchscreen element.
[0099] Returning again to FIG. 4, after the operation 410, the operation 420 is next.
[00100] At the operation 420, the system detects, at a location of the first partition corresponding to the first object, an initialization of a drag gesture.
[00101] At the operation 430, the system detects a path of the drag gesture passing near or through the fold region at a first one of the one or more shortcut overlays.
[00102] At the operation 440, the system triggers a first action corresponding to the first one of the one or more shortcut overlays.
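For illustration only, the following Python sketch shows one way operations 430 and 440 might be realized in software; it is not part of the disclosed embodiments. The names ShortcutOverlay and dispatch_drag_path, the pixel coordinates, and the 12-pixel “near” margin are assumptions made for this sketch.

from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class ShortcutOverlay:
    # Axis-aligned bounding box of the overlay within the fold region, in pixels.
    left: float
    top: float
    right: float
    bottom: float
    action: Callable[[], None]      # action triggered when the drag path crosses the overlay
    margin: float = 12.0            # "near" tolerance around the overlay, in pixels

    def hit(self, x: float, y: float) -> bool:
        # True when the point is inside the overlay or within `margin` of it.
        return (self.left - self.margin <= x <= self.right + self.margin and
                self.top - self.margin <= y <= self.bottom + self.margin)

def dispatch_drag_path(path: List[Tuple[float, float]],
                       overlays: List[ShortcutOverlay]) -> None:
    # Operations 430/440: if any sampled point of the drag path passes near or
    # through an overlay, trigger the corresponding action once.
    for overlay in overlays:
        if any(overlay.hit(x, y) for (x, y) in path):
            overlay.action()
            break

# Example: a hypothetical "copy" overlay in the fold region.
copy_overlay = ShortcutOverlay(100, 590, 220, 650, action=lambda: print("copy triggered"))
dispatch_drag_path([(160, 400), (160, 500), (160, 620), (160, 750)], [copy_overlay])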
[00103] Reference is now made to FIG. 5, which illustrates three instances 502, 504, 506 of a method for triggering a first action, in accordance with examples of the present disclosure. For example, the three instances 502, 504, 506 may illustrate a sequence of operations of a foldable touchscreen device 100 in an example implementation of the method 400 of FIG. 4. Each of the three instances 502, 504, and 506 shows a foldable touchscreen device 100 having a flexible touchscreen element 140 comprising a first partition 125, a second partition 130, and a fold region 135. Four shortcut overlays 508 are displayed at the fold region 135 of each foldable touchscreen device 100 illustrated at instances 502, 504, and 506. It may be appreciated that the first partition 125 and second partition 130 indicated in FIG. 5 may correspond to the second partition 130 and first partition 125 shown in FIG. 1; as previously mentioned, the designation of “first” and “second” may be arbitrary and are not intended to be limiting.

[00104] As illustrated by the instance 502, a first circle 510 is shown displayed at the first partition 125 of the foldable touchscreen device 100. The first circle 510 is a first object.
[00105] As illustrated by the instance 504, the first circle 510 has been selected and dragged along the first partition 125 and approaches the fold region 135. Specifically, the first circle 510 is shown approaching the first shortcut overlay of the four shortcut overlays 508 at the fold region 135. The four shortcut overlays 508 may be described as a dynamic hinge shortcut overlay. The first shortcut overlay of the dynamic hinge shortcut overlay displays the word “copy” and represents a copy function shortcut.
[00106] The fold region 135 (or “hinge”) may act as a natural bridge region between the first and second partitions 125, 130 of a flexible touchscreen element 140 of a foldable touchscreen device 100. Providing a contextual shortcut overlay, such as a dynamic hinge shortcut overlay, as a portal at the hinge may allow users to transform an object (such as the first circle 510) as it is moved between the first and second partitions 125, 130.
[00107] As the first circle 510 is dragged towards the fold region 135, the foldable touchscreen device 100 may detect an initialization of a drag gesture at a location of the first partition 125 corresponding to the first circle 510.
[00108] As illustrated by the instance 506, the first circle 510 has been dragged through the fold region 135 at the first shortcut overlay and is now displayed at the second partition 130. The first shortcut overlay displays the word “copy” and represents a copy function shortcut.
[00109] As the first circle 510 is dragged, the system detects a path of the drag gesture passing near or through a first one of one or more shortcut overlays. As a result, the system triggers a first action corresponding to the first one of the one or more shortcut overlays (in this example, a “copy” function corresponding to the first shortcut overlay).
[00110] A second circle 512 now appears at the first partition 125. The second circle 512 is a second object. The second circle 512 is a copy of the first circle 510.
[00111] In this way, a “copy” function may be completed in a single step by dragging an object towards the fold region 135 (hinge) to access a dynamic hinge shortcut overlay. It should be understood that, although this example illustrates the path of the drag gesture crossing the fold region 135 from the first partition 125 to the second partition 130, this is not intended to be limiting. For example, a path for a drag gesture that starts in the first partition 125, approaches or touches the desired shortcut overlay at the fold region 135, and returns to the first partition 125 (i.e., without crossing into the second partition 130) may be sufficient. The “copy” function may be completed using a Microsoft™ Office application or a whiteboard application, among other possibilities. This is an improvement over contemporary methods requiring multiple steps, such as selecting components and choosing an appropriate command from a pop-up menu. The example of FIG. 5 may be particularly useful when a user is working with an immersive or dual-screen view that utilizes both first and second partitions 125, 130 and needs to frequently copy and paste content between both first and second partitions 125, 130. Different actions may be performed, corresponding to different ones of the one or more shortcut overlays, depending on which shortcut overlay has been crossed or approached by the path of the drag gesture. Other actions may include various actions that may be performed on a selectable object, for example a resize action, a delete action, etc., and may depend on the software application being executed.
[00112] Returning again to FIG. 4, in some embodiments, the one or more shortcut overlays may be displayed at the fold region prior to the detection of the initialization of the drag gesture at the operation 420. In some embodiments, the one or more shortcut overlays may be displayed as a result of the system detecting the initialization of the drag gesture, or as a result of the system detecting, at a location of the first partition within a proximity of the fold region, the drag gesture.
[00113] Following the triggering of the first action at the operation 440, the system may detect a cessation of the drag gesture. In some embodiments, in response to detecting the cessation of the drag gesture, the system may remove the one or more shortcut overlays from display in the fold region.
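As a further non-limiting illustration, the display and removal of the shortcut overlays described in paragraphs [00112] and [00113] could be managed by a small state object such as the hypothetical OverlayLifecycle sketched below in Python; the proximity threshold and the print statements standing in for platform-specific display calls are assumptions.

class OverlayLifecycle:
    """Shows shortcut overlays while a drag gesture is active near the fold
    region and removes them when the drag gesture ceases."""

    def __init__(self, fold_top: float, fold_bottom: float, proximity: float = 80.0):
        self.fold_top = fold_top          # upper edge of the fold region (pixels)
        self.fold_bottom = fold_bottom    # lower edge of the fold region (pixels)
        self.proximity = proximity        # distance at which overlays appear
        self.overlays_visible = False

    def on_drag_move(self, y: float) -> None:
        # Show the overlays once the drag comes within `proximity` of the fold region.
        near = (self.fold_top - self.proximity) <= y <= (self.fold_bottom + self.proximity)
        if near and not self.overlays_visible:
            self.overlays_visible = True
            print("display shortcut overlays at fold region")

    def on_drag_end(self) -> None:
        # On cessation of the drag gesture, remove the overlays from the fold region.
        if self.overlays_visible:
            self.overlays_visible = False
            print("remove shortcut overlays from fold region")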
[00114] FIG. 6 is a flowchart of an example method 600 for triggering a second action, in accordance with examples of the present disclosure. The method 600 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1). Specifically, the operations 410 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.

[00115] As shown, the method 600 of FIG. 6 comprises six operations 410, 420, 430, 440, 610, 620. The first four operations 410, 420, 430, 440 of the method 600 of FIG. 6 are the same as the operations 410, 420, 430, 440 of the method 400 of FIG. 4.
[00116] At the operation 410, the computing system displays a first object in the first partition of a flexible touchscreen element.
[00117] At the operation 420, the system detects, at a location of the first partition corresponding to the first object, an initialization of a drag gesture.
[00118] At the operation 430, the system detects a path of the drag gesture passing near or through the fold region at a first one of the one or more shortcut overlays.
[00119] At the operation 440, the system triggers a first action corresponding to the first one of the one or more shortcut overlays.
[00120] At the operation 610, the system detects the drag gesture crossing the fold region at a second one of the one or more shortcut overlays.
[00121] At the operation 620, the system triggers a second action corresponding to the second one of the one or more shortcut overlays.
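For illustration only, operations 610 and 620 could extend the earlier sketch so that a single drag gesture triggers an action for each overlay it crosses, in the order crossed. The function and overlay names below are hypothetical, and the example assumes the drag path is available as an ordered list of sampled touch points.

def dispatch_multi_crossings(path, overlays):
    # path: ordered list of (x, y) drag samples.
    # overlays: list of (name, hit_test, action) tuples.
    # Trigger each overlay's action once per new crossing, in the order crossed.
    triggered = []
    inside_prev = {name: False for (name, _, _) in overlays}
    for (x, y) in path:
        for (name, hit_test, action) in overlays:
            inside_now = hit_test(x, y)
            if inside_now and not inside_prev[name]:
                action()                      # operations 440 and 620
                triggered.append(name)
            inside_prev[name] = inside_now
    return triggered

# Example: the drag first crosses a "copy" overlay, then a "recolor" overlay.
in_copy = lambda x, y: 580 <= y <= 650 and x < 300
in_recolor = lambda x, y: 580 <= y <= 650 and x >= 300
overlays = [("copy", in_copy, lambda: print("copy")),
            ("recolor", in_recolor, lambda: print("recolor"))]
print(dispatch_multi_crossings([(100, 400), (100, 600), (100, 800), (350, 600)], overlays))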
[00122] Reference is now made to FIG. 7, which illustrates a first four instances 502, 504, 506, 702 in connection with a method for triggering a second action, in accordance with examples of the present disclosure. For example, the four instances 502, 504, 506, 702 may illustrate a sequence of operations of a foldable touchscreen device 100 in an example implementation of the method 600 of FIG. 6. In this example, the first three instances 502, 504, 506 of FIG. 7 are the same as the three instances 502, 504, 506 of FIG. 5. It may be appreciated that the first partition 125 and second partition 130 indicated in FIG. 7 may correspond to the second partition 130 and first partition 125 shown in FIG. 1; as previously mentioned, the designation of “first” and “second” may be arbitrary and are not intended to be limiting.

[00123] As illustrated by the instance 502, a first circle 510 is shown displayed at the first partition 125 of the foldable touchscreen device 100. The first circle 510 is a first object.
[00124] As illustrated by the instance 504, the first circle 510 has been selected and dragged along the first partition 125 and approaches the fold region 135. Specifically, the first circle 510 is shown approaching the first shortcut overlay of the four shortcut overlays 508 at the fold region 135. The four shortcut overlays 508 may be described as a dynamic hinge shortcut overlay. The first shortcut overlay of the dynamic hinge shortcut overlay displays the word “copy” and represents a copy function shortcut.
[00125] The fold region 135 (or “hinge”) may act as a natural bridge region between the first and second partitions 125, 130 of a flexible touchscreen element 140 of a foldable touchscreen device 100. Providing a contextual shortcut overlay, such as a dynamic hinge shortcut overlay, as a portal at the hinge may allow users to transform an object (such as the first circle 510) as it is moved between the first and second partitions 125, 130.
[00126] As the first circle 510 is dragged towards the fold region 135, the foldable touchscreen device 100 may detect an initialization of a drag gesture at a location of the first partition 125 corresponding to the first circle 510.
[00127] As illustrated by the instance 506, the first circle 510 has been dragged through the fold region 135 at the first shortcut overlay and is now displayed at the second partition 130. The first shortcut overlay displays the word “copy” and represents a copy function shortcut.
[00128] As the first circle 510 is dragged, the system detects a path of the drag gesture passing near or through a first one of one or more shortcut overlays. As a result, the system triggers a first action corresponding to the first one of the one or more shortcut overlays (in this example, a “copy” function corresponding to the first shortcut overlay).
[00129] A second circle 512 now appears at the first partition 125. The second circle 512 is a second object. The second circle 512 is a copy of the first circle 510.
[00130] Further, as illustrated by the instance 702, the first circle 510 has passed through the fold region 135 from the second partition 130 back to the first partition 125 and is now displayed at the first partition 125. The second circle 512 remains displayed at the first partition 125. The first circle 510 has passed through the fold region 135 at a second one of the four shortcut overlays 508. The second shortcut overlay corresponds to a changing color action. As a result of passing near or through the fold region 135 at the second one of the four shortcut overlays 508, a second action has been triggered; in this example, the color of the first circle 510 has been changed.
[00131] It should also be understood that the path of the drag gesture may pass through or near any number (e.g., more than two) of shortcut overlays displayed in the fold region 135. For example, operations 610-620 of the method 600 of FIG. 6 may be repeated multiple times. In this way, dragging an object (such as the first circle 510 and/or the second circle 512) through or near the fold region 135 may provide for the execution of various commands. The various commands may be used in combination by dragging the object through or near the fold region 135 multiple times (corresponding to multiple shortcut overlays) within the same drag gesture. A repetitive hinge crossing gesture may execute a combination of commands to trigger actions. Examples of commands which may be executed in this manner may include “save” and “full screen”.
[00132] A repetitive hinge crossing gesture may benefit a user working in a Microsoft Office application, a design application, a whiteboard application or other such applications on a foldable touchscreen device. For example, when creating an object (e.g., a chart) with repeated elements, the user may want to quickly duplicate content and assign new attributes such as color and position. As a further example, a user may drag a desired object through or near the fold region and perform a specific gesture to duplicate the object and assign new attributes in a single step, streamlining the process of creating objects (e.g., charts) with repeated elements, and improving the efficiency and convenience of the task.
[00133] FIG. 8 illustrates a second four instances 802, 804, 806, 808 in connection with a method for triggering a first action, in accordance with examples of the present disclosure. For example, the four instances 802, 804, 806, 808 may illustrate different operations of a foldable touchscreen device 100 in some example implementations of the method 400 of FIG. 4 or the method 600 of FIG. 6. Each of the four instances 802, 804, 806, 808 shows a foldable touchscreen device 100 having a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135. In this example, the triggered action may be related to windows management. A windows management overlay 810 is displayed at the fold region 135 of each foldable touchscreen device 100 illustrated at the four instances 802, 804, 806, 808. The windows management overlay 810 comprises four shortcut overlays.
[00134] The four instances 802, 804, 806, 808 of FIG. 8 relate to use of the fold region 135 as a portal for system windows management assistance on a foldable touchscreen device 100.
[00135] As illustrated by the instance 802, a first window 812 (i.e., the first object) is shown displayed at the first partition 125 of the flexible touchscreen element 140 of the foldable touchscreen device 100.
[00136] As illustrated by the instance 804, the first window 812 has been dragged along the first partition 125 and through the first windows management shortcut overlay of the windows management overlay 810 in the fold region 135. The windows management overlay 810 may be described as a dynamic hinge shortcut overlay. Each of the four shortcut overlays of the windows management overlay 810 may correspond to a different windows layout action. The first shortcut overlay of the dynamic hinge shortcut overlay represents an “expand” function shortcut. The first window 812 is shown expanded such that it occupies the entirety of the second partition 130.
[00137] In this way, an expand action may be completed in a single step by dragging an object towards the fold region 135 (hinge) to access a windows management shortcut overlay.
[00138] The third instance 806 illustrates the first window 812 after being dragged along the first partition 125 and through the second shortcut overlay of the windows management overlay 810 in the fold region 135. The second shortcut overlay of the dynamic hinge shortcut overlay represents a “semi-expand” action shortcut. The first window 812 is shown semi-expanded such that it occupies the leftmost half of the second partition 130.
[00139] In this way, a semi-expand action may be completed in a single step by dragging an object (such as the first window 812) towards the fold region 135 (hinge) to access the windows management overlay 810.

[00140] The fourth instance 808 illustrates the first window 812 after being dragged along the first partition 125 and through the third shortcut overlay of the windows management overlay 810 in the fold region 135. The third shortcut overlay of the dynamic hinge shortcut overlay represents a “close” action shortcut. The first window 812 is shown as having been closed.
[00141] In this way, a “close” action may be completed in a single step by dragging an object towards the fold region (hinge) to access a corresponding windows management shortcut overlay.
[00142] As illustrated, the fold region 135 may be used as a portal for system window management assistance on a foldable touchscreen device 100. By dragging a window towards the hinge, a user may access a window management overlay. The windows management overlay may allow the user to easily rearrange the layout of windows on the device by dragging one or more windows across various layout commands of the windows management overlay.
[00143] The hinge portal provides a natural transition for managing windows across screens, addressing a common issue with foldable personal computers (PCs), which may encounter difficulty in moving windows across the hinge. The interactions described with respect to FIG. 8 may streamline the process of organizing windows and improve the overall efficiency of windows management on foldable touchscreen devices. Further, it should be understood that although FIG. 8 illustrates examples where a first action is triggered by dragging the window, in some examples the drag gesture may further pass through or near one or more additional shortcut overlays in the fold region, to cause triggering of one or more additional actions to be performed.
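As one non-limiting illustration of the window management interactions of FIG. 8, the Python sketch below maps hypothetical overlay identifiers to layout functions applied to a window record; the identifiers, the dictionary-based window representation, and the leftmost-half geometry are assumptions made only for this sketch.

def expand(window, partition):
    # Occupy the entire target partition.
    window["bounds"] = dict(partition)

def semi_expand(window, partition):
    # Occupy the leftmost half of the target partition.
    half = dict(partition)
    half["width"] = partition["width"] / 2
    window["bounds"] = half

def close(window, partition):
    window["state"] = "closed"

WINDOW_ACTIONS = {"expand": expand, "semi_expand": semi_expand, "close": close}

def apply_window_overlay(overlay_id, window, target_partition):
    # Called when the drag path for `window` crosses the overlay `overlay_id`.
    WINDOW_ACTIONS[overlay_id](window, target_partition)

# Example: dragging a window through the "semi_expand" overlay.
second_partition = {"x": 0, "y": 660, "width": 1200, "height": 640}
window_812 = {"state": "open", "bounds": {"x": 100, "y": 100, "width": 400, "height": 300}}
apply_window_overlay("semi_expand", window_812, second_partition)
print(window_812["bounds"])   # leftmost half of the second partition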
[00144] FIGs. 9A-9E illustrate a first five instances 902, 904, 906, 908, 910 in connection with an example method for triggering a first action, in accordance with examples of the present disclosure. For example, the five instances 902, 904, 906, 908, 910 may illustrate operations of a foldable touchscreen device 100 in some example implementations of the method 400 of FIG. 4 or the method 600 of FIG. 6. Each of the five instances 902, 904, 906, 908, 910 shows a foldable touchscreen device 100 having a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135. The foldable touchscreen device 100 is shown in a “book mode” orientation in which the fold region 135 is oriented vertically instead of horizontally. The flexible touchscreen element 140 is shown displaying online content via a web browser application being executed on the foldable touchscreen device 100. In this example, the triggered action may be related to browsing an online website.
[00145] FIG. 9A illustrates instance 902, which shows the display of a list view of objects at the first and second partitions 125, 130. Each of the objects is represented by a displayed image and may represent a hyperlink.
[00146] FIG. 9B illustrates instance 904, which shows the display of a list view of objects at the first and second partitions 125, 130. Instance 904 additionally illustrates the finger of a user hand 912 initiating a drag operation beginning at a first one of the objects at the first partition 125. Three shortcut overlays 914 are displayed at the fold region 135. The three shortcut overlays 914 may correspond to shortcuts for an “App Launcher” feature and/or a “Parallel View” feature. An “App Launcher” feature and/or a “Parallel View” feature may allow a user to browse a list view of objects at the second partition 130, while opening detailed views of one or more of the objects at the first partition 125, among other functions. The three shortcut overlays 914 may correspond to actions for opening detailed views of one or more selected objects. The display of the three shortcut overlays 914 may result from a detection of a drag gesture at a location of the first partition 125 within a proximity of the fold region 135. In other examples, the shortcut overlays 914 may be displayed prior to the detection of the drag gesture, or may be displayed after the drag gesture has been initiated but prior to the drag gesture being in proximity of the fold region 135. The path of the drag operation is shown extending from the first one of the objects at the first partition 125 toward a first shortcut overlay of the three shortcut overlays 914 at the fold region 135.
[00147] FIG. 9C illustrates instance 906, which shows the foldable touchscreen device 100 after the execution of the action corresponding to the first shortcut overlay. The second partition 130 displays a list view of objects. Each of the objects is represented by a displayed image and may represent a hyperlink. The first partition 125 displays a detailed view corresponding to the first one of the objects.
[00148] FIG. 9D illustrates instance 908, which shows the display of a list view of objects at the second partition 130, and illustrates the display of a detailed view of three objects at the first partition 125. Instance 908 additionally illustrates the finger of the user hand 912 initiating a drag operation beginning at a first one of the objects at the first partition 125. The fold region 135 displays four shortcut overlays 916. The four shortcut overlays 916 may correspond to browser actions such as “unlink”, “close”, “flip”, and/or “go back”. The display of the four shortcut overlays 916 may result from a detection of a drag gesture at a location of the first partition 125 within a proximity of the fold region 135. In other examples, the shortcut overlays 916 may be displayed prior to the detection of the drag gesture, or may be displayed after the drag gesture has been initiated but prior to the drag gesture being in proximity of the fold region 135. The path of the drag operation is shown extending from the first one of the objects at the first partition 125 toward the fourth shortcut overlay of the four shortcut overlays 916 at the fold region 135.
[00149] The fourth shortcut overlay of the four shortcut overlays 916 corresponds to a “close” action.
[00150] FIG. 9E illustrates instance 910, which shows the foldable touchscreen device 100 after the execution of the “close” action. The first partition 125 displays a detailed view of two objects, the detailed view of the first one of the objects having been removed as a result of the “close” action. The second partition 130 displays a list view of objects. The list view of objects includes the first one of the objects. The four shortcut overlays 916 have been removed from the fold region 135, for example in response to detecting cessation of the drag gesture. In other examples, the shortcut overlays 916 may be displayed in the fold region 135 even in absence of a detected drag gesture.
[00151] As illustrated by FIGs. 9A-9E, the hinge may be used as a portal to optimize a “Parallel View” and/or “app launcher” feature of a browser application. In accordance with examples of the present disclosure, “Parallel View” and/or “app Launcher” features may be extended to foldable touchscreen devices, allowing users to easily manage multiple sub-pages at the same time.
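For illustration only, the list-and-detail behaviour of FIGs. 9A-9E could be modelled as in the Python sketch below, in which dragging a list item onto a hypothetical "open_detail" overlay opens its detailed view in the opposite partition and a "close" overlay removes it; the class and overlay names are assumptions.

class ParallelViewController:
    """Keeps a list view on one partition and detailed views on the other."""

    def __init__(self):
        self.open_details = []          # objects whose detailed views are shown

    def on_overlay_crossed(self, overlay_id, obj):
        if overlay_id == "open_detail" and obj not in self.open_details:
            self.open_details.append(obj)       # show detail in the other partition
        elif overlay_id == "close" and obj in self.open_details:
            self.open_details.remove(obj)       # remove the detailed view

controller = ParallelViewController()
controller.on_overlay_crossed("open_detail", "article-1")
controller.on_overlay_crossed("open_detail", "article-2")
controller.on_overlay_crossed("close", "article-1")
print(controller.open_details)   # ['article-2']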
[00152] FIGs. 10A-10E illustrate a second five instances 1002, 1004, 1006, 1008, 1010 in connection with a method for triggering a first action, in accordance with examples of the present disclosure. For example, the five instances 1002, 1004, 1006, 1008, 1010 may illustrate a sequence of operations of a foldable touchscreen device 100 in an example implementation of the method 400 of FIG. 4 or the method 600 of FIG. 6. Each of the five instances 1002, 1004, 1006, 1008, 1010 shows a foldable touchscreen device 100 having a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135. In this example, the triggered action may be related to management of layers in, for example, a design application.
[00153] As illustrated by the instance 1002 shown in FIG. 10A, a first ordering of layers 1012 is displayed at the first partition 125 of the flexible touchscreen element 140 of the foldable touchscreen device 100. A finger of a user hand 912 is shown touching a first layer of the ordering of layers 1012. The first layer is the frontmost layer of the ordering of layers 1012. The first layer is an object. The finger of the user hand 912 is shown initializing a drag gesture of the first layer towards the fold region 135.
[00154] As illustrated by the second instance 1004 shown in FIG. 10B, as a result of a detection of the drag gesture at a location of the first partition 125 within a proximity of the fold region 135, one or more shortcut overlays 1014 are displayed at the fold region 135. The finger of the user hand 912 is shown touching a dragged copy of the first layer at the location of one of the one or more shortcut overlays 1014. The one of the one or more shortcut overlays 1014 corresponds to a “Layer” action shortcut.
[00155] As illustrated by the third instance 1006 shown in FIG. 10C, the finger of the user hand 912 is shown continuing the drag gesture along the fold region 135 at the “Layer” action shortcut to choose and effect an adjustment of the ordering of layers 1012.
[00156] As illustrated by the instance 1008 shown in FIG. 10D, the first layer is shown displayed at the fold region 135 at the location of the finger of the user hand 912 performing the drag gesture. The first layer is shown being dragged towards the remaining layers of the first ordering of layers 1012.
[00157] As illustrated by the instance 1010 shown in FIG. 10E, the first layer of the first ordering of layers 1012 is shown being dragged to the backmost position of the remaining layers of the first ordering of layers 1012. The finger of the user hand 912 is shown at the location of the first layer.
[00158] As shown by FIGs. 10A-10E, examples of the present disclosure may provide the ability to adjust the position/ordering of layers and other objects in a design application (e.g., a Powerpoint™ application). Examples of the present disclosure may avoid a requirement to use a combination of keyboard shortcut keys, right-click menus, and/or ribbon menus to perform such actions, which may be time-consuming and inconvenient. Examples of the present disclosure may allow a user to simply drag an object to a layer command at the fold region and then slide to adjust a layer order, then relocate the object.
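As a purely illustrative sketch of the layer-ordering interaction of FIGs. 10A-10E, a continuous slide along the fold region could be mapped to an insertion index in the layer list, as below; the linear mapping and the function names are assumptions rather than requirements of this disclosure.

def slide_to_index(slide_pos, fold_length, layer_count):
    # Map a position along the fold region (0..fold_length) to a layer index (0..layer_count-1).
    fraction = max(0.0, min(1.0, slide_pos / fold_length))
    return min(layer_count - 1, int(fraction * layer_count))

def reorder_layer(layers, dragged, slide_pos, fold_length):
    # Remove the dragged layer and reinsert it at the index chosen by the slide gesture.
    layers = [layer for layer in layers if layer != dragged]
    index = slide_to_index(slide_pos, fold_length, len(layers) + 1)
    layers.insert(index, dragged)
    return layers

# Example: slide the frontmost layer to the back of a four-layer ordering.
ordering = ["front", "second", "third", "back"]
print(reorder_layer(ordering, "front", slide_pos=950, fold_length=1000))
# ['second', 'third', 'back', 'front']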
[00159] FIG. 11 is a flowchart of an example method 1100 for inserting and removing a virtual bookmark at a currently displayed location of an electronic document, in accordance with examples of the present disclosure. The method 1100 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1). Specifically, the operations 1110 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
[00160] At the operation 1110, the system detects a drag gesture in a first direction along the fold region. The first direction may be substantially parallel to a longitudinal axis of the fold region.
[00161] At the operation 1120, the system inserts a virtual bookmark at a currently displayed location of the electronic document.
[00162] At the operation 1130, the system removes the virtual bookmark from the currently displayed location of the electronic document.
[00163] Reference is made to FIG. 12, which illustrates a foldable touchscreen device 100 providing a first display of an electronic document 1204, in accordance with examples of the present disclosure. FIG. 12 may illustrate an example partial implementation of the method 1100 of FIG. 11. The foldable touchscreen device has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135. The foldable touchscreen device 100 is shown in “book mode” orientation. A longitudinal axis 1202 of the fold region 135 is shown extending through the illustration of the foldable touchscreen device 100. The flexible touchscreen element 140 is shown displaying an electronic document 1204 (e.g., an electronic book). A first page of the electronic document 1204 is shown displayed at the first partition 125 and a second page of the electronic document 1204 is shown displayed at the second partition 130. A finger of a user hand 912 is shown initiating a drag gesture at the fold region 135. The finger of the user hand 912 is shown touching the fold region 135 near the top of the flexible touchscreen element 140, and a perforated downward facing arrow 1206 is displayed to indicate a downward direction of the drag gesture.
[00164] Thus, a drag gesture is shown as being initiated in a first direction along the fold region 135. As shown, the first direction is substantially parallel to the longitudinal axis 1202 of the fold region 135. As a result of the drag gesture being detected substantially parallel to the longitudinal axis 1202 of the fold region 135, and during displaying of the electronic document 1204, a virtual bookmark may be inserted at the currently displayed position of the electronic document 1204.
[00165] The use of a drag gesture to insert a bookmark in an electronic document may be intuitive and natural for a user, and may provide a seamless and cohesive experience.
[00166] In a similar manner, a bookmark may be removed from the electronic document. The system may, in response to detecting a drag gesture in a second direction along the fold region substantially opposite the first direction, remove the virtual bookmark.
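For illustration only, the direction test of the method 1100 could be implemented as in the Python sketch below, in which a drag whose direction is substantially parallel to the longitudinal axis of the fold region inserts a virtual bookmark, and a drag in the substantially opposite direction removes it; the 30-degree tolerance and the function name are assumptions.

import math

def classify_fold_drag(start, end, axis_angle_deg=90.0, tolerance_deg=30.0):
    # Returns "insert", "remove", or None for a drag from `start` to `end` (x, y pixels).
    # axis_angle_deg is the orientation of the fold region's longitudinal axis
    # (90 degrees = vertical, with y increasing downward, as in the book-mode view of FIG. 12).
    dx, dy = end[0] - start[0], end[1] - start[1]
    drag_angle = math.degrees(math.atan2(dy, dx)) % 360.0
    along = (drag_angle - axis_angle_deg) % 360.0
    if min(along, 360.0 - along) <= tolerance_deg:
        return "insert"                 # first direction (e.g., downward in FIG. 12)
    against = (drag_angle - (axis_angle_deg + 180.0)) % 360.0
    if min(against, 360.0 - against) <= tolerance_deg:
        return "remove"                 # substantially opposite direction (FIG. 13)
    return None

bookmarks = set()
gesture = classify_fold_drag((600, 50), (600, 400))      # downward drag along a vertical fold
if gesture == "insert":
    bookmarks.add("page-42")                              # hypothetical current location
elif gesture == "remove":
    bookmarks.discard("page-42")
print(bookmarks)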
[00167] Reference is now made to FIG. 13, which illustrates a foldable touchscreen device 100 providing a second display of an electronic document 1204, in accordance with examples of the present disclosure. FIG. 13 may illustrate an example partial implementation of the method 1100 of FIG. 11. The foldable electronic device has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135. The foldable touchscreen device 100 is shown in a “book mode” orientation. A longitudinal axis 1202 of the fold region 135 is shown extending through the illustration of the foldable touchscreen device 100. The flexible touchscreen element 140 is shown displaying an electronic document 1204. A first page of the electronic document 1204 is shown displayed at the first partition 125 and a second page of the electronic document 1204 is shown displayed at the second partition 130. A virtual bookmark 1302 is displayed along an upper portion of the fold region 135. For example, the virtual bookmark 1302 may have been inserted as a result of the operations illustrated in FIG. 12. A finger of a user hand 912 is shown initiating a drag gesture at the fold region 135. The finger of the user hand 912 is shown touching the fold region 135 near the bottom of the virtual bookmark 1302, and a perforated upward facing arrow 1304 is displayed to indicate an upward direction of the drag gesture.

[00168] Thus, a drag gesture is shown as being initiated in a second direction, substantially opposite to the first direction, along the fold region 135. As shown, the second direction is substantially parallel to the longitudinal axis 1202 of the fold region 135.
[00169] Following the execution of the drag gesture, the system may remove the virtual bookmark 1302.
[00170] FIG. 14 is a flowchart of an example method 1400 of displaying content using dual screen mode, in accordance with examples of the present disclosure. The method 1400 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1). Specifically, the operations 1410 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
[00171] At the operation 1410, the system displays content using full screen mode.
[00172] At the operation 1420, the system detects a drag gesture in a first direction along the fold region.
[00173] At the operation 1430, the system displays the content using dual screen mode.
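The following is only an illustrative sketch of the method 1400, assuming a simple display-mode state held by the system; the enumeration values and the gesture callback name are hypothetical.

from enum import Enum

class DisplayMode(Enum):
    FULL_SCREEN = 1
    DUAL_SCREEN = 2

class DisplayModeController:
    def __init__(self):
        self.mode = DisplayMode.FULL_SCREEN     # operation 1410: content shown full screen

    def on_fold_region_drag(self):
        # Operations 1420/1430: a drag along the fold region switches to dual screen mode.
        if self.mode is DisplayMode.FULL_SCREEN:
            self.mode = DisplayMode.DUAL_SCREEN

controller = DisplayModeController()
controller.on_fold_region_drag()
print(controller.mode)    # DisplayMode.DUAL_SCREEN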
[00174] Reference is now made to FIGs. 15 and 16, which illustrate an example implementation of the method 1400 of FIG. 14. FIG. 15 illustrates a foldable touchscreen device 100 displaying content using immersive fullscreen mode, in accordance with examples of the present disclosure. The foldable touchscreen device 100 has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135. The flexible touchscreen element 140 is shown displaying content in full screen mode. Two fingers of a user hand 912 are shown initiating a drag gesture at the fold region 135. A perforated line 1502 having an arrow is displayed at the fold region 135 indicating the direction of the drag gesture.
[00175] FIG. 16 shows that, as a result of the detected drag gesture performed in FIG. 15, content is now displayed on the flexible touchscreen element 140 in dual screen mode.
[00176] FIG. 17 is a flowchart of an example method 1700 for displaying a defined region containing application shortcuts, in accordance with examples of the present disclosure. The method 1700 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1). Specifically, the operations 1710 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
[00177] At the operation 1710, the system detects a first touch gesture at the fold region.
The first touch gesture may be a multi-finger gesture, such as a two-finger tap.
[00178] At the operation 1720, the system displays a defined region containing application shortcuts. For example, in response to detecting the first touch gesture at the fold region, a defined region containing application shortcuts, such as an “app dock” may be triggered to appear at the fold region, allowing the user to easily open applications on either screen.
[00179] In this way, a touch gesture may provide a convenient way for users to access and launch applications without having to navigate through multiple menus or screens. This example may be particularly useful in foldable touchscreen devices that offer multitasking capabilities, such as foldable laptops or tablets. By providing a simple and intuitive gesture for activating an app dock, users may easily and quickly access the applications they need, improving their productivity and efficiency.
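By way of non-limiting example, a two-finger tap at the fold region (operations 1710 and 1720) could be recognized as in the Python sketch below; the timing and travel thresholds, the touch-record fields, and the print statement standing in for displaying the app dock are all assumptions.

def is_two_finger_tap(touches, fold_top, fold_bottom,
                      max_duration_s=0.3, max_travel_px=20.0):
    # touches: list of dicts with keys "x", "y", "duration", "travel" for each finger.
    # Recognize a tap when exactly two fingers land inside the fold region,
    # lift quickly, and barely move.
    if len(touches) != 2:
        return False
    return all(fold_top <= t["y"] <= fold_bottom and
               t["duration"] <= max_duration_s and
               t["travel"] <= max_travel_px
               for t in touches)

def on_touch_sequence(touches, fold_top, fold_bottom):
    # Operations 1710/1720: show the app dock when the gesture is recognized.
    if is_two_finger_tap(touches, fold_top, fold_bottom):
        print("display app dock at fold region")

on_touch_sequence([{"x": 300, "y": 610, "duration": 0.15, "travel": 4.0},
                   {"x": 360, "y": 620, "duration": 0.16, "travel": 5.0}],
                  fold_top=580, fold_bottom=660)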
[00180] FIG. 18 is a flowchart of an example method 1800 for triggering a first action corresponding to one or more application shortcuts. The method 1800 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1). Specifically, the operations 1810 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
[00181] At the operation 1810, the system displays, at the fold region, one or more application control overlays. The application control overlay(s) may provide for control of a currently executed application.
[00182] At the operation 1820, the system detects a touch gesture at a location of the fold region corresponding to the one or more application control overlays. The touch gesture may be a multi-finger gesture.

[00183] At the operation 1830, the system triggers a first action corresponding to the one or more application control overlays.
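For illustration only, operations 1820 and 1830 might be dispatched as in the sketch below, in which a touch inside a control overlay adjusts the corresponding application parameter; the step size, the convention that the right half of an overlay increases the value, and the names are all assumptions made for this sketch.

def handle_control_touch(x, y, overlays, state, step=0.05):
    # overlays: list of dicts with "name" (e.g., "volume", "brightness") and pixel bounds.
    # state: dict of current parameter values in the range [0.0, 1.0].
    for overlay in overlays:
        if overlay["left"] <= x <= overlay["right"] and overlay["top"] <= y <= overlay["bottom"]:
            midpoint = (overlay["left"] + overlay["right"]) / 2
            delta = step if x >= midpoint else -step     # assumed convention: right half increases
            name = overlay["name"]
            state[name] = max(0.0, min(1.0, state[name] + delta))
            return name, state[name]
    return None

overlays = [{"name": "brightness", "left": 100, "right": 300, "top": 600, "bottom": 660},
            {"name": "volume", "left": 320, "right": 520, "top": 600, "bottom": 660}]
state = {"brightness": 0.5, "volume": 0.5}
print(handle_control_touch(480, 630, overlays, state))   # ('volume', 0.55)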
[00184] Reference is now made to FIG. 19, which illustrates a foldable touchscreen device providing a first display of a video application, in accordance with examples of the present disclosure. FIG. 19 illustrates an example implementation of the method 1800 of FIG. 18. The foldable touchscreen device 100 has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135. The flexible touchscreen element 140 is shown displaying content in dual screen mode, and the first partition 125 is shown displaying a currently executed video player application. First and second application control overlays 1902, 1904 are displayed at the fold region 135. The first application control overlay 1902 is a brightness control overlay and the second application control overlay 1904 is a volume control overlay. A finger of a user hand 912 is shown touching the fold region 135 at a location of the display of the second application control overlay 1904.
[00185] For example, as a result of detecting a touch gesture at a location of the fold region 135 corresponding to the second application control overlay 1904, the application may increase, or decrease, the volume of the video player application. Similarly, as a result of detecting a touch gesture, which may be a multi-finger gesture, at a location of the fold region 135 corresponding to the first application control overlay 1902, the application may increase, or decrease, the brightness of the video player application.
[00186] The method 1800 of FIG. 18 may be performed when the foldable touchscreen device is in tent mode. FIG. 20 illustrates a foldable touchscreen device 100 providing a second display of a video application, in accordance with examples of the present disclosure. The foldable touchscreen device 100 has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135. The foldable touchscreen device 100 is shown in tent mode orientation. The first partition 125 and the fold region 135 of the flexible touchscreen element are visible, and the foldable touchscreen device 100 is shown executing a video player application. First and second application control overlays 2002, 2004 are displayed at the fold region 135, which, due to the tent mode orientation, is at the top of the foldable touchscreen device 100. The first application control overlay 2002 is a brightness control overlay and the second application control overlay 2004 is a volume control overlay. A finger of a user hand 912 is shown touching the fold region 135 at a location of the display of the second application control overlay 2004. Touch gestures may be detected in the fold region 135 to trigger application control actions corresponding to the application control overlays 2002, 2004 in a similar manner to that described above with respect to FIG. 19.
[00187] FIG. 21 is a flowchart of an example method 2100 for triggering a first action corresponding to a touch gesture and a location corresponding to a thumb input overlay, in accordance with examples of the present disclosure. The method 2100 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1). Specifically, the operations 2110 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
[00188] At the operation 2110, the system detects a grip posture of the foldable touchscreen device.
[00189] A grip posture may refer to a position of a user’s hands with respect to a partially folded touchscreen device 100 such that the user’s thumbs are positioned near the fold region 135 and one or more of the remaining fingers are beneath the device 100, as shown for example, in FIGs. 22 and 23.
[00190] A grip posture may be detected using one or more of a variety of methods, including capacitive touchscreen sensing with or without use of internal inertial measurement unit (IMU) sensors and/or a camera. For example, capacitive touchscreen sensing may detect a user’s thumbs near the fold region 135, and/or the palms of two hands at corresponding positions of the first partition 125 of the flexible touchscreen element 140. Additionally or alternatively, an internal IMU may detect both a partially folded posture of the touchscreen device 100, and the initialization and maintenance of the second partition 130 of the flexible touchscreen element 140 in a raised position.
[00191] At the operation 2120, the system displays, at the fold region, a thumb input overlay.
[00192] At the operation 2130, the system detects a touch gesture at a location of the fold region corresponding to the thumb input overlay.

[00193] At the operation 2140, the system triggers a first action corresponding to the touch gesture and the location corresponding to the thumb input overlay.
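As a non-limiting illustration of the grip-posture heuristic described in paragraph [00190] and of operation 2110, the Python sketch below combines classified touch contacts near the fold region with an IMU-reported fold angle; every threshold, field name, and the contact classification itself are assumptions, not requirements of this disclosure.

def grip_posture_detected(contacts, fold_angle_deg,
                          fold_top=580.0, fold_bottom=660.0,
                          thumb_margin=60.0,
                          min_angle=60.0, max_angle=160.0):
    # contacts: list of dicts with "y" (pixels) and "kind" ("thumb" or "palm"),
    # as classified by a hypothetical capacitive touch controller.
    # fold_angle_deg: partially folded posture reported by the internal IMU.
    thumbs_near_fold = sum(
        1 for c in contacts
        if c["kind"] == "thumb"
        and (fold_top - thumb_margin) <= c["y"] <= (fold_bottom + thumb_margin))
    palms_present = any(c["kind"] == "palm" for c in contacts)
    partially_folded = min_angle <= fold_angle_deg <= max_angle
    return thumbs_near_fold >= 2 and palms_present and partially_folded

contacts = [{"y": 600, "kind": "thumb"}, {"y": 615, "kind": "thumb"},
            {"y": 900, "kind": "palm"}, {"y": 910, "kind": "palm"}]
print(grip_posture_detected(contacts, fold_angle_deg=110.0))   # True, so show the thumb input overlay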
[00194] Reference is now made to FIG. 22, which illustrates a foldable touchscreen device 100 executing a video game application, in accordance with examples of the present disclosure. FIG. 22 illustrates an example implementation of the method 2100 of FIG. 21. The foldable touchscreen device has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135. The foldable touchscreen device 100 is shown in laptop orientation, in which the fold region 135 is oriented horizontally and the foldable touchscreen device 100 is semi-folded. The foldable touchscreen device 100 is shown providing a display of a currently executed video game application in the second partition 130. A thumb input overlay 2202 comprising virtual buttons is shown displayed near the bottom of the flexible touchscreen element 140, in the first partition 125. A user’s left hand 2204 and the user’s right hand 2206 are shown gripping the bottom of the foldable touchscreen device 100, such that the user’s left thumb 2208 and the user’s right thumb 2210 are positioned in proximity to the virtual buttons of the thumb input overlay 2202 displayed in the first partition 125.
[00195] When a foldable touchscreen device (e.g., a smartphone) is in laptop mode, the fold region may be easily reachable by a user’s thumbs, and may thus provide a natural input space for tapping and/or sliding. By using dynamic hinge shortcuts at the fold region, additional dimensions may be added to the game control in an intuitive way. For example, in some implementations, a slide gesture at the fold region may be used to zoom in, to zoom out, or to change weapons. Using dynamic hinge shortcuts at the fold region may provide a more immersive and intuitive gaming experience on a foldable touchscreen device, such as on a foldable smartphone.
[00196] In another example, the method 2100 of FIG. 21 may be performed when the foldable touchscreen device is executing a camera application. Reference is now made to FIG. 23, which illustrates a foldable touchscreen device 100 executing a camera application, in accordance with examples of the present disclosure. The foldable touchscreen device 100 is shown having a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135. The foldable touchscreen device 100 is shown in laptop orientation and displaying a viewport of a currently executed camera application in the first partition 125. A thumb input overlay 2302 comprising first and second application control overlays 2304, 2306 is shown displayed at the fold region 135 of the flexible touchscreen element 140. In this example, the first application control overlay 2304 is an exposure control overlay and the second application control overlay 2306 is a zoom control overlay. Other user interface elements of the camera application may be displayed in the second partition 130. A user’s left hand 2204 and the user’s right hand 2206 are shown gripping the bottom of the foldable touchscreen device 100, such that the user’s left thumb 2208 and the user’s right thumb 2210 are positioned in proximity to the first and second application control overlays, respectively.
[00197] In this way, dynamic sliding control may be provided in the fold region 135 for camera application actions such as “zoom” and “exposure”, for example.
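For illustration only, a sliding thumb input along the fold region could be mapped to a continuous camera parameter as in the sketch below; the overlay geometry, the zoom range, and the function name are assumptions made for this example.

def slide_to_value(slide_x, overlay_left, overlay_right, value_min, value_max):
    # Map a thumb position within a slider overlay to a parameter value, clamped to range.
    fraction = (slide_x - overlay_left) / float(overlay_right - overlay_left)
    fraction = max(0.0, min(1.0, fraction))
    return value_min + fraction * (value_max - value_min)

# Example: a hypothetical zoom control overlay spanning x = 700..1100 in the fold region
# and controlling a zoom factor between 1.0x and 5.0x.
zoom = slide_to_value(slide_x=900, overlay_left=700, overlay_right=1100,
                      value_min=1.0, value_max=5.0)
print(round(zoom, 2))   # 3.0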
[00198] FIGs. 22 and 23 illustrate examples of how application-specific control overlays may be displayed in the fold region 135, in a manner that can be intuitively and conveniently accessible to a user who is holding the foldable touchscreen device 100 in a semi-folded position (e.g., a laptop orientation). It should be understood that other implementations of the method 2100 when executing other applications may be possible, without being limited to gaming or camera applications.
[00199] FIG. 24 is a flowchart of an example method 2400 of modifying one of one or more user modifiable objects, in accordance with examples of the present disclosure. The method 2400 may be performed by one or more processors of a computing system (e.g., the computing system 200 of FIG. 2) operating as a foldable touchscreen device 100 (FIG. 1). Specifically, the operations 2410 and onward may be performed by one or more processors (e.g., processor 302 of FIG. 2) of the computing system.
[00200] At the operation 2410, the system displays one or more user modifiable objects.
[00201] At the operation 2420, the system receives a selection of one of the one or more user modifiable objects.
[00202] At the operation 2430, the system displays, at the fold region, one or more shortcut overlays corresponding to the one of the one or more user modifiable objects.

[00203] At the operation 2440, the system modifies the one of the one or more user modifiable objects according to the action corresponding to the selected one of the one or more shortcut overlays.
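The following Python sketch is only one hypothetical way the method 2400 could be realized for a selected object such as a section of text: selecting an object looks up the shortcut overlays relevant to its type, and choosing one of them modifies the object accordingly. The type-to-action mapping and all names are assumptions.

# Hypothetical mapping from object type to the shortcut actions shown at the fold region.
ACTIONS_BY_TYPE = {
    "text":  {"bold":  lambda obj: obj.update(bold=True),
              "copy":  lambda obj: obj.update(copied=True)},
    "shape": {"recolor": lambda obj: obj.update(color="red")},
}

def overlays_for_selection(obj):
    # Operation 2430: determine which shortcut overlays to display for the selected object.
    return sorted(ACTIONS_BY_TYPE.get(obj["type"], {}))

def apply_overlay(obj, overlay_name):
    # Operation 2440: modify the selected object according to the chosen overlay.
    ACTIONS_BY_TYPE[obj["type"]][overlay_name](obj)
    return obj

selection = {"type": "text", "content": "first section of text", "bold": False}
print(overlays_for_selection(selection))     # ['bold', 'copy']
print(apply_overlay(selection, "bold"))      # bold is now True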
[00204] FIGs. 25-26 illustrate an example implementation of the method 2400 of FIG. 24. FIG. 25 illustrates a foldable touchscreen device 100 providing a first display of a word processing application, in accordance with examples of the present disclosure. The foldable touchscreen device 100 has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135. The foldable touchscreen device 100 is shown in full screen mode and in laptop orientation. The foldable touchscreen device 100 is shown executing a word processing application and displaying text. The text may include user modifiable objects such as a selected portion of text. A first section of text 2502 is shown as selected text, indicated by highlighting over the first section of text 2502. A finger of a user hand 912 is shown touching the flexible touchscreen element 140 at a location of the first section of text 2502.
[00205] FIG. 26 illustrates that, following selection of the first section of text 2502, one or more shortcut overlays 2602 are shown displayed at the fold region 135. The one or more shortcut overlays 2602 correspond to actions that may be applied to the selection of text 2502. A finger of a user hand 912 is shown touching the flexible touchscreen element 140 at a location of the one or more shortcut overlays 2602.
[00206] The provision of the one or more shortcut overlays at the fold region may provide the user with easy access to common text editing actions such as copy, paste, and formatting options (such as bold, italic, and underline) by simply selecting a section of text to activate the one or more shortcut overlays at the fold region. Once the overlay is activated, the user may simply choose one of the one or more shortcut overlays representing a desired action in order to adjust the text as needed. In this way, a user may quickly and efficiently edit text without having to navigate through multiple menus or use keyboard shortcuts. The one or more shortcut overlays may comprise the most relevant and useful shortcuts, making the editing process even more efficient and intuitive.
[00207] FIGs. 27-28 illustrate another example implementation of the method 2400 of FIG. 24. FIG. 27 illustrates a foldable touchscreen device 100 providing a first display of a design application, in accordance with examples of the present disclosure. The foldable touchscreen device 100 has a flexible touchscreen element 140 having a first partition 125, a second partition 130, and a fold region 135. The foldable touchscreen device 100 is shown executing a design application. A value input box 2702 is shown displayed at an upper right portion of the flexible touchscreen element 140. A finger of a user hand 912 is shown touching an element of the value input box 2702. In some embodiments, after the value input box 2702 is activated (e.g., selected by touch input), a dynamic slider may be displayed in the fold region 135, as will be described with reference to FIG. 28.
[00208] FIG. 28 illustrates that, following activation of the value input box 2702, a dynamic slider 2802 is displayed at the fold region 135. A finger of a user hand 912 is shown touching the dynamic slider 2802.
[00209] In some embodiments, the dynamic slider 2802 at the fold region 135 may only be activated when the value input box 2702 is selected, ensuring that the dynamic slider 2802 is only displayed in the fold region 135 when needed by a user. Other context-specific action shortcuts may be similarly displayed in the fold region 135, depending on current user selections and/or currently activated functions of the application. As a result, user interaction with a design application may be efficient and intuitive.
[00210] In this way, a user may easily adjust values in design applications by simply selecting a value input box 2702 and using a dynamic slider 2802 at the fold region 135. The effort of manually inputting values may be avoided, and quick adjustments may be made without navigating through menus and/or keyboard shortcuts.
[00211] As described herein, the fold region, or “hinge”, of a foldable touchscreen device may be exploited to provide for additional user utility and enjoyment. Embodiments disclosed herein describe the provision, at the hinge, of a dynamic portal of shortcuts which can adapt to a user’s input and to system and/or application events. These shortcuts may reduce or eliminate the need to click through multiple steps within multiple menus in order to perform commonly executed tasks. Embodiments described herein may facilitate Windows™ management and multitasking, and may be used in a variety of application settings, including presentation applications (such as Powerpoint™), e-reading applications, video applications, camera applications, video game applications, and text editing applications.

[00212] The embodiments described herein may be implemented in any combination. A single device may be configured to implement any one or more or all embodiments in any combination.
[00213] Although the present disclosure describes methods and processes with steps in a certain order, one or more steps of the methods and processes may be omitted or altered as appropriate. One or more steps may take place in an order other than that in which they are described, as appropriate.
[00214] Although the present disclosure is described, at least in part, in terms of methods, a person of ordinary skill in the art will understand that the present disclosure is also directed to the various components for performing at least some of the aspects and features of the described methods, be it by way of hardware components, software or any combination of the two. Accordingly, the technical solution of the present disclosure may be embodied in the form of a software product. A suitable software product may be stored in a pre-recorded storage device or other similar non-volatile or non-transitory computer readable medium, including DVDs, CD-ROMs, USB flash disk, a removable hard disk, or other storage media, for example. The software product includes instructions tangibly stored thereon that enable a processing device (e.g., a personal computer, a server, or a network device) to execute examples of the methods disclosed herein.
[00215] The present disclosure may be embodied in other specific forms without departing from the subject matter of the claims. The described example embodiments are to be considered in all respects as being only illustrative and not restrictive. Selected features from one or more of the above-described embodiments may be combined to create alternative embodiments not explicitly described, features suitable for such combinations being understood within the scope of this disclosure.
[00216] All values and sub-ranges within disclosed ranges are also disclosed. Also, although the systems, devices and processes disclosed and shown herein may comprise a specific number of elements/components, the systems, devices and assemblies could be modified to include additional or fewer of such elements/components. For example, although any of the elements/components disclosed may be referenced as being singular, the embodiments disclosed herein could be modified to include a plurality of such elements/components. The subject matter described herein is intended to cover and embrace all suitable changes in technology.

Claims

WHAT IS CLAIMED IS:
1. A foldable touchscreen device comprising: a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions, the touchscreen element being foldable at the fold region; a processor; a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: display a first object in the first partition; detect, at a location of the first partition corresponding to the first object, an initialization of a drag gesture; detect a path of the drag gesture passing near or through a first one of one or more shortcut overlays displayed in the fold region; and trigger a first action corresponding to the first one of the one or more shortcut overlays.
2. The foldable touchscreen device of claim 1, wherein the foldable touchscreen device is further caused to, prior to detecting a path of the drag gesture passing near or through a first one of one or more shortcut overlays: detect, at a location of the first partition within a proximity of the fold region, the drag gesture; and display, at the fold region, the one or more shortcut overlays.
3. The foldable touchscreen device of claims 1 or 2, wherein the foldable touchscreen device is further caused to: detect a cessation of the drag gesture; and in response to detecting the cessation of the drag gesture, remove the one or more shortcut overlays from display in the fold region.
4. The foldable touchscreen device of any one of claims 1 to 3, wherein the one or more shortcut overlays is a windows management overlay.
5. The foldable touchscreen device of any one of claims 1 to 3, wherein the one or more shortcut overlays is an app multiplier overlay.
6. The foldable touchscreen device of any one of claims 1 to 5, wherein the foldable touchscreen device is further caused to: detect the drag gesture crossing the fold region at a second one of the one or more shortcut overlays; and trigger a second action corresponding to the second one of the one or more shortcut overlays.
7. A foldable touchscreen device comprising: a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions, the touchscreen element being foldable at the fold region; a processor; a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to, during execution of an e-reader application: detect a drag gesture in a first direction along the fold region, the first direction being substantially parallel to a longitudinal axis of the fold region; and insert a virtual bookmark at a currently displayed location of an electronic document.
8. The foldable touchscreen device of claim 7, wherein the drag gesture begins at or near a first edge of the fold region.
9. The foldable touchscreen device of claim 7 or claim 8, wherein the foldable touchscreen device is further caused to: detect a drag gesture in a second direction along the fold region substantially opposite to the first direction; and remove the virtual bookmark.
10. A foldable touchscreen device comprising: a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions, the touchscreen element being foldable at the fold region; a processor; a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: display content using full screen mode; detect a drag gesture in a first direction along the fold region, the first direction being substantially parallel to a longitudinal axis of the fold region; and display the content using dual screen mode.
11. A foldable touchscreen device comprising: a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions, the touchscreen element being foldable at the fold region; a processor; a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: detect a first touch gesture at the fold region; and display a defined region containing application shortcuts.
12. A foldable touchscreen device comprising: a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions, the touchscreen element being foldable at the fold region; a processor; a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to, during execution of a particular application: display, at the fold region, one or more application control overlays defined by the particular application; detect a touch gesture at a location of the fold region corresponding to one of the one or more application control overlays; and trigger a first action corresponding to the one of the one or more application control overlays.
13. A foldable touchscreen device comprising: a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions, the touchscreen element being foldable at the fold region; a processor; a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: detect a grip posture of the foldable touchscreen device; in response to detecting the grip posture, display, at the fold region, a thumb input overlay; detect a touch gesture at a location of the fold region corresponding to the thumb input overlay; and trigger a first action corresponding to the touch gesture and the location corresponding to the thumb input overlay.
14. The foldable touchscreen device of claim 13, wherein the foldable touchscreen device is executing a video game application and wherein the first action is a video game control action.
15. The foldable touchscreen device of claim 13, wherein the foldable touchscreen device is executing a camera application and wherein the first action is a camera control action.
16. A foldable touchscreen device comprising: a flexible touchscreen element having a first partition, a second partition, and a fold region separating the first and second partitions, the touchscreen element being foldable at the fold region; a processor; a memory storing instructions which, when executed by the processor, cause the foldable touchscreen device to: display one or more user modifiable objects; receive a selection of one of the one or more user modifiable objects; display, at the fold region, one or more shortcut overlays corresponding to an action for modifying the one of the one or more user modifiable objects; and in response to selection of one of the one or more shortcut overlays, modify the one of the one or more user modifiable objects according to the action corresponding to the selected one of the one or more shortcut overlays.
17. A computer-implemented method comprising: displaying a first object in a first partition of a flexible touchscreen element; detecting, at a location of the first partition corresponding to the first object, an initialization of a drag gesture; detecting a path of the drag gesture passing near or through a fold region of the flexible touchscreen element at a first one of one or more shortcut overlays displayed in the fold region; and triggering a first action corresponding to the first one of the one or more shortcut overlays.
18. The method of claim 17, the method further comprising: prior to detecting a path of the drag gesture passing near or through a first one of one or more shortcut overlays, detecting, at a location of the first partition within a proximity of the fold region, the drag gesture; and displaying, at the fold region, the one or more shortcut overlays.
19. The method of claim 17 or 18, the method further comprising: detecting a cessation of the drag gesture; and in response to detecting the cessation of the drag gesture, removing the one or more shortcut overlays from display in the fold region.
20. The method of any one of claims 17 to 19, wherein the one or more shortcut overlays is a windows management overlay.
21. The method of any one of claims 17 to 19, wherein the one or more shortcut overlays is an app multiplier overlay.
22. The method of any one of claims 17 to 21, the method further comprising: detecting the drag gesture crossing the fold region at a second one of the one or more shortcut overlays; and triggering a second action corresponding to the second one of the one or more shortcut overlays.
23. A computer-implemented method comprising: detecting a drag gesture in a first direction along a fold region of a flexible touchscreen element, the first direction being substantially parallel to a longitudinal axis of the fold region; and inserting a virtual bookmark at a currently displayed location of an electronic document.
24. The method of claim 23, wherein the drag gesture begins at or near a first edge of the fold region.
25. The method of claim 23 or claim 24, the method further comprising: detecting a drag gesture in a second direction along the fold region substantially opposite to the first direction; and removing the virtual bookmark.
26. A computer-implemented method comprising: displaying content on a flexible touchscreen element using full screen mode; detecting a drag gesture in a first direction along a fold region of the flexible touchscreen element, the first direction being substantially parallel to a longitudinal axis of the fold region; and displaying the content using dual screen mode.
27. A computer-implemented method comprising: detecting a first touch gesture at a fold region of a flexible touchscreen element; and displaying a defined region containing application shortcuts.
28. A computer-implemented method comprising: displaying, at a fold region of a flexible touchscreen element, one or more application control overlays defined by an application being executed; detecting a touch gesture at a location of the fold region corresponding to one of the one or more application control overlays; and triggering a first action corresponding to the one of the one or more application control overlays.
29. A computer-implemented method comprising: detecting a grip posture of a foldable touchscreen device, the foldable touchscreen device having a flexible touchscreen element; in response to detecting the grip posture, displaying, at a fold region of the flexible touchscreen element, a thumb input overlay; detecting a touch gesture at a location of the fold region corresponding to the thumb input overlay; and triggering a first action corresponding to the touch gesture and the location corresponding to the thumb input overlay.
30. The method of claim 29, wherein the foldable touchscreen device is executing a video game application and wherein the first action is a video game control action.
31. The method of claim 29, wherein the foldable touchscreen device is executing a camera application and wherein the first action is a camera control action.
32. A computer-implemented method comprising: displaying one or more user modifiable objects on a flexible touchscreen element; receiving a selection of one of the one or more user modifiable objects; displaying, at a fold region of the flexible touchscreen element, one or more shortcut overlays corresponding to an action for modifying the one of the one or more user modifiable objects; and in response to selection of one of the one or more shortcut overlays, modifying the one of the one or more user modifiable objects according to the action corresponding to the selected one of the one or more shortcut overlays.