CN117908746A - Method and system for controlling a translating flexible display of an electronic device in response to a scrolling user input - Google Patents

Info

Publication number
CN117908746A
Authority
CN
China
Prior art keywords
flexible display
blade assembly
electronic device
processors
user
Prior art date
Legal status
Pending
Application number
CN202310095484.1A
Other languages
Chinese (zh)
Inventor
Amit Kumar Agrawal
Rohit Sisodia
Chen Chen
Maoyuan Wu
Current Assignee
Motorola Mobility LLC
Original Assignee
Motorola Mobility LLC
Priority date
Filing date
Publication date
Application filed by Motorola Mobility LLC filed Critical Motorola Mobility LLC
Priority claimed by US 18/114,663 (published as US 2024/0129392 A1)
Publication of CN117908746A

Abstract

The present invention relates to a method and system for controlling a translating flexible display of an electronic device in response to a scrolling user input. A method in an electronic device includes detecting user input defining a swipe gesture with a flexible display carried by a blade assembly that is slidably coupled to the device housing and movable between at least an extended position and a retracted position. The method then includes translating, by a translation mechanism operable with the blade assembly, the blade assembly toward the extended position in response to the swipe gesture.

Description

Method and system for controlling a translating flexible display of an electronic device in response to a scrolling user input
Cross reference to the prior application
The present application claims priority and benefit under 35 U.S.C. § 119(e) to the following U.S. provisional applications: U.S. Ser. No. 63/416,927, filed October 17, 2022, and U.S. Ser. No. 63/419,994, filed October 27, 2022, each of which is incorporated by reference for all purposes.
Technical Field
The present disclosure relates generally to electronic devices, and more particularly to electronic devices having flexible displays.
Background
Portable electronic communication devices, particularly smartphones, have become ubiquitous. People around the world use such devices to stay connected. These devices have been designed in various mechanical configurations. A first configuration, known as a "bar-type" device, is generally rectangular in shape, has a rigid form factor, and has a display disposed along a major face of the electronic device. In contrast, a "flip" device has a mechanical hinge that allows one housing to pivot relative to the other. A third type of electronic device is the "slider," in which one of two device housings slides relative to the other.
Some consumers prefer bar-type devices, while others prefer flip-type devices. Still others prefer sliders. The latter two types are convenient because they are smaller in the closed position than in the open position, and thus fit more easily into a pocket. While flip and slider devices are mechanically relatively straightforward, they can still be bulky in the closed position because two device housings are required. Accordingly, an improved electronic device is desired that provides not only a compact form factor but also a larger display surface area.
Drawings
Fig. 1 illustrates one illustrative electronic device in accordance with one or more embodiments of the present disclosure.
Fig. 2 illustrates one illustrative electronic device having a translating display that moves to a first sliding position in which a portion of the translating display extends distally away from a device housing of the electronic device.
Fig. 3 illustrates the illustrative electronic device of fig. 2 with the translating display moved to a second sliding position in which the translating display surrounds and abuts a device housing of the electronic device.
Fig. 4 illustrates the electronic device of fig. 3 from the rear.
Fig. 5 illustrates the illustrative electronic device of fig. 2 with the translating display moved to a third sliding position, referred to as a "peek" position, that exposes an image capture device positioned below the translating display when the translating display is in either the first sliding position or the second sliding position.
Fig. 6 illustrates one or more explanatory physical sensors suitable for use alone or in combination in an electronic device in accordance with one or more embodiments of the present disclosure.
Fig. 7 illustrates one or more explanatory context sensors suitable for use alone or in combination in an electronic device in accordance with one or more embodiments of the present disclosure.
Fig. 8 illustrates a portion of one illustrative display assembly in an exploded view in accordance with one or more embodiments of the present disclosure.
Fig. 9 illustrates a portion of one illustrative display assembly in an exploded view in accordance with one or more embodiments of the present disclosure.
Fig. 10 illustrates one illustrative display component in an exploded view in accordance with one or more embodiments of the present disclosure.
FIG. 11 illustrates an illustrative display component in accordance with one or more embodiments of the present disclosure.
Fig. 12 illustrates one illustrative display assembly in an undeformed state.
Fig. 13 illustrates the illustrative display assembly of fig. 12 in a deformed state.
Fig. 14 illustrates the illustrative display assembly of fig. 12 in another deformed state, with an exploded view of a deformable portion of the display assembly shown in an enlarged view.
FIG. 15 illustrates a front view of one illustrative electronic device having a blade assembly in an extended position in accordance with one or more embodiments of the present disclosure.
FIG. 16 illustrates a left side view of one illustrative electronic device having a blade assembly in an extended position in accordance with one or more embodiments of the present disclosure.
FIG. 17 illustrates a rear view of an illustrative electronic device having a blade assembly in an extended position in accordance with one or more embodiments of the present disclosure.
FIG. 18 illustrates a front view of one illustrative electronic device having a blade assembly in a retracted position in accordance with one or more embodiments of the present disclosure.
FIG. 19 illustrates a left side view of one illustrative electronic device having a blade assembly in a retracted position in accordance with one or more embodiments of the present disclosure.
FIG. 20 illustrates a rear view of an illustrative electronic device having a blade assembly in a retracted position in accordance with one or more embodiments of the present disclosure.
Fig. 21 illustrates a front view of one illustrative electronic device having a blade assembly in a peek position exposing a front-facing image capture device in accordance with one or more embodiments of the present disclosure.
Fig. 22 illustrates a rear view of one illustrative electronic device having a blade assembly in a peek position exposing a front-facing image capture device in accordance with one or more embodiments of the present disclosure.
Fig. 23 illustrates an illustrative method in accordance with one or more embodiments of the present disclosure.
Fig. 24 illustrates another illustrative method in accordance with one or more embodiments of the present disclosure.
Fig. 25 illustrates one or more method steps in accordance with one or more embodiments of the present disclosure.
Fig. 26 illustrates one or more method steps in accordance with one or more embodiments of the present disclosure.
Fig. 27 illustrates one or more method steps in accordance with one or more embodiments of the present disclosure.
Fig. 28 illustrates one or more embodiments of the present disclosure.
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve the understanding of the embodiments of the present disclosure.
Detailed Description
Before describing in detail embodiments that are in accordance with the present disclosure, it should be observed that the embodiments reside primarily in combinations of method steps and apparatus components related to translating a flexible display between an extended position and a retracted position in response to a user input, which in one or more embodiments is a swipe gesture. Any process descriptions or blocks in flowcharts should be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps in the process.
Alternative implementations are also included, and it will be apparent that, depending on the function involved, the order of execution of the functions shown or discussed may be different, including substantially simultaneous or in reverse order. Accordingly, the apparatus components and method steps have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic cost, when guided by the concepts and principles disclosed herein will be readily capable of generating the methods and apparatus with minimal experimentation.
Embodiments of the present disclosure will now be described in detail. Referring to the drawings, like numerals indicate like parts throughout the views. As used in the description herein and throughout the claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise: the meaning of "a," "an," and "the" includes plural reference, and the meaning of "in" includes "in" and "on."
Relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. As used herein, components may be "operatively coupled" when information may be sent between the components, even though there may be one or more intervening or intermediate components between the components or along the connection path.
The terms "substantially," "approximately," "about," or any other form thereof are defined as being close to, as understood by one of ordinary skill in the art, and in one non-limiting embodiment the terms are defined to be within 10%, in another embodiment within 5%, in another embodiment within 1%, and in another embodiment within 0.5%. The term "coupled," as used herein, is defined as connected, although not necessarily directly, and not necessarily mechanically. Also, reference designators shown herein in parentheses indicate components shown in a figure other than the one in discussion. For example, discussing device (10) while describing figure A refers to element 10 shown in a figure other than figure A.
Embodiments of the present disclosure provide an electronic device that includes a single device housing. In one or more embodiments, the flexible display is then incorporated into a "blade assembly" component that surrounds the single device housing. In one or more embodiments, the blade assembly accomplishes this by being coupled to a translation mechanism attached to a single device housing.
The translation mechanism is operable to transition the blade assembly around a surface of the device housing between an extended position, in which the blade of the blade assembly extends distally from the device housing; a retracted position, in which the blade assembly abuts the device housing with the flexible display wrapping the surface of the device housing; a "peek" position, in which the blade assembly reveals an image capture device positioned beneath the blade assembly at a front portion of the single device housing; and positions therebetween.
By way of illustration, in one illustrative embodiment, the blade assembly slides about the single device housing such that the blade slides away from the single device housing, changing the overall length of the flexible display that appears at the front of the electronic device. In other embodiments, the blade assembly may slide around the single device housing in the opposite direction to a retracted position, in which a similar amount of the flexible display is visible on the front side and the back side of the electronic device. Thus, in one or more embodiments, an electronic device includes a single device housing having a blade assembly coupled to both major surfaces of the single device housing and wrapping around at least one minor surface of the electronic device, at which the translation mechanism is positioned, such that the blade assembly can slide around, and relative to, the single device housing between a retracted position, an extended position, and a peek position exposing a front-facing image capture device.
In one or more embodiments, a flexible display is coupled to the blade assembly. In one or more embodiments, the flexible display is also surrounded by a silicone border that is co-molded onto the blade substrate and protects the side edges of the flexible display. In one or more embodiments, the blade assembly engages at least one rotor of a translation mechanism located at an end of the single device housing. When the translation mechanism, located in the single device housing, drives an element coupled to the blade assembly, the flexible display wraps around the rotor and moves so as to extend the blade of the blade assembly farther from, or retract it back toward, the single device housing.
In one or more embodiments, one end of the flexible display is fixedly coupled to the blade assembly. Meanwhile, the other end of the flexible display is coupled to the tensioner via a flexible substrate that extends beyond the terminal edge of the flexible display. In one or more embodiments, the flexible substrate is a stainless steel substrate, although other materials may be used.
By way of illustration, in one or more embodiments, the flexible substrate of the flexible display is longer in at least one dimension along its long axis than the flexible display. Thus, at least a first end of the flexible substrate extends distally beyond at least one terminal end of the flexible display. This allows the first end of the flexible substrate to be rigidly coupled to the tensioner. In one or more embodiments, an adhesive is used to couple one end of the flexible display to the blade assembly, and one or more fasteners are used to couple a second end of the flexible display to a tensioner carried by the blade assembly.
In one or more embodiments, the translation mechanism includes an actuator that causes a portion of the blade assembly adjacent the first major surface of the single device housing and another portion of the blade assembly adjacent the second major surface of the single device housing to slide symmetrically in opposite directions along the single device housing as the blade assembly transitions between the extended position, the retracted position, and the peek position.
In one or more embodiments, translation of the blade assembly occurs in response to user input received at the flexible display. By way of illustration, in one or more embodiments, the flexible display detects user input defining a swipe gesture. In one or more embodiments, when this occurs, the translating mechanism translates the blade assembly toward the extended position.
In one or more embodiments, after translation of the blade assembly, the one or more processors of the electronic device may present new content on the newly exposed front portion of the flexible display. By way of illustration, in one or more embodiments, a user may deliver a swipe gesture to the flexible display to cause the translation mechanism to translate the blade assembly toward the extended position. Thereafter, the one or more processors may present one or more user actuation targets associated with one or more applications, in an application tray configuration, on the newly exposed front portion of the flexible display.
In contrast, if the user delivers a swipe gesture while an application portal is open, the one or more processors may scroll the content or perform other operations. In one or more embodiments, when a reverse swipe gesture is delivered to the flexible display, the translation mechanism may translate the blade assembly toward the retracted position, with the one or more processors removing content from any front portion of the flexible display that becomes hidden or rear-facing as a result of the translation.
Effectively, embodiments of the present disclosure translate a flexible display, carried by a blade assembly that is slidably coupled to a device housing and movable between at least an extended position and a retracted position, toward the extended position, or alternatively toward the retracted position, in response to a swipe gesture. In one or more embodiments, an electronic device includes: a device housing; a blade assembly slidably coupled to the device housing and carrying the blade and the flexible display; a translation mechanism operable to slide the blade assembly relative to the device housing between an extended position, a retracted position, and optionally a peek position; and one or more processors operable with the translation mechanism. In one or more embodiments, in response to the flexible display detecting the swipe gesture, the one or more processors cause the translation mechanism to translate the blade assembly toward the extended position and present additional content on a front portion of the flexible display that is revealed by the translation of the flexible display.
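The swipe-to-position behavior described above can be modeled as a small state transition. The sketch below is a minimal Python illustration; the `BladePosition` names and the `respond_to_swipe` function are hypothetical, not terminology from the disclosure.

```python
from enum import Enum


class BladePosition(Enum):
    """Hypothetical positions of the blade assembly."""
    RETRACTED = 0
    EXTENDED = 1
    PEEK = 2


def respond_to_swipe(direction: str, position: BladePosition) -> BladePosition:
    """Map a swipe gesture to a target blade-assembly position.

    An upward swipe translates the blade assembly toward the extended
    position; a downward (reverse) swipe translates it toward the
    retracted position. Other gestures leave the position unchanged.
    """
    if direction == "up" and position is not BladePosition.EXTENDED:
        return BladePosition.EXTENDED
    if direction == "down" and position is not BladePosition.RETRACTED:
        return BladePosition.RETRACTED
    return position
```

In practice the transition would also gate on the device being unlocked, as the embodiments below describe.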
In one or more embodiments, one or more processors can determine a foreground activity occurring on an electronic device. The one or more processors may also determine whether the electronic device is in a locked or unlocked mode of operation.
To present an application tray on a home screen in response to a swipe gesture, one or more processors may determine that the electronic device is in an unlocked operational state. In one or more embodiments, the flexible display then receives user input defining a swipe gesture while the one or more processors present a home screen presentation on the flexible display.
In one or more embodiments, in response to detecting an upward swipe gesture, the one or more processors cause the translation mechanism to translate the blade assembly toward the extended position. In one or more embodiments, the translation stops at the position nearest the retracted position at which all applications of the application tray can be presented. In one or more embodiments, when an additional swipe gesture is received, the one or more processors may check whether additional applications exist. If so, the one or more processors may cause the translation mechanism to translate the blade assembly again toward the extended position and present the remaining applications on the newly revealed front portion of the flexible display. Otherwise, in one or more embodiments, the one or more processors cause the translation mechanism to translate the blade assembly back toward the retracted position when additional gesture input is received.
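The incremental extension just described, where each translation step reveals enough of the flexible display for additional tray rows, can be sketched numerically. The functions, names, and the assumption that each step reveals a fixed number of rows are illustrative only and not specified by the disclosure.

```python
def tray_rows_needed(app_count: int, icons_per_row: int) -> int:
    """Number of tray rows required to show every application target."""
    return -(-app_count // icons_per_row)  # ceiling division


def translation_steps(app_count: int, icons_per_row: int,
                      rows_per_step: int) -> int:
    """How many incremental blade extensions are needed before the whole
    application tray is visible, under the illustrative assumption that
    each translation step reveals a fixed number of tray rows."""
    rows = tray_rows_needed(app_count, icons_per_row)
    return -(-rows // rows_per_step)
```

For example, ten applications at four icons per row need three rows; at two rows revealed per step, two translation steps suffice.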
In one or more embodiments, when an application is active on a front portion of the flexible display, the one or more processors first detect this fact and the fact that the electronic device is in an unlocked operational state. In one or more embodiments, in response to detecting an upward swipe gesture, the one or more processors cause the translation mechanism to translate the blade assembly toward the extended position. In one or more embodiments, the translation stops at the position nearest the retracted position at which the current activity associated with the application can be presented on the flexible display. Since the space an application requires for a particular activity may vary, in one or more embodiments the area of the front portion can be varied by continuing to move the blade assembly so as to provide sufficient space for application content on the front portion of the flexible display.
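The variable sizing of the front portion can be modeled as a clamped travel calculation: extend just far enough for the content, never past the mechanism's limit. This is an illustrative sketch; the function name, units, and the notion of a `max_travel` limit are assumptions, not values from the disclosure.

```python
def travel_for_content(content_height: float, exposed_height: float,
                       max_travel: float) -> float:
    """Additional blade travel needed so the front portion of the
    flexible display is tall enough for the application's content,
    clamped to the translation mechanism's maximum remaining travel.

    Returns 0.0 when the currently exposed front portion already
    accommodates the content.
    """
    needed = content_height - exposed_height
    return min(max(needed, 0.0), max_travel)
```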
In one or more embodiments, a method in an electronic device includes detecting a swipe gesture with a flexible display carried by a blade assembly slidably coupled to a device housing and movable between an extended position, a retracted position, and a peek position. In one or more embodiments, the method includes translating the blade assembly toward the extended position with a translation mechanism in response to the swipe gesture.
In one or more embodiments, such translation occurs only when the electronic device is in an unlocked state when a swipe gesture is detected. In one or more embodiments, the one or more processors of the electronic device may then present additional content on the front portion of the flexible display that is revealed when the blade assembly translates toward the extended position.
Advantageously, embodiments of the present disclosure provide an improved sliding mechanism for a flexible display integrated into a blade assembly in a sliding electronic device having a single device housing that slides in response to touch input and in particular swipe gestures. In one or more embodiments, a swipe gesture toward the top of the flexible display translates the blade assembly toward the extended position, while a swipe gesture toward the bottom of the flexible display translates the blade assembly toward the retracted position.
The actuator of the translation mechanism may take a variety of forms. In some embodiments, the actuator comprises a dual-axis motor. In one or more embodiments, the dual-axis motor may be threaded to move the translators of the translation mechanism in equal and opposite directions. In other embodiments, the dual-axis motor may be coupled to at least one synchronous belt.
In another embodiment, the actuator includes a first drive screw and a second drive screw. The drive screws may be coupled together by a gear assembly. When a first portion of the blade assembly is coupled to a translator positioned about the first drive screw and a second portion of the blade assembly is coupled to another translator positioned about the second drive screw, actuation of either causes the first portion of the blade assembly, adjacent the first major surface of the single device housing, and the second portion of the blade assembly, adjacent the second major surface of the single device housing, to move symmetrically in opposite directions as the first and second drive screws rotate.
In yet other embodiments, the actuator includes a first rack, a second rack, and a pinion. The first rack may be coupled to a first portion of the blade assembly and the second rack to a second portion of the blade assembly. When the pinion engages both the first rack and the second rack, actuation of either causes the first portion of the blade assembly, adjacent the first major surface of the single device housing, and the second portion of the blade assembly, adjacent the second major surface of the single device housing, to move symmetrically in opposite directions as the first rack and the second rack counter-translate. Other structures for the actuator are described below. Still other configurations will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, the blade assembly is coupled to a translator of the translation mechanism. When the translator is actuated, a first portion of the blade assembly adjacent the first major surface of the single device housing and a second portion of the blade assembly adjacent the second major surface of the single device housing move symmetrically in opposite directions.
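The equal-and-opposite motion of the two blade portions can be expressed as a simple kinematic relation: any travel applied to the front portion is mirrored, with opposite sign, on the rear portion. This is a hypothetical model; the function name and sign convention are assumptions for illustration.

```python
def blade_portion_positions(travel: float,
                            start_front: float = 0.0,
                            start_rear: float = 0.0) -> tuple[float, float]:
    """Positions of the front and rear blade portions after the shared
    translator moves by `travel`.

    Because both portions ride the same drive, they move by equal
    magnitudes in opposite directions: extending the front portion by
    `travel` retracts the rear portion by the same amount (positive
    values here mean motion toward the extended position).
    """
    return start_front + travel, start_rear - travel
```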
Advantageously, embodiments of the present disclosure provide an improved sliding mechanism for a flexible display in an electronic device. A flexible display and rotor sliding assembly configured in accordance with embodiments of the present disclosure maintains a flat upper portion of the J-shape defined by the flexible display and/or the blade assembly while maintaining operability and functionality of the flexible display during a sliding operation.
Embodiments of the present disclosure contemplate that in such electronic devices having a translating display, a user must typically manually select whether the display transitions to the extended position, the retracted position, or the peek position. By way of illustration, a user may have to press a button once to transition the translating display to the extended position and twice to transition it to the retracted position. A "long press" of the button may be required to transition the translating display to the peek position, and so forth.
Such manual actuation requires the user to take manual action to change the state of the electronic device. Additionally, this requirement potentially delays the usability of the electronic device in a new state due to the time it takes to manually "inject" a trigger that causes the transition of the display to be translated by pressing a button.
Advantageously, embodiments of the present disclosure provide systems and methods for automatically and preemptively moving a translating display to an optimal state based on sensed swipe gestures. By way of illustration, in one or more embodiments, when one or more sensors of the electronic device detect an upward swipe gesture while a foreground application operating on the one or more processors enters a full-screen immersive mode, the one or more processors of the electronic device may transition the translating display to the extended position. Examples of applications that use this full-screen immersive mode of operation include gaming applications and video playback applications. Other such applications will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, an artificial intelligence classifier can be used to determine an optimal display state and generate a trigger for that state based on particular user preferences identified from the operating context when a swipe gesture is received. In one or more embodiments, the artificial intelligence model is trained on the following inputs, entered as weighted variables: the current foreground application; the device orientation in three-dimensional space; the type of application operating on the one or more processors (e.g., whether the application is a gaming application, a video productivity application, a media application, and so forth); the application display mode (e.g., whether the display is used in an immersive or non-immersive mode); and when (and in which direction) a user delivers a swipe gesture to cause the translating display to transition to an extended position or a retracted position.
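One way such a classifier could be realized is as a logistic score over weighted context features, with an online update that nudges the weights toward the user's observed choices. The disclosure does not specify a model form; this sketch, including all feature names, is purely hypothetical.

```python
import math


def extend_score(features: dict[str, float],
                 weights: dict[str, float]) -> float:
    """Logistic score in (0, 1) for transitioning to the extended
    position, computed from weighted context features such as the
    foreground-application type, device orientation, display mode,
    and swipe direction. Feature names are illustrative only."""
    z = sum(weights.get(name, 0.0) * value
            for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))


def update_weights(weights: dict[str, float],
                   features: dict[str, float],
                   did_extend: bool,
                   lr: float = 0.1) -> dict[str, float]:
    """One online-learning step: move the weights toward the user's
    observed choice so the classifier keeps learning preferences."""
    error = (1.0 if did_extend else 0.0) - extend_score(features, weights)
    return {name: weights.get(name, 0.0) + lr * error * value
            for name, value in features.items()}
```

With no learned weights the score is a neutral 0.5; each observed extension (or retraction) shifts subsequent scores for similar contexts.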
In one or more embodiments, the artificial intelligence classifier can continuously learn user preferences for the extended position based on user actions. In one or more embodiments, the artificial intelligence classifier may automatically trigger movement of the translating display to the extended position in response to swipe gesture activity.
Translation of the translating display to the retracted position may occur in a similar manner. In one or more embodiments, the one or more processors of the electronic device can automatically translate the translating display back to the retracted position upon receiving a downward swipe gesture. Advantageously, embodiments of the present disclosure provide intuitive operation of a translating display in an electronic device. Where automatic translation of the translating display is triggered, the only user action required to change the position of the translating display is a simple swipe gesture. Thereafter, the device automatically changes to the position the user likely desires. Other advantages are described below. Still other advantages will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
Turning now to fig. 1, illustrated therein is one illustrative electronic device 100 configured in accordance with one or more embodiments of the present disclosure. The electronic device 100 of fig. 1 is a portable electronic device. For illustration purposes, the electronic device 100 is shown as a smart phone. However, the electronic device 100 may be any number of other devices including a tablet computer, gaming device, multimedia player, etc. Still other types of electronic devices may also be configured in accordance with one or more embodiments of the present disclosure, as would be readily understood by one of ordinary skill in the art having the benefit of this disclosure.
The electronic device 100 includes a single device housing 101. In one or more embodiments, the blade assembly 102 carrying the flexible display 104 surrounds a single device housing 101. As will be described in more detail below, in one or more embodiments, the blade assembly 102 is configured to "slide" along a first major surface of a single device housing 101 (covered by a flexible display in a front view of the electronic device 100 on the left side of fig. 1) and a second major surface 103 located on the rear side of the single device housing 101.
In one or more embodiments, the single device housing 101 is made of a rigid material, such as a rigid thermoplastic, metal, or composite material, although other materials may be used. By way of illustration, in one illustrative embodiment, the single device housing 101 is made of aluminum. Still other constructions will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In the illustrative embodiment of fig. 1, the blade assembly 102 carries a flexible display 104. The flexible display 104 may optionally be touch sensitive. The user may communicate user input to the flexible display 104 of such an embodiment by communicating touch input from a finger, stylus, or other object disposed proximate to the flexible display 104.
In one embodiment, the flexible display 104 is configured as an Organic Light Emitting Diode (OLED) display fabricated on a flexible plastic substrate. The blade assembly 102 is also fabricated on a flexible substrate. This allows the blade assembly 102 and the flexible display 104 to deform around the display roller mechanism 105 when the first portion 106 of the blade assembly 102 abutting the first major surface of the single device housing 101 and the second portion 107 of the blade assembly 102 abutting the second major surface 103 of the single device housing 101 are moved symmetrically in opposite directions around the single device housing 101. In one or more embodiments, both the blade assembly 102 and the flexible display 104 are constructed on flexible metal substrates that may each bend at various bending radii around the display roller mechanism 105.
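The translation behavior described above can be illustrated with a minimal sketch (all names, units, and travel limits here are illustrative assumptions; the disclosure does not specify an implementation): a swipe gesture maps to a signed translation that is clamped between the retracted and extended positions, with the first and second blade portions moving by equal and opposite amounts around the roller.

```python
def translate_blade(position_mm, swipe_mm, retracted_mm=0.0, extended_mm=40.0):
    """Map a swipe gesture to a new blade position, clamped to travel limits.

    position_mm: current blade extension; swipe_mm: signed swipe distance
    (positive extends, negative retracts). Units and limits are illustrative.
    """
    new_position = max(retracted_mm, min(extended_mm, position_mm + swipe_mm))
    # The first portion advances by delta while the second portion recedes
    # by the same amount as the blade wraps symmetrically around the roller.
    delta = new_position - position_mm
    return new_position, delta, -delta

pos, first_delta, second_delta = translate_blade(10.0, 15.0)
```

A swipe past the travel limit simply saturates at the extended position, which mirrors how the translation mechanism can only move the blade assembly between its mechanical end stops.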
In one or more embodiments, the flexible display 104 may be formed from multiple layers of flexible material, such as a flexible sheet of polymer or other material. In the illustrative embodiment, flexible display 104 is fixedly coupled to blade assembly 102, which surrounds display roller mechanism 105.
Features may be incorporated into the single device housing 101. Examples of such features include one or more cameras or image capture devices 108, or an optional speaker port. In this illustrative embodiment, user interface components 109, 110, 111, which may be buttons, fingerprint sensors, or touch-sensitive surfaces, may also be disposed along a surface of the single device housing 101. While these features are shown disposed on a side surface of the electronic device 100, they may be located elsewhere. In other embodiments, these features may be omitted.
A block diagram schematic 112 of the electronic device 100 is also shown in fig. 1. The block diagram schematic 112 includes one or more electronic components that may be coupled to printed circuit board components disposed within the single device housing 101. Alternatively, the electronic components may be carried by the blade assembly 102. By way of illustration, in one or more embodiments, the electronic components may be positioned under a "backpack" 113 carried by the blade assembly 102.
The components of the block diagram 112 may be electrically coupled together by conductors or buses disposed along one or more printed circuit boards. For example, some components of the block diagram 112 may be configured as first electronic circuitry fixedly located within the single device housing 101, while other components of the block diagram 112 may be configured as second electronic circuitry carried by the blade assembly 102 in the backpack 113. The flexible substrate may then extend from the first electronic circuit in the single device housing 101 to the second electronic circuit carried by the blade assembly 102 in the backpack 113 to electrically couple the first electronic circuit to the second electronic circuit.
The illustrative block diagram 112 of fig. 1 includes many different components. Embodiments of the present disclosure contemplate that the number and arrangement of such components may vary depending on the particular application. Thus, an electronic device configured according to embodiments of the present disclosure may include some components not shown in fig. 1, and other components shown may not be required and thus may be omitted.
In one or more embodiments, the electronic device 100 includes one or more processors 114. In one embodiment, the one or more processors 114 may include an application processor, and optionally one or more auxiliary processors. One or both of the application processor or the auxiliary processor may include one or more processors. One or both of the application processor or the auxiliary processor may be a microprocessor, a set of processing elements, one or more ASICs, programmable logic, or other types of processing devices.
The application processor and the auxiliary processor may operate with various components of the electronic device 100. Each of the application processor and the auxiliary processor may be configured to process and execute executable software code to perform various functions of the electronic device 100. A storage device, such as memory 115, may optionally store executable software code used by the one or more processors 114 during operation.
In one embodiment, the one or more processors 114 are responsible for running the operating system environment of the electronic device 100. The operating system environment may include a kernel and one or more drivers, as well as an application service layer and an application layer. The operating system environment may be configured as executable code that operates on one or more processors or control circuits of the electronic device 100. The application layer may be responsible for executing application service modules. An application service module may support one or more applications or "apps". An application of the application layer may be configured as a client of the application service layer to communicate with a service through an Application Programming Interface (API), message, event, or other inter-process communication interface. Where auxiliary processors are used, they may be used to perform input/output functions, actuate user feedback devices, and the like.
In the illustrative embodiment, the electronic device 100 also includes a communication device 116 that may be configured for wired or wireless communication with one or more other devices or networks. The networks may include a wide area network, a local area network, and/or a personal area network. The communication device 116 may also communicate using wireless technologies such as, but not limited to, peer-to-peer or ad hoc communications such as HomeRF, Bluetooth, and IEEE 802.11, as well as other forms of wireless communication such as infrared technology. The communication device 116 may include wireless communication circuitry; one of a receiver, a transmitter, or a transceiver; and one or more antennas 117.
In one embodiment, one or more processors 114 may be responsible for executing the primary functions of electronic device 100. For example, in one embodiment, the one or more processors 114 include one or more circuits operable with one or more user interface devices, which may include the flexible display 104, to present images, video, or other presentation information to a user. Executable software code used by the one or more processors 114 may be configured as one or more modules 118 operable with the one or more processors 114. Such modules 118 may store instructions, control algorithms, logic steps, and the like.
In one embodiment, the one or more processors 114 may generate commands or perform control operations based on information received from various sensors of the electronic device 100. As shown in fig. 1, these sensors may be classified into physical sensors 120 and context sensors 121.
In general, physical sensor 120 includes a sensor configured to sense or determine a physical parameter indicative of a condition in the environment surrounding electronic device 100. By way of example, the physical sensor 120 may include a device for determining information such as motion, acceleration, orientation, proximity to people and other objects, lighting, capturing images, and the like. Physical sensor 120 may include various combinations of microphones, location detectors, temperature sensors, barometers, proximity sensor components, proximity detector components, health sensors, touch sensors, cameras, audio capturing devices, and the like. Many examples of physical sensors 120 are described below with reference to fig. 6. Other physical sensors will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In contrast, the context sensors 121 do not measure physical conditions or parameters. Rather, they infer context from the data of the electronic device. By way of illustration, when the physical sensors 120 include a camera or smart imager, the context sensors 121 may use the data captured in an image to infer context cues. An emotion detector may be operable to analyze data from a captured image to determine an emotional state. The emotion detector may recognize facial gestures, such as a smile or raised eyebrows, to infer a person's silently conveyed emotional state, e.g., fun, anger, frustration, etc. Other context sensors 121 may analyze other data to infer context, including calendar events, user profiles, device operating states, energy storage within the battery, application data, data from third parties such as web services and social media servers, alerts, time of day, repeated user behavior, and other factors.
The context sensor 121 may be configured as a hardware component or alternatively as a combination of hardware and software components. The context sensor 121 may be configured to collect and analyze non-physical parameter data.
Examples of physical sensors 120 and context sensors 121 are shown in fig. 6 and 7. These examples are merely illustrative, as other physical sensors 120 and context sensors 121 will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
Turning briefly to fig. 6, various examples of physical sensors 120 are illustrated. In one or more embodiments, physical sensor 120 senses or determines a physical parameter indicative of a condition in the environment surrounding the electronic device. Fig. 6 illustrates several examples of physical sensors 120. It should be noted that those shown in fig. 6 are not comprehensive, as other physical sensors will be apparent to those of ordinary skill in the art having the benefit of this disclosure. Additionally, it should be noted that the various physical sensors 120 shown in FIG. 6 may be used alone or in combination. Thus, many electronic devices will employ only a subset of the physical sensors 120 shown in FIG. 6, with the particular subset selected being defined by the device application.
A first example of a physical sensor is a touch sensor 601. Touch sensor 601 may include a capacitive touch sensor, an infrared touch sensor, a resistive touch sensor, or another touch sensitive technology. Capacitive touch sensitive devices include a plurality of capacitive sensors, such as electrodes, disposed along a substrate. Each capacitive sensor in combination with associated control circuitry, such as one or more processors (114), is configured to detect objects that are in close proximity to, or touch, the surface of the display or the housing of the electronic device by establishing electric field lines between pairs of capacitive sensors, and then detecting disturbances of those field lines.
The electric field lines may be established from periodic waveforms such as square waves, sine waves, triangular waves, or other periodic waveforms that are emitted by one sensor and detected by another sensor. The capacitive sensor may be formed, for example, by disposing indium tin oxide patterned as electrodes on a substrate. Indium tin oxide can be used in such a system because it is transparent and conductive. Furthermore, it can be deposited as a thin layer by a printing process. Capacitive sensors may also be deposited on the substrate by electron beam evaporation, physical vapor deposition, or other various sputter deposition techniques.
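The disturbance-detection principle described above can be sketched as follows (a hypothetical illustration with made-up electrode-pair names and an arbitrary threshold; real touch controllers work with calibrated per-electrode baselines and more elaborate filtering):

```python
def detect_touches(readings, baselines, threshold=0.15):
    """Flag electrode pairs whose field coupling deviates from baseline.

    readings/baselines map an electrode pair to a coupling amplitude; an
    object near the surface disturbs the field lines and reduces coupling,
    so a relative drop beyond `threshold` is reported as a touch.
    """
    touched = []
    for pair, value in readings.items():
        baseline = baselines[pair]
        if baseline > 0 and (baseline - value) / baseline > threshold:
            touched.append(pair)
    return touched
```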
Another example of a physical sensor 120 is geolocation, which serves as a location detector 602. In one embodiment, the location detector 602 is operable to determine location data by receiving signals from a constellation of one or more earth-orbiting satellites, or from a network of terrestrial base stations, to determine an approximate location. Examples of satellite positioning systems suitable for use with embodiments of the invention include, among others, the Navigation System with Timing and Ranging (NAVSTAR) Global Positioning System (GPS) in the United States and other similar satellite positioning systems. The location detector 602 may make location determinations autonomously or with assistance from terrestrial base stations, such as those associated with a cellular communication network or other ground-based network, or as part of a Differential Global Positioning System (DGPS), as is well known to those of ordinary skill in the art. The location detector 602 may also be able to determine location by locating or triangulating terrestrial base stations of a conventional cellular network, or from other local area networks such as Wi-Fi networks.
Another physical sensor 120 is a near field communication circuit 603. The near field communication circuit 603 may be included for communicating with a local area network to receive information regarding the context of the environment in which the electronic device is located. By way of example, the near field communication circuit 603 may obtain information such as weather information and location information. For example, if a user is at a museum, they may be standing near an exhibit that can be identified using near field communication. This identification may indicate that the electronic device is both indoors and at the museum. Thus, if the user requests additional information about an artist or painting, there is a higher probability that the request is a device command asking the one or more processors (114) to search for that information with a web browser. Alternatively, the near field communication circuit 603 may be used to receive context information from kiosks and other electronic devices. The near field communication circuit 603 may also be used to obtain images or other data from a social media network. Examples of suitable near field communication circuits include Bluetooth communication circuits, IEEE 802.11 communication circuits, infrared communication circuits, magnetic field modulation circuits, and Wi-Fi circuits.
Another example of a physical sensor 120 is a motion detector 604. By way of example, an accelerometer, gyroscope, or other device may be used as the motion detector 604 in the electronic device. Using an accelerometer as an example, an accelerometer may be included to detect movement of the electronic device. Additionally, the accelerometer may be used to sense some gestures of a user, such as a person talking with their hands, running, or walking.
The motion detector 604 may also be used to determine the spatial orientation of the electronic device by detecting the direction of gravity, as well as the spatial orientation in three dimensions. In addition to, or in lieu of, the accelerometer, an electronic compass may be included to detect the spatial orientation of the electronic device relative to the earth's magnetic field. Similarly, one or more gyroscopes may be included to detect rotational movement of the electronic device.
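As a rough illustration of deriving spatial orientation from the direction of gravity, the sketch below computes tilt angles from accelerometer readings (the axis conventions and function names are assumptions, and the device is assumed to be quasi-static so the measured acceleration is dominated by gravity):

```python
import math

def tilt_angles(ax, ay, az):
    """Estimate pitch and roll (degrees) from an accelerometer's gravity vector.

    ax/ay/az are accelerations along illustrative device axes, normalized so
    that gravity has magnitude 1 when the device is at rest.
    """
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

A device lying flat and face-up would report roughly zero pitch and roll, while rotating it about its long axis drives the roll angle toward ninety degrees.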
Another example of a physical sensor 120 is a force sensor 605. The force sensor may take various forms. For example, in one embodiment, the force sensor includes a resistive switch or force switch array configured to detect contact with a display or housing of the electronic device. The resistive switch array may be used as a force sensing layer because any change in the impedance of the switch may be detected when in contact with the surface of the display or the housing of the electronic device. The switch array may be any of the following: a resistance sensing switch, a membrane switch, a force sensing switch such as a piezoelectric switch, or other equivalent type of technology. In another embodiment, the force sensor may be capacitive. In yet another embodiment, the piezoelectric sensor may also be configured to sense a force. For example, in the case of coupling with a lens of a display, a piezoelectric sensor may be configured to detect an amount of displacement of the lens to determine the force. The piezoelectric sensor may also be configured to determine a force contacting a housing of the electronic device instead of the display.
Another example of a physical sensor 120 is a proximity sensor. Proximity sensors fall into one of two categories: active proximity sensors and "passive" proximity sensors. These are shown in fig. 6 as a proximity detector component 606 and a proximity sensor component 607. The proximity detector component 606 or the proximity sensor component 607 can generally be employed for gesture control and other user interface protocols, some examples of which are described in more detail below.
As used herein, a "proximity sensor component" includes only a signal receiver, without a corresponding transmitter for emitting a signal to be reflected off an object back to the signal receiver. A signal receiver alone may be used because the body of the user, or another heat-generating object external to the device, such as a wearable electronic device worn by the user, serves as the transmitter. By way of illustration, in one embodiment the proximity sensor component 607 includes a signal receiver for receiving signals from objects external to the housing of the electronic device. In one embodiment, the signal receiver is an infrared signal receiver for receiving infrared emissions from an object such as a person when the person is in proximity to the electronic device. In one or more embodiments, the proximity sensor component is configured to receive infrared wavelengths of about 4 to about 10 microns. This wavelength range is advantageous in one or more embodiments because it corresponds to the wavelengths of heat emitted by the human body.
Additionally, detection of wavelengths in this range may be performed from a greater distance than, for example, detection of signals reflected from an emitter of the proximity detector component. In one embodiment, the proximity sensor component 607 has a relatively long detection range, detecting heat emitted from a person when the person is within a predetermined heat-receiving radius. For example, in one or more embodiments the proximity sensor component is capable of detecting a person's body heat from a distance of approximately 10 feet. This ten-foot range can be extended depending on the designed optics, sensor active area, gain, lens gain, and the like.
The proximity sensor component 607 is sometimes referred to as a "passive IR system" due to the fact that the person is the active emitter. Thus, the proximity sensor component 607 requires no transmitter, since the object disposed outside the housing delivers the emissions received by the infrared receiver. Because no transmitter is required, each proximity sensor component 607 can operate at a very low power level.
In one embodiment, the signal receiver of each proximity sensor component 607 may operate at various sensitivity levels such that at least one proximity sensor component 607 is operable to receive infrared emissions from different distances. For example, the one or more processors (114) may cause each proximity sensor component 607 to operate at a first "effective" sensitivity to receive infrared emissions from a first distance. Similarly, the one or more processors (114) may cause each proximity sensor component 607 to operate at a second sensitivity that is less than the first sensitivity to receive infrared emissions from a second distance that is less than the first distance. The sensitivity change may be made by having one or more processors (114) interpret the readings from the proximity sensor component 607 differently.
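The two-sensitivity scheme can be sketched as a single raw reading being interpreted against different thresholds, since the change is made in how the processors interpret the reading rather than in the hardware (the threshold values and names below are purely illustrative):

```python
def person_nearby(ir_reading, sensitivity="far", thresholds=None):
    """Interpret a raw IR proximity reading at two 'effective' sensitivities.

    The same hardware reading is compared against a different threshold
    depending on the sensitivity the processors have selected: the 'far'
    sensitivity accepts weaker emissions (greater distance), while 'near'
    requires a stronger reading. Values are illustrative.
    """
    thresholds = thresholds or {"far": 0.2, "near": 0.6}
    return ir_reading >= thresholds[sensitivity]
```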
In contrast, the proximity detector component 606 includes a signal transmitter and a corresponding signal receiver. While each proximity detector component 606 can be any of various types of proximity sensors, such as, but not limited to, capacitive, magnetic, inductive, optical/optoelectronic, imager, laser, acoustic/acoustic, radar-based, doppler-based, thermal, and radiation-based proximity sensors, in one or more embodiments, the proximity detector component 606 includes an infrared emitter and receiver. In one embodiment, the infrared emitter is configured to emit an infrared signal having a wavelength of about 860 nanometers that is one to two orders of magnitude shorter than the wavelength received by the proximity sensor component. The proximity detector component may have a signal receiver that receives a similar wavelength, i.e., about 860 nanometers.
In one or more embodiments, each proximity detector component 606 can be a collection of infrared proximity sensors that use a signal emitter that emits an infrared beam that is reflected from a nearby object and received by a corresponding signal receiver. For example, the proximity detector component 606 can be used to calculate a distance to any nearby object from characteristics associated with the reflected signal. The reflected signal is detected by a corresponding signal receiver, which may be an infrared photodiode, for detecting reflected Light Emitting Diode (LED) light, responding to a modulated infrared signal, and/or performing triangulation of the received infrared signal.
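As a hedged illustration of computing a distance from characteristics of the reflected signal, the sketch below assumes a simple inverse-square intensity falloff (the constant and the model itself are simplifying assumptions; a real design would calibrate against target reflectivity and modulation scheme):

```python
import math

def estimate_distance_m(received_power, emitted_power, k=1.0):
    """Rough distance estimate from reflected IR intensity.

    Assumes received ~ k * emitted / d**2, i.e. an inverse-square falloff
    of the reflected signal; k is an illustrative calibration constant.
    """
    if received_power <= 0:
        return float("inf")
    return math.sqrt(k * emitted_power / received_power)
```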
Another example of a physical sensor is a moisture detector 608. The moisture detector 608 may be configured to detect an amount of moisture on or around the display or housing of the electronic device. This may indicate various forms of context. Sometimes it may indicate rain or heavy rain in the environment surrounding the electronic device. Thus, if the user frantically asks "Call a cab!", the fact that moisture is present may increase the likelihood that the query is a device command. The moisture detector 608 may be implemented in the form of an impedance sensor that measures impedance between electrodes. Since moisture may be due to external conditions, such as rain, or to user conditions, such as sweat, the moisture detector 608 may operate with an ISFET configured to measure the amount of pH or NaOH in the moisture, or with a galvanic current sensor 609, to determine not only the amount of moisture but also whether the moisture is due to external factors, sweat, or a combination thereof.
The smart imager 610 may be configured to capture an image of an object and determine whether the object matches predetermined criteria. For example, the smart imager 610 may operate as an identification module configured with optical recognition, including image recognition, character recognition, visual recognition, facial recognition, color recognition, shape recognition, and the like. Advantageously, the smart imager 610 may be used as a facial recognition device to determine the identity of one or more persons detected about the electronic device.
For example, in one embodiment, when the one or more proximity sensor components 607 detect a person, the smart imager 610 may capture a photograph of that person. The smart imager 610 may then compare the image to a reference file stored in memory (115) to confirm whether the person's face sufficiently matches the reference file beyond a threshold probability of authenticity. Advantageously, the optical recognition allows the one or more processors (114) to perform a control operation only when one of the persons detected about the electronic device is sufficiently identified as the owner of the electronic device.
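The threshold-based matching step might be sketched as follows (the descriptor representation, the cosine-similarity score, and the threshold are illustrative assumptions standing in for whatever facial-recognition pipeline a real device would use):

```python
def face_matches(probe, reference, threshold=0.8):
    """Compare a probe face descriptor to a stored reference descriptor.

    probe/reference are feature vectors; cosine similarity serves as an
    illustrative match score, and a control operation would proceed only
    when the score meets or exceeds `threshold`.
    """
    dot = sum(p * r for p, r in zip(probe, reference))
    norm_p = sum(p * p for p in probe) ** 0.5
    norm_r = sum(r * r for r in reference) ** 0.5
    if norm_p == 0 or norm_r == 0:
        return False
    return dot / (norm_p * norm_r) >= threshold
```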
The smart imager 610 may operate in ways other than taking photographs. For example, in some embodiments the smart imager 610 may capture multiple sequential images to capture more information that may be used to determine social cues. Alternatively, the smart imager 610 may capture video or individual video frames, with or without metadata such as motion vectors. This additional information captured by the smart imager 610 may be used to detect richer social cues that can be inferred from the captured data.
Barometer 611 may sense changes in air pressure due to environmental and/or weather changes. In one embodiment, the barometer 611 includes a cantilever mechanism made of piezoelectric material and disposed within the chamber. The cantilever mechanism operates as a pressure sensitive valve, flexing as the pressure differential between the chamber and the environment changes. When the pressure difference between the chamber and the environment is zero, the deflection of the cantilever ceases. Since the cantilever material is piezoelectric, the deflection of the material can be measured with an electrical current.
The gaze detector 612 may include sensors for detecting a user's gaze point. The gaze detector 612 may optionally include sensors for detecting the alignment of the user's head in three-dimensional space. Electronic signals may then be delivered from the sensors to a gaze detection process for computing the direction of the user's gaze in three-dimensional space. The gaze detector 612 may also be configured to detect a gaze cone corresponding to the detected gaze direction, which is the field of view within which the user may easily see without diverting their eyes or head from the detected gaze direction. Alternatively, the gaze detector 612 may be configured to estimate gaze direction by inputting images representing photographs of a selected area near or around the eyes into a gaze detection process. It will be apparent to those of ordinary skill in the art having the benefit of this disclosure that these techniques are merely illustrative, as other modes of detecting gaze direction may be substituted in the gaze detector 612 of fig. 6.
The light sensor 613 can detect changes in light intensity, color, light, or shade in the environment of the electronic device. This may be used to make inferences about context, such as weather or other cues. For example, if the light sensor 613 detects a low-light condition in the middle of the day while the location detector 602 indicates that the electronic device is outside, this may be due to cloudy conditions, fog, or haze. An infrared sensor may be used in conjunction with, or in place of, the light sensor 613. The infrared sensor may be configured to detect thermal emissions from the environment surrounding the electronic device. For example, where the infrared sensor detects warm daytime heat but the light sensor detects a low-light condition, this may indicate that the electronic device is in a room where the air conditioning is not properly set. Similarly, the temperature sensor 614 may be configured to monitor the temperature around the electronic device.
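The light-plus-location inference above could be sketched like this (the lux thresholds, hour window, and condition labels are illustrative assumptions, not values from the disclosure):

```python
def infer_sky_condition(lux, is_outdoors, hour):
    """Infer a coarse sky condition from light level, location, and time.

    A low light reading outdoors in the middle of the day is taken as a
    hint of cloud, fog, or haze; thresholds and labels are illustrative.
    """
    if not is_outdoors:
        return "indoors"
    if 10 <= hour <= 15 and lux < 5000:
        return "overcast-or-fog"
    return "clear-or-unknown"
```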
The physical sensor 120 may also include an audio capture device 615. In one embodiment, the audio capture device 615 includes one or more microphones for receiving acoustic input. While one or more microphones may be used to sense voice inputs, voice commands, and other audio inputs, in some embodiments they may be used as environmental sensors to sense environmental sounds such as rain, wind, and the like.
In one embodiment, the one or more microphones comprise a single microphone. However, in other embodiments, the one or more microphones may include two or more microphones. Where multiple microphones are included, they may be used for selective beam steering, for example, to determine from which direction sound emanates. By way of illustration, a first microphone may be located on a first side of the electronic device for receiving audio input from a first direction, and a second microphone may be located on a second side of the electronic device for receiving audio input from a second direction. One or more processors (114) may then select between the first microphone and the second microphone to beam steer audio reception toward the user. Alternatively, one or more processors (114) may process and combine signals from two or more microphones to perform beam steering.
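The selection between microphones can be sketched as a simple energy comparison (a minimal stand-in for true beam steering, which would combine the signals with per-microphone delays; the microphone names and frame representation are illustrative):

```python
def select_microphone(frames_by_mic):
    """Pick the microphone whose recent frames carry the most signal energy.

    frames_by_mic maps a mic id ('left', 'right', ...) to a list of audio
    samples; the mic facing the sound source tends to capture more energy,
    so reception is steered toward it by selecting that mic's signal.
    """
    def energy(samples):
        return sum(s * s for s in samples)

    return max(frames_by_mic, key=lambda mic: energy(frames_by_mic[mic]))
```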
In one embodiment, the audio capture device 615 includes an "always on" audio capture device. In this way, the audio capturing device 615 is able to capture audio input whenever the electronic device is operating. As described above, in one or more embodiments, one or more processors, which may include a digital signal processor, may identify whether one or more device commands are present in the audio input captured by the audio capture device 615.
Another example of a physical sensor 120 is a hygrometer 616. Hygrometer 616 may be used to detect humidity, which may indicate that the user is outdoors or is sweating. As noted above, the illustrative physical sensor of fig. 6 is not comprehensive. Many other physical sensors may be added. For example, a wind speed monitor may be included to detect wind. Accordingly, the physical sensor 120 of fig. 6 is merely illustrative, as many other sensors will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
Turning briefly now to fig. 7, various examples of context sensors 121 are illustrated. As with fig. 6, the example shown in fig. 7 does not constitute a comprehensive list. Many other context sensors 121 will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In one embodiment, a mood detector 701 may infer a person's mood based on contextual information received from the physical sensors (120). For example, if the smart imager (610) captures a picture, a plurality of consecutive pictures, video, or other information from which a person can be identified as the owner of the electronic device, and whether she is crying in that picture, plurality of consecutive pictures, video, or other information, the mood detector 701 may infer whether she is happy or sad. Similarly, if the audio capture device captures the user's voice and the user is shouting or cursing, the mood detector 701 may infer that the user is likely angry or upset.
The emotion detector 702 may operate in a similar manner to infer a person's emotional state from contextual information received from the physical sensors (120). By way of example, if the smart imager (610) captures a picture, a plurality of consecutive pictures, video, or other information relating to the owner of the electronic device, the emotion detector 702 may infer their silently conveyed emotional state, e.g., fun, anger, frustration, etc. This may be inferred from facial gestures, for example raised eyebrows, exposed teeth, a laugh, or other features. In one or more embodiments, such emotional cues may indicate that the user intends to issue a command to the electronic device. Alternatively, emotion may be detected from changes in voice or the words used. For example, if someone is screaming "I am mad at you", a negative emotional state may be inferred.
Calendar information and events 720 may be used to detect social cues. For example, if a calendar event indicates that a birthday party is occurring, this may suggest a happy and pleasant social cue. However, if a funeral is occurring, it is unlikely that the user would issue a device command to the electronic device, as funerals are generally quiet affairs.
Health information 703 may be used to detect social cues. For example, if the health information 703 indicates that the person's heart rate is high and they are sweating, the location information 715 indicates that the person is in an alley of a city, and the time of day information 708 indicates that it is 3:00 AM, then the person may be at risk. Thus, the command "Call 911" is likely to be a device command.
Alarm information 704 may be used to detect social cues. If an alarm has just sounded at 6:00 AM, the command "snooze" may be a device command. Personal identification information 705 may also be used to detect social cues. If the person is diabetic and a health sensor shows that they are clammy and perspiring heavily, this may be due to low insulin. Thus, the command "Call 911" is likely to be a device command.
The device usage data 706 may indicate social cues. If a person is browsing the web and an incoming call is received, the command "reject" may be a device command. The energy storage 707 within the electronic device can also be used to indicate social cues, and the device operating mode information 709 may be used in a similar manner. When the energy storage drops to, for example, ten percent, the command "shut down all non-critical apps" may be a device command.
Consumer purchase information 711 may positively indicate a social cue. For example, if a person is a wine enthusiast and often purchases wine, the command "buy that wine now" may be a device command when the person is viewing a web browser and finds a bottle of 1982 Lafite for under $1,000.
The device usage profile 712 may also be used to infer social cues. For example, if a person never uses the electronic device between 10:00 PM and 6:00 AM because they are sleeping, and they talk in their sleep and say "order a pizza, I'm starving," this is not likely a device command.
Organizations may have formal rules and policies 710, such as that meetings cannot last more than one hour without interruption, that people must take lunch breaks between noon and 2:00 PM, and that a brainstorming session occurs between 9:00 AM and 10:00 AM every morning. Similarly, a household may have similar rules and policies 713, such as dinner occurring between 6:00 and 7:00 PM. This information may be used to infer social cues, such as whether a person is likely to be talking to other people. When this is the case, a verbal question is unlikely to be a device command. In contrast, when the user is likely to be alone, a verbal command is more likely to be a device command.
The application data 734 may indicate social cues. If a person interacts with a word processing application often during the day, the commands "cut" and "paste" are more likely to be needed device commands than they would be for someone who instead plays a video game involving birds. The device settings 716 may also indicate social cues. If a user has set their electronic device to alarm mode, it is likely that they are sleeping and will not issue device commands.
Social media information 718 may indicate social cues. For example, in one embodiment, information relating to multi-modal social cues from the environment around the electronic device may be inferred by retrieving information from a social media server. For example, a real-time search of a social media service, which may be a keyword search, an image search, or another search, may find images, posts, and comments related to the location determined by the location information 715. Images posted on a social media service server that were taken at the same location may reveal multi-modal social cues. Alternatively, comments about the location may suggest social cues. Information from a third-party server 717 may also be used in this manner.
Yet another example of a context sensor 121 is repetitive behavior information 719. For example, if a person always stops at a coffee shop between 8:00 and 8:15 AM on their way to work, the command "Pay for the coffee" may be a device command. As with the physical sensors of fig. 6, the context sensors of fig. 7 do not form a comprehensive list. The context sensor 121 may be any type of device that infers context from data of the electronic device. The context sensor 121 may be configured as a hardware component or, alternatively, as a combination of hardware and software components. The context sensor 121 may analyze information, for example, not only to detect the user, but also to determine the social cues and moods of other people in the vicinity of the electronic device, further informing inferences about the user's intent and about which executable control commands are appropriate given the compound social context.
The context sensor 121 may be configured to collect and analyze non-physical parameter data. Although some examples are illustrated in fig. 7, many others may be added. Accordingly, the context sensors 121 of fig. 7 are merely illustrative, as many other sensors will be apparent to those of ordinary skill in the art having the benefit of this disclosure. It should be noted that the physical sensors (120) and the context sensors 121, used alone or in combination, may be cascaded in a predetermined order to detect multiple multi-modal social cues to determine whether a device command is intended for the electronic device.
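While the disclosure does not prescribe an implementation, the cascaded evaluation described above can be sketched as follows; the check functions, context keys, and the two-cue threshold are illustrative assumptions, not part of the patent:

```python
from typing import Callable, Optional

# Hypothetical cue check: returns the name of a detected social cue, or None.
CueCheck = Callable[[dict], Optional[str]]

def cascade_sensors(checks: list, context: dict, required_cues: int = 2) -> bool:
    """Run cue checks in a predetermined order; conclude that a device
    command is intended once enough multi-modal cues corroborate it."""
    detected = []
    for check in checks:
        cue = check(context)
        if cue is not None:
            detected.append(cue)
        if len(detected) >= required_cues:
            return True  # enough corroborating cues: treat speech as a command
    return False

# Example checks loosely modeled on the cues discussed above.
def user_alone(ctx): return "alone" if ctx.get("people_nearby", 1) == 1 else None
def device_active(ctx): return "active" if ctx.get("screen_on") else None
def gazing_at_device(ctx): return "gaze" if ctx.get("gaze_on_device") else None

checks = [user_alone, device_active, gazing_at_device]
print(cascade_sensors(checks, {"people_nearby": 1, "screen_on": True}))  # True
print(cascade_sensors(checks, {"people_nearby": 3}))                     # False
```

Ordering the checks from cheapest to most expensive lets the cascade short-circuit before invoking costly sensors such as the imager.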
Returning now to FIG. 1, in one or more embodiments, the heuristic sensor processor 119 may operate with the physical sensors 120 and the context sensors 121 to detect, infer, capture, and otherwise determine when multi-modal social cues are present in the environment around the electronic device. In one embodiment, the heuristic sensor processor 119 determines an assessed context from one or both of the physical sensors 120 or the context sensors 121 using adjustable algorithms that employ contextual assessment of information, data, and events. These assessments may be learned through repeated data analysis. Alternatively, the user may employ the user interface of the electronic device 100 to enter various parameters, constructs, rules, and/or paradigms that instruct or otherwise direct the heuristic sensor processor 119 to detect multi-modal social cues, emotional states, moods, and other contextual information. In one or more embodiments, the heuristic sensor processor 119 may include an artificial neural network or other similar technology.
In one or more embodiments, the heuristic sensor processor 119 may operate with the one or more processors 114. In some embodiments, one or more processors 114 may control heuristic sensor processor 119. In other embodiments, heuristic sensor processor 119 may operate independently to communicate information collected from detecting multi-modal social cues, emotional states, moods, and other contextual information to one or more processors 114. Heuristic sensor processor 119 may receive data from one or both of physical sensor 120 or context sensor 121. In one or more embodiments, the one or more processors 114 are configured to perform the operations of the heuristic sensor processor 119.
In one or more embodiments, the block diagram 112 includes a voice interface engine 122. In one embodiment, the voice interface engine 122 may include hardware, executable code, and speech monitor executable code. The voice interface engine 122 may include, stored in the memory 115, a basic speech model, a trained speech model, or other modules that the voice interface engine 122 uses to receive and recognize voice commands in audio input captured by an audio capture device. In one embodiment, the voice interface engine 122 may include a voice recognition engine. Regardless of the particular implementation used in the various embodiments, the voice interface engine 122 may access various speech models to recognize voice commands.
In one embodiment, the voice interface engine 122 is configured to implement voice control features that allow a user to speak a specific device command to cause the one or more processors 114 to perform a control operation. For example, the user may say, "How tall is the Willis Tower?" This device command may cause the one or more processors 114 to access an application module, such as a web browser, to search for the answer, which is then delivered as audible output via an audio output of the other components 124. In short, in one embodiment, the voice interface engine 122 listens for voice commands, processes the commands, and, in conjunction with the one or more processors 114, returns an output that fulfills the user's intent.
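As a hedged illustration of this listen-process-respond flow (the function name, normalization, and command vocabulary are assumptions, not the actual voice interface engine 122), recognized speech might be matched against known device commands before being routed to an application module as a query:

```python
# Illustrative command vocabulary drawn from the examples in this disclosure.
DEVICE_COMMANDS = {"snooze", "call 911", "reject", "pay for the coffee"}

def handle_utterance(text: str) -> str:
    """Return the action taken for recognized speech: execute a direct
    device command, or route the utterance to a search application."""
    normalized = text.strip().lower().rstrip("?!.")
    if normalized in DEVICE_COMMANDS:
        return f"execute:{normalized}"
    # Not a direct control command: pass the question to an application
    # module such as a web browser, then deliver the answer audibly.
    return f"search:{normalized}"

print(handle_utterance("Snooze"))                         # execute:snooze
print(handle_utterance("How tall is the Willis Tower?"))  # search:how tall is the willis tower
```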
The block diagram 112 may also include an image/gaze detection processing engine 123. The image/gaze detection processing engine 123 may operate with a physical sensor 120, such as a camera or smart imager, to process information to detect a user's gaze point. The image/gaze detection processing engine 123 may optionally include sensors for detecting the alignment of the user's head in three-dimensional space. Electrical signals may then be delivered from the sensors to the image/gaze detection processing engine 123 to compute the direction of the user's gaze in three-dimensional space. The image/gaze detection processing engine 123 may also be configured to detect a gaze cone corresponding to the detected gaze direction, i.e., the field of view within which the user can easily see without diverting their eyes or head from the detected gaze direction. The image/gaze detection processing engine 123 may alternatively be configured to estimate the gaze direction by inputting images representing photographs of selected areas near or around the eyes.
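A minimal sketch of a gaze cone membership test, assuming gaze and target directions are represented as 3-D vectors and using an illustrative cone half-angle (the disclosure does not specify one):

```python
import math

def within_gaze_cone(gaze_dir, target_dir, cone_half_angle_deg=15.0):
    """Check whether a target direction falls inside the gaze cone, i.e.,
    the field of view around the detected gaze direction."""
    dot = sum(g * t for g, t in zip(gaze_dir, target_dir))
    norm = (math.sqrt(sum(g * g for g in gaze_dir))
            * math.sqrt(sum(t * t for t in target_dir)))
    # Clamp before acos to guard against floating-point drift.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= cone_half_angle_deg

# Looking straight ahead (+z); device slightly off-axis vs. fully off-axis.
print(within_gaze_cone((0, 0, 1), (0.1, 0, 1)))  # True  (about 5.7 degrees)
print(within_gaze_cone((0, 0, 1), (1, 0, 0)))    # False (90 degrees)
```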
The one or more processors 114 may also generate commands or perform control operations based on information received from a combination of the physical sensors 120, the context sensors 121, the flexible display 104, the other components 124, and/or other input devices. Alternatively, the one or more processors 114 may generate commands or perform control operations based solely on information received from one or more sensors or from the flexible display 104. Further, the one or more processors 114 may process the received information alone or in combination with other data, such as information stored in the memory 115.
Other components 124 operable with the one or more processors 114 may include output components such as video output, audio output, and/or mechanical output. Examples of output components include audio outputs such as speaker ports, earphone speakers, or other annunciators and/or buzzers and/or mechanical output components such as vibration mechanisms or motion-based mechanisms. Still other components will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
As described above, in one or more embodiments, the blade assembly 102 is coupled to the flexible display 104. In contrast to a sliding device that includes multiple device housings, the electronic device 100 of fig. 1 includes a single device housing 101 to which the blade assembly 102 is coupled. The blade assembly 102 is configured as a mechanical chassis that allows the flexible display 104 to translate along a translation surface defined by the major and minor surfaces of the single device housing 101. In one or more embodiments, when the blade assembly 102 and flexible display 104 are in the extended position shown in fig. 1, the blade assembly 102 also provides mechanical support for the portion 130 of the flexible display 104 that extends beyond the top edge 131 of the single device housing 101. When the display roller mechanism 105 is actuated, it translates 301 the blade assembly 102 and flexible display 104 along the rear major surface 103, the bottom minor surface, and the front major surface between the extended position shown in fig. 1, the retracted position shown in fig. 3, and the peeping position shown in fig. 5.
The blade assembly 102 may include a blade substrate 125, the blade substrate 125 including a flexible portion and a rigid portion and being positioned between the flexible display 104 and the translation surface defined by the single device housing 101. The blade substrate 125 may also include a silicone border 127 that surrounds and protects the edges of the flexible display 104. In one or more embodiments, the blade substrate 125 includes a steel support plate with the silicone border 127 co-molded around the perimeter of the steel support plate. In one or more embodiments, the low friction dynamic bending lamination stack 128 and the blade 126 are positioned between the blade assembly 102 and the translation surface defined by the single device housing 101.
In one or more embodiments, the blade substrate 125 is partially rigid and partially flexible. By way of illustration, the portions of the blade substrate 125 that slide along the major surfaces of the single device housing 101 are configured to be substantially rigid, while the portions of the blade substrate 125 that pass around the minor surfaces of the single device housing 101 are configured to be flexible so that they can curl around those minor surfaces. In one or more embodiments, some portions of the blade substrate 125 abut the translation surface defined by the single device housing 101, while other portions abut the display roller mechanism 105, which in this illustrative embodiment is positioned at the bottom minor surface of the single device housing 101.
In one or more embodiments, the blade 126 and the low friction dynamic bending lamination stack 128 are positioned between the blade assembly 102 and the translation surface defined by the single device housing 101. When the blade assembly 102 transitions to the extended position shown in fig. 1, the blade 126 supports the portions of the blade assembly 102 and the flexible display 104 that extend beyond the top edge 131 of the single device housing 101. Since the blade 126 needs to be rigid to support those portions of the blade assembly 102 and flexible display 104, it cannot bend around the display roller mechanism 105. To prevent a gap or step from occurring where the blade 126 terminates, in one or more embodiments the low friction dynamic bending lamination stack 128 spans the remainder of the blade assembly 102 and abuts the translation surface defined by the single device housing 101.
The blade assembly 102 may be fixedly coupled to the flexible display 104 by an adhesive or other coupling mechanism, with the blade substrate 132 defining both rigid and flexible portions. The blade substrate 132 may define a first rigid section extending along a major surface of the single device housing 101 and a second flexible portion configured to extend around the minor surface of the single device housing 101 where the display roller mechanism 105 is positioned.
In one or more embodiments, the blade assembly 102 defines a mechanical assembly that provides a slider frame allowing the flexible display 104 to move between the extended position of fig. 1, the retracted position of fig. 3, and the peeping position of fig. 5. As used herein, the term "frame" takes its plain English meaning of a mechanical support structure that carries the other components coupled to the slider frame. These components may include the blade 126, the silicone border 127, and the low friction dynamic bending lamination stack 128. Other components may be included as well. By way of example, these may include electronic circuitry for powering the flexible display 104. In one or more embodiments, they may also include a tensioner that ensures the flexible display 104 remains flat against the single device housing 101 when translating.
In one or more embodiments, the display roller mechanism 105 causes a first portion of the blade assembly 102 and flexible display 104 (shown on the rear side of the electronic device 100 in fig. 1) and a second portion of the blade assembly 102 and flexible display 104 (positioned on the front side of the electronic device 100 in fig. 1) to slide symmetrically in opposite directions along the translation surface defined by the single device housing 101.
Thus, the electronic device 100 of fig. 1 includes a single device housing 101 with a flexible display 104 incorporated into the blade assembly 102. The blade assembly 102 is in turn coupled to a translation mechanism, defined by the display roller mechanism 105, that is located within the single device housing 101. In the illustrative embodiment of fig. 1, the display roller mechanism 105 is located at the bottom edge of the single device housing 101.
In one or more embodiments, in response to receipt of a swipe gesture, described in more detail below with reference to figs. 23-27, the translation mechanism defined by the display roller mechanism 105 is operable to transition the blade assembly 102 around the surfaces of the single device housing 101 between the extended position of fig. 1, in which the blade 126 of the blade assembly 102 extends distally from the single device housing 101, and the retracted position (shown in fig. 3), in which the blade assembly 102 abuts the single device housing 101 with the flexible display 104 wrapping around its surfaces. The translation mechanism may also optionally translate the blade assembly 102 to a "peeping" position (shown in fig. 5), in which movement of the translation mechanism defined by the display roller mechanism 105 causes the blade assembly 102 to reveal an image capture device positioned beneath the blade assembly 102 on the front of the single device housing 101.
In other embodiments, as will be described below, translation of the blade assembly 102 may be initiated by operation of the user interface component 110. Embodiments of the present disclosure contemplate that in such an electronic device 100, manual actuation of the user interface component 110 potentially delays the usability of the electronic device 100 in its new state, due to the time it takes to manually actuate the user interface component 110 to trigger the transition of the blade assembly 102 and the flexible display 104.
Advantageously, embodiments of the present disclosure provide systems and methods for automatically and preemptively moving the flexible display 104 to an optimal state based on swipe gestures delivered to the flexible display 104, rather than requiring operation of the user interface component 110. By way of illustration, in one or more embodiments, the one or more processors 114 of the electronic device 100 use the flexible display 104 to detect user input defining a swipe gesture. In one or more embodiments, the one or more processors 114 then cause the translation mechanism to translate the blade assembly 102 toward the extended position or the retracted position, depending on the direction of the swipe gesture. In one or more embodiments, translation in response to the swipe gesture occurs only when the electronic device 100 is in an unlocked state when the swipe gesture is received.
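A simplified sketch of this dispatch logic, with illustrative names (the actual translation mechanism control is hardware-specific and not given in the disclosure): the swipe direction selects the target position, and gestures are ignored while the device is locked:

```python
EXTENDED, RETRACTED = "extended", "retracted"

class TranslationController:
    """Illustrative controller for a blade assembly driven by swipe gestures."""

    def __init__(self):
        self.position = RETRACTED
        self.unlocked = False

    def on_swipe(self, direction: str) -> str:
        """Translate the blade assembly only while the device is unlocked;
        the swipe direction selects the target position."""
        if not self.unlocked:
            return self.position  # locked: ignore the gesture
        if direction == "up":
            self.position = EXTENDED
        elif direction == "down":
            self.position = RETRACTED
        return self.position

ctl = TranslationController()
print(ctl.on_swipe("up"))    # retracted (device still locked)
ctl.unlocked = True
print(ctl.on_swipe("up"))    # extended
print(ctl.on_swipe("down"))  # retracted
```

A reverse swipe simply drives the same state machine back toward the retracted position, matching the symmetric behavior described later in this section.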
After such translation, the one or more processors 114 may optionally present new content. In one or more embodiments, upon receiving the user input, the one or more processors 114 present a home screen on the front portion of the flexible display 104 and, after the translation, may present additional content on the front area of the flexible display 104 revealed by the translation. In one or more embodiments, the additional content includes an application tray presenting, on the revealed front area, one or more user actuation targets corresponding to applications.
The one or more user actuation targets may define a collection of applications placed within the application tray. In one or more embodiments, the set of applications may include a set of user-defined applications. In other embodiments, the application set may include a set of frequently used applications. In still other embodiments, the application set may include all applications available for operation on the one or more processors 114.
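The three application-set embodiments above can be illustrated with a small selection function; the mode names, parameters, and ranking scheme are assumptions for illustration only:

```python
def tray_applications(mode, installed, user_defined=(), usage_counts=None, top_n=4):
    """Select the set of applications shown in the revealed application tray."""
    if mode == "user":
        # User-defined set, restricted to applications actually installed.
        return [a for a in user_defined if a in installed]
    if mode == "frequent":
        # Frequently used set: rank by usage count, keep the top entries.
        counts = usage_counts or {}
        return sorted(installed, key=lambda a: counts.get(a, 0), reverse=True)[:top_n]
    # Default: all applications available for operation.
    return list(installed)

apps = ["mail", "camera", "maps", "music", "notes"]
print(tray_applications("user", apps, user_defined=["camera", "notes"]))
print(tray_applications("frequent", apps, usage_counts={"mail": 9, "maps": 5}, top_n=2))
```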
In other embodiments, the one or more processors 114 may cause the blade assembly 102 and the flexible display 104 to transition to the extended position when the user delivers a swipe gesture while an application is operating in the foreground. By way of illustration, a user may deliver a swipe gesture in an email application to open an editor, such as to compose an email or a text message.
Translation of the blade assembly 102 and flexible display 104 to the retracted position may occur in a similar manner. In one or more embodiments, when the user delivers additional user input defining another swipe gesture in the direction opposite the first swipe gesture, the one or more processors 114 of the electronic device 100 can automatically translate the blade assembly 102 and the flexible display 104 back toward the retracted position.
Advantageously, embodiments of the present disclosure provide for intuitive operation of a translating display in the electronic device 100. Where automatic translation of the translating display is triggered, the only user action required to change the position of the display is a simple swipe gesture. Thereafter, the device automatically moves to the position the user likely desires.
In one or more embodiments, the one or more processors 114 utilize this capability to automatically resize the visible display area on the front of the electronic device 100 to accommodate the presentation of additional content items not previously presented on the flexible display 104. Once the additional area of the flexible display 104 is revealed by the translation, the one or more processors 114 may then present the content on the flexible display 104. Advantageously, this ability to open an application tray, provide additional content or user interface controls in an active foreground application, or simply accommodate the presentation of additional content by changing the position of the blade assembly 102 relative to the single device housing 101 in response to a swipe gesture ensures that the user easily accesses desired content without the clutter of unwanted content.
In one or more embodiments, applications operating on the one or more processors 114 may identify characteristics of their operating contexts to the one or more processors 114, with the one or more processors 114 causing the display roller mechanism 105 to move the blade assembly 102 in response to the swipe gesture to accommodate content having those characteristics. Advantageously, this allows content to be presented according to the best user experience for which a given application was designed.
Thus, in one or more embodiments, the one or more processors 114 facilitate automatic optimal display sizing in response to a swipe gesture. The one or more processors 114 may then cause the display roller mechanism 105 to adjust the blade assembly 102 such that the front portion of the flexible display 104 is optimally sized for the relevant content indicated by the application operating in the foreground.
As shown in fig. 1, the blade assembly 102 is capable of sliding around the single device housing 101 such that the blade 126 slides away from the single device housing 101, changing the overall length of the flexible display 104 surface visible from the front of the electronic device 100. In contrast, in other states (such as the state shown in fig. 3), the blade assembly 102 may slide around the single device housing 101 in the opposite direction to a retracted position, where similar amounts of the flexible display 104 are visible on the front side and the back side of the electronic device 100.
In fig. 1, the electronic device 100 includes a single device housing 101 having a blade assembly 102, the blade assembly 102 coupled to both major surfaces of the single device housing 101 and surrounding at least one minor surface of the electronic device 100 where a display roller mechanism 105 is located. This allows the blade assembly 102 to slide relative to the single device housing 101 between the retracted position of fig. 3, the extended position of fig. 1, and the peeping position of fig. 5, which reveals the front image capture device.
It is to be understood that fig. 1 is provided for illustrative purposes only, for illustrating the components of one electronic device 100 according to embodiments of the present disclosure, and is not intended to be a complete schematic diagram of the various components required for an electronic device. Thus, other electronic devices according to embodiments of the present disclosure may include various other components not shown in fig. 1, may combine two or more components into one, or may divide a particular component into two or more separate components, and still be within the scope of the present disclosure.
Turning now to fig. 2, the electronic device 100 is illustrated in the extended position 200, which is also shown in fig. 1. In the extended position 200, the blade (126) has slid outwardly and away from the single device housing 101, revealing more of the flexible display 104. In this configuration, the portions of the flexible display 104 that had passed around the display roller mechanism (105) flatten as they pass along the translation surface defined by the front of the single device housing 101.
Turning now to fig. 3-4, an electronic device 100 having a flexible display 104 in a retracted position 300 is illustrated. Fig. 3 illustrates the front side of the electronic device 100, while fig. 4 illustrates the back side.
In this state, the blade (126) slides rearward and then along the translation surface defined by the single device housing 101. This shortens the overall length of the flexible display 104 surface as more and more of the flexible display 104 passes around the display roller mechanism (105) positioned at the bottom of the single device housing 101 and spans the translation surface defined by the rear side of the single device housing 101.
Turning now to fig. 5, the electronic device 100 is illustrated with the flexible display in the peeping position 500. When in the peeping position, the blade assembly 102 and flexible display 104 have translated past the retracted position (300) of fig. 3. In one or more embodiments, when this occurs, the blade assembly 102 and flexible display 104 reveal an image capture device 501 that is positioned beneath the blade assembly 102 and flexible display 104 when they are in the retracted position (300) of fig. 3. In the illustrative embodiment, a speaker 502 is also revealed.
Advantageously, by positioning the image capture device 501 beneath the blade assembly 102 and flexible display 104 when these components are in the retracted position (300) of figs. 3-4 or the extended position (200) of fig. 2, the privacy of the user of the electronic device 100 is ensured because the image capture device 501 cannot see through the blade (126) of the blade assembly 102. Thus, even if the electronic device 100 is compromised by a hacker or other bad actor, the user can be assured that the image capture device 501 cannot capture images or video when the blade assembly 102 and flexible display 104 are in the retracted position (300), the extended position (200), or a position in between. The image capture device 501 can capture front-facing images or video only when the blade assembly 102 and flexible display 104 transition to the peeping position 500, thereby revealing the image capture device 501.
Referring collectively to figs. 2-5, it can be seen that the electronic device 100 includes a single device housing 101 in which the flexible display 104 is incorporated into the blade assembly 102. The blade assembly 102 is coupled to a translation mechanism located within the single device housing 101.
In response to actuation of a user interface device, one example of which is a button positioned along one side of the single device housing 101, or alternatively in response to a swipe gesture as described below with reference to figs. 23-27, the translation mechanism is operable to transition the blade assembly 102 around the surfaces of the single device housing 101 between an extended position 200, a retracted position 300, optionally a peeping position 500, or positions in between. In the extended position 200, the blade (126) of the blade assembly 102 extends distally from the single device housing 101. In the retracted position 300, the blade assembly 102 abuts the single device housing 101, with the flexible display 104 and the blade assembly 102 wrapping around the surfaces of the single device housing 101. In the peeping position 500, movement of the translation mechanism causes the blade assembly 102 to reveal an image capture device 501 (and, in this example, a speaker 502) located beneath the blade assembly 102. A position in between might be used, for example, when the one or more processors (114) of the electronic device 100 attempt to accommodate a presentation corresponding to the opening of an application tray on the flexible display 104.
Another feature that can be seen by viewing figs. 2-5 collectively is how the presentation of content varies depending on the position of the blade assembly 102. Embodiments of the present disclosure contemplate that the position of the blade assembly 102 and flexible display 104 relative to the single device housing 101 changes how much of the flexible display 104 is visible from the front, from the back, and at the curved end. In other words, the visible size of the flexible display 104 from each side of the electronic device 100 varies depending on the position of the blade assembly 102 relative to the single device housing 101. Advantageously, embodiments of the present disclosure provide applications, methods, and systems for dynamically resizing and adjusting interface layouts and content presentations.
This may be accomplished by sizing a main viewable portion of the flexible display 104, such as the front portion shown in figs. 2, 3, and 5. An application may be windowed over this main area of the flexible display 104, and the window resizes with the flexible display 104 as it transitions between the extended position 200 of fig. 2, the retracted position 300 of figs. 3-4, and the peeping position 500 of fig. 5.
In figs. 2-5, the one or more processors (114) of the electronic device 100 divide the flexible display 104 into three separate usable portions. These include the front portion of the flexible display 104 shown in figs. 2, 3, and 5, the rear portion of the flexible display 104 shown in fig. 5, and the curved portion of the flexible display 104 at the bottom of the electronic device 100 wrapping around the roller, as shown in figs. 2-5. This curved portion of the flexible display 104 is sometimes referred to as the "curled" portion of the display.
In one or more embodiments, each of these usable portions is dynamically remapped when the flexible display 104 changes position relative to the single device housing 101. In one or more embodiments, an application may request a window on the usable portion on which it intends to present content.
In one or more embodiments, when the flexible display 104 translates along the single device housing 101 from the extended position 200 shown in fig. 2 to the retracted position 300 shown in figs. 3-4 or the peeping position 500 of fig. 5, the orientation of the rear portion and the curled portion differs from the orientation of the front portion. To address this, as can be seen by comparing figs. 3-4, in one or more embodiments the content presented on the rear portion is rotated 180 degrees so that the "up" side of the rear portion matches the "up" side of the front portion.
In one or more embodiments, the orientation of the content presented on the curled portion may vary based on the orientation of the electronic device 100. For example, if the front side is up, the content presented on the curled portion will have a first orientation. In contrast, if the back side is up, the same content presented on the curled portion will have a second orientation rotated 180 degrees relative to the first orientation.
In one or more embodiments, any content presented on the front portion of the flexible display 104 is oriented according to user preferences. In other embodiments, the front portion is oriented according to the orientation of the electronic device 100 in three-dimensional space.
On the curled portion of the translating display, in one or more embodiments, the segment is oriented in the same orientation as the front portion unless the front side of the electronic device 100 faces the negative z-direction in three-dimensional space, i.e., the electronic device 100 is face down, in which case the content is rotated 180 degrees. In one or more embodiments, the curled portion is not subject to user preferences for display orientation or auto-rotate/device orientation.
In one or more embodiments, when the electronic device 100 is held vertically, the content presented on the rear portion of the flexible display 104 is always rotated 180 degrees relative to the content presented on the front portion, as is the case in fig. 3-4. In one or more embodiments, the rear portion is not subject to user preferences for display orientation or auto-rotate/device orientation.
Thus, in one or more embodiments, the one or more processors (114) of the electronic device (100) dynamically remap the plurality of segments of the translating display based on the position of the flexible display 104 relative to the single device housing 101. The one or more processors 114 may independently manage the orientation and rotation of each segment of the flexible display 104, whether it is the front portion, the rear portion, or the curled portion. In one or more embodiments, this management occurs independently based on which side of the electronic device 100 the segment is currently positioned on, in combination with sensor input identifying whether the electronic device 100 is face down or face up.
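The per-segment orientation rules in the preceding paragraphs can be summarized in a small illustrative function. The segment names, the face-down flag, and the user-rotation parameter are assumptions made for this sketch, not an API from this disclosure.

```python
# Illustrative per-segment orientation manager, assuming a sensor flag
# for face-down/face-up and a user rotation preference for the front.

def segment_rotation(segment: str, device_face_down: bool,
                     user_rotation: int = 0) -> int:
    """Return the rotation (degrees) to apply to content on a segment.

    front: follows user preference / auto-rotate.
    rear:  always 180 degrees relative to the front when held vertically.
    curl:  follows the front unless the device is face down, in which
           case it is rotated 180 degrees; ignores user preference.
    """
    if segment == "front":
        return user_rotation % 360
    if segment == "rear":
        return 180
    if segment == "curl":
        return 180 if device_face_down else 0
    raise ValueError(f"unknown segment: {segment}")
```

Managing each segment through one function like this keeps the rear and curled portions insulated from the auto-rotate and user-preference logic that governs the front portion.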
As shown in fig. 2, the blade assembly 102 is operable to slide along the single device housing 101 such that the blade 126 slides away from the single device housing 101 to change the overall length of the flexible display 104 as viewed from the front of the electronic device 100. As will be described below with reference to fig. 23-27, in one or more embodiments this occurs in response to a swipe gesture. As shown in fig. 3-4, the blade assembly 102 may optionally be slid in the opposite direction around the single device housing 101 to the retracted position 300 in response to another swipe gesture, wherein a similar amount of the flexible display 104 can be seen on the front side of the electronic device 100 and the back side of the electronic device 100.
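The swipe-driven translation can be sketched as a simple state mapping. The up/down direction names and the function below are illustrative assumptions; the disclosure specifies only that one swipe gesture extends the blade assembly and another retracts it.

```python
# Hypothetical mapping from a swipe gesture to a blade-translation
# target. Direction names and position labels are illustrative.

EXTENDED = "extended"
RETRACTED = "retracted"

def target_position(swipe_direction: str, current: str) -> str:
    """Return the position the translation mechanism should drive the
    blade assembly toward in response to a swipe gesture."""
    if swipe_direction == "up":
        return EXTENDED
    if swipe_direction == "down":
        return RETRACTED
    return current  # unrelated gestures leave the blade where it is
```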
Thus, in one or more embodiments, the electronic device 100 includes a single device housing 101 having a blade assembly 102, the blade assembly 102 coupled to both major surfaces of the single device housing 101 and surrounding at least one minor surface of the electronic device 100 such that the blade assembly 102 is slidable relative to the single device housing 101 between a retracted position 300, an extended position 200, and a peeping position 500 exposing the front image capture device 501.
Turning now to fig. 8, illustrated therein are the flexible display 104 and the blade assembly 102, shown in an exploded view. As shown in fig. 8, in one or more embodiments, the flexible display 104 includes one or more layers that are coupled or laminated together to complete the flexible display 104. In one or more embodiments, these layers include a flexible protective cover 801, a first adhesive layer 802, a flexible display layer 803, a second adhesive layer 804, and a flexible substrate 805. Other configurations of layers suitable for manufacturing the flexible display 104 will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
Starting from the top of the layer stack, in one or more embodiments, the flexible protective cover 801 comprises an optically transparent substrate. In one or more embodiments, the flexible protective cover 801 may be made of an optically transparent material such as a thin film sheet of thermoplastic material. By way of illustration, in one embodiment, the flexible protective cover 801 is made of an optically transparent polyamide layer having a thickness of about 80 microns. In another embodiment, the flexible protective cover 801 is made of an optically transparent polycarbonate layer having a thickness of about 80 microns. Other materials suitable for making the flexible protective cover 801 will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, the flexible protective cover 801 functions as a panel (fascia) by defining a cover for the flexible display layer 803. In one or more embodiments, the flexible protective cover 801 is optically transparent in that light can pass through the flexible protective cover 801 so that objects behind the flexible protective cover 801 can be clearly seen. The flexible protective cover 801 may optionally include an ultraviolet light barrier. In one or more embodiments, such a barrier may be used to improve the visibility of the flexible display layer 803.
Beneath the flexible protective cover 801 is a first adhesive layer 802. In one or more embodiments, the first adhesive layer 802 includes an optically clear adhesive. An optically clear adhesive may be applied to both sides of the thin optically clear substrate such that the first adhesive layer 802 functions as an optically clear layer with optically clear adhesive on both sides. So configured, in one or more embodiments, the first adhesive layer 802 has a thickness of about 50 microns. Such optically transparent versions of "double-sided tape" may then be wrapped around and applied between the flexible protective cover 801 and the flexible display layer 803 to couple the two together.
In other embodiments, the first adhesive layer 802 will be applied between the flexible protective cover 801 and the flexible display layer 803 as an optically clear liquid, a gel, a uniform adhesive layer, or alternatively in the form of another medium. So configured, the first adhesive layer 802 may optionally be cured by heat, ultraviolet light, or other techniques. Other examples of materials suitable for use as the first adhesive layer 802 will be apparent to those of ordinary skill in the art having the benefit of this disclosure. In one or more embodiments, the first adhesive layer 802 mechanically couples the flexible display layer 803 to the flexible protective cover 801.
In one or more embodiments, the flexible display layer 803 is located between the flexible substrate 805 and the flexible protective cover 801. In one or more embodiments, the flexible display layer 803 is longer along the long axis 806 of the flexible display layer 803, and thus along the flexible display 104 itself, than the image-producing portion 808 of the flexible display 104. For example, as shown in fig. 8, the flexible display layer 803 includes a T-shaped tongue 807 that extends beyond the image-producing portion 808 of the flexible display layer 803. As will be shown in fig. 10 below, in one or more embodiments, electronic circuit components, connectors, and other components configured to operate the image-producing portion 808 of the flexible display layer 803 may be coupled to the T-shaped tongue 807. Thus, in this illustrative embodiment, the T-shaped tongue 807 extends distally beyond the terminal ends of the other layers of the flexible display 104. While the T-shaped tongue 807 is T-shaped in this illustrative embodiment, it may take other shapes, as will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
The flexible display layer 803 may optionally be touch sensitive. In one or more embodiments, the flexible display layer 803 is configured as an Organic Light Emitting Diode (OLED) display layer. When coupled to the flexible substrate 805, the flexible display layer 803 may be bent according to various bending radii. For example, some embodiments allow a bend radius of between thirty and six hundred millimeters. Other substrates allow a bend radius of about 5mm to provide a display that is foldable by active bending. Other displays may be configured to accommodate both bending and folding.
In one or more embodiments, the flexible display layer 803 may be formed of multiple layers of flexible material, such as a flexible sheet of polymer or other material. By way of example, the flexible display layer 803 may include layers of optically transparent electrical conductors, a polarizer layer, one or more optically transparent substrates, and electronic control circuitry, such as thin film transistors, for actuating pixels and one or more capacitors for energy storage. In one or more embodiments, the flexible display layer 803 has a thickness of about 130 microns.
In one or more embodiments, for touch sensitivity, the flexible display layer 803 includes a layer having one or more optically transparent electrodes. In one or more embodiments, the flexible display layer 803 includes an organic light emitting diode layer configured to present images and other information to a user. The organic light emitting diode layer may include one or more pixel structures arranged in an array, each pixel structure including a plurality of electroluminescent elements, such as organic light emitting diodes. These various layers may be coupled to one or more optically transparent substrates of the flexible display layer 803. Other layers suitable for inclusion in the flexible display layer 803 will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, the flexible display layer 803 is coupled to the flexible substrate 805 through the second adhesive layer 804. In other embodiments, the layers above the flexible display layer 803 may be configured with sufficient rigidity such that the flexible substrate 805 is not necessary. For example, in embodiments where the flexible protective cover 801 is configured with sufficient rigidity to provide sufficient protection for the flexible display 104 during bending, the flexible substrate 805 may be omitted.
In one or more embodiments, the flexible substrate 805 includes a thin steel layer. By way of illustration, in one or more embodiments, the flexible substrate 805 comprises a steel layer having a thickness of about 30 microns. While thin flexible steel works well in practice, other materials may also be used for the flexible substrate 805 as will be apparent to those of ordinary skill in the art having the benefit of this disclosure. For example, in another embodiment, the flexible substrate 805 is fabricated from a thin layer of thermoplastic material.
In one or more embodiments, to simplify manufacturing, the second adhesive layer 804 is the same as the first adhesive layer 802 and includes an optically clear adhesive. However, since the second adhesive layer 804 is coupled between the flexible display layer 803 and the flexible substrate 805, i.e., under the flexible display layer 803, an optically transparent adhesive is not required. In other embodiments, the second adhesive layer 804 may be partially optically clear or not optically clear at all.
Regardless of whether the second adhesive layer 804 is optically clear, in one or more embodiments, the adhesive of the second adhesive layer 804 is applied to both sides of a thin flexible substrate. So configured, in one or more embodiments, the second adhesive layer 804 has a thickness of about 50 microns. This extremely thin version of the "double sided tape" may then be wrapped around and applied between the flexible display layer 803 and the flexible substrate 805 to couple the two together.
In other embodiments, as with the first adhesive layer 802, the second adhesive layer 804 will be applied between the flexible display layer 803 and the flexible substrate 805 as a liquid, a gel, a uniform layer, or alternatively in the form of another medium. So configured, the second adhesive layer 804 may optionally be cured by heat, ultraviolet light, or other techniques. Other examples of materials suitable for use as the second adhesive layer 804 will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In the illustrative embodiment, the flexible display 104 is supported not only by the flexible substrate 805, but also by the blade assembly 102. As previously described, in one or more embodiments, the blade assembly 102 includes a blade baseplate 125. In one or more embodiments, the blade substrate 125 includes a steel layer. In one or more embodiments, the blade substrate 125 is thicker than the flexible substrate 805. By way of illustration, in one or more embodiments, when the flexible substrate 805 comprises a steel layer having a thickness of about 30 microns, the blade substrate 125 comprises a steel layer having a thickness of about 100 microns.
In one or more embodiments, the blade substrate 125 includes a rigid, substantially planar support layer. By way of illustration, in one or more embodiments, the blade substrate 125 may be fabricated from stainless steel. In another embodiment, the blade substrate 125 is fabricated from a thin rigid thermoplastic sheet. Other materials may also be used in manufacturing the blade substrate 125. For example, nitinol, a material that is a nickel-titanium alloy, may be used to fabricate the blade substrate 125. Other rigid, substantially planar materials will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
Thus, the blade substrate 125 defines another mechanical support for the flexible display 104. In one or more embodiments, the blade substrate 125 is the hardest layer of the overall assembly of FIG. 8. In one or more embodiments, the blade substrate 125 is fabricated from stainless steel having a thickness of approximately 100 microns. In another embodiment, the blade substrate 125 is made of a flexible plastic. Other materials from which the blade substrate 125 may be fabricated will be apparent to those of ordinary skill in the art having the benefit of this disclosure. For example, in another embodiment, the blade substrate 125 is fabricated from carbon fiber or the like. In one or more embodiments, the blade substrate 125 includes a reinforced boundary comprising a thicker layer of material to further protect the flexible display 104 when the blade assembly 102 is in the extended position (200).
In one or more embodiments, the flexible substrate 805 is slightly longer along the long axis of the flexible substrate 805 as compared to the image-producing portion 808 of the flexible display 104. Since the T-shaped tongue 807 is T-shaped, this allows one or more holes 809 to be exposed on either side of the base of the T-shaped tongue 807. In one or more embodiments, this additional length along the long axis provided by the flexible substrate 805 allows one or more fasteners to rigidly connect the first end of the flexible substrate 805 to the tensioner.
Embodiments of the present disclosure contemplate that some layers comprising the flexible display 104 are stiffer than others, while other layers are softer. For example, where the flexible substrate 805 is made of metal, one example of which is stainless steel, it is stiffer than the first adhesive layer 802 or the second adhesive layer 804. In one or more embodiments, the stainless steel is also stiffer than the flexible display layer 803. In one or more embodiments, the flexible substrate 805 is the stiffest layer in the flexible display 104, and the first adhesive layer 802 and the second adhesive layer 804 are the softest layers of the flexible display 104. In one or more embodiments, the flexible protective cover 801 and the flexible display layer 803 have a stiffness that falls between that of the flexible substrate 805 and that of the adhesive layers.
In one or more embodiments, the layers of the flexible display 104 are stacked together in a substantially flat configuration. In other words, in one or more embodiments, the flexible substrate 805 is configured as a substantially planar substrate. A second adhesive layer 804 may be attached to the substantially planar substrate, with the flexible display layer 803 then attached to the second adhesive layer 804. The first adhesive layer 802 may be attached to the flexible display layer 803, with the flexible protective cover 801 attached to the first adhesive layer 802.
To ensure proper coupling, the resulting laminated stack may be cured, such as in an autoclave at a predetermined temperature for a predetermined duration. When employed, such curing corrects any bubbles or other defects between the individual layers. In one or more embodiments, because the flexible substrate 805 is configured as a substantially planar substrate, the resulting flexible display 104 is also substantially planar.
In one or more embodiments, the blade substrate 125 of the blade assembly 102 includes a flexible portion 810 and a rigid portion 811. Where in one or more embodiments the blade substrate 125 is made of metal, one example of which is steel having a thickness of 100 microns, the rigid portion 811 obtains its rigidity from the material from which it is made. Alternatively, if the blade substrate 125 is made of a thermoplastic material, in one or more embodiments the thermoplastic material will be sufficiently rigid that the rigid portion 811 remains rigid. Since the rigid portion 811 slides only along the flat major surfaces of the translation surface defined by the single device housing (101), it does not need to bend. Furthermore, its rigidity helps protect the portion of the flexible display 104 that extends beyond the end of the single device housing (101).
In contrast, the flexible portion 810 needs to surround the minor surface of the single device housing (101) where the display roller mechanism (105) is located. Since the flexible portion 810 is made of the same material as the rigid portion 811 when the blade substrate 125 is manufactured as a single, unitary piece, in one or more embodiments the flexible portion 810 includes a plurality of holes cut through the blade substrate 125, allowing the material to flex. By way of example, in one or more embodiments in which the blade substrate 125 is fabricated from steel, a plurality of chemically or laser etched holes may allow the flexible portion 810 to tightly surround the minor surface of a single device housing (101) where the display roller mechanism (105) is located.
Thus, in one or more embodiments, the blade substrate 125 is partially rigid and partially flexible. The portions of the blade substrate 125 that slide along the major surfaces of the single device housing (101) are configured to be substantially rigid, while the portions of the blade substrate 125 that bypass the minor surfaces of the single device housing (101) are configured to be flexible so that they can curl around those minor surfaces.
In one or more embodiments, the blade assembly 102 also includes a silicone boundary 127 positioned around the perimeter of the blade substrate 125. In one or more embodiments, the silicone boundary 127 surrounds and protects the edges of the flexible display 104 when the flexible display 104 is attached to the blade substrate 125 of the blade assembly 102. In one or more embodiments, the silicone boundary 127 is co-molded around the perimeter of the blade substrate 125.
In one or more embodiments, the rigid portion 811 of the blade substrate 125 may define one or more apertures. These apertures may be used for various purposes. By way of illustration, some apertures may be used to rigidly secure the blade assembly 102 to a translation mechanism, one example of which is the display roller mechanism (105) of fig. 1. Additionally, some of the apertures may contain magnets. Hall effect sensors positioned in the single device housing (101), to which the blade assembly 102 is coupled, may then detect the position of these magnets so that the one or more processors (114) can determine whether the blade assembly 102 and the flexible display 104 are in the extended position (200), the retracted position (300), the peeping position (500), or somewhere in between.
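The magnet-and-sensor position detection can be sketched as follows. The one-sensor-per-detent layout and the reading format are assumptions for illustration; the disclosure only states that Hall effect sensors detect magnets carried in the blade substrate.

```python
# Hypothetical sketch of position sensing: magnets in the blade
# substrate pass fixed Hall-effect sensors in the device housing.
# One sensor per detent position is an assumption for illustration.

def blade_position(hall_readings: dict) -> str:
    """Map Hall-effect sensor readings to a blade position. Each key
    names a detent ("extended", "retracted", "peeping"); the value is
    True when the corresponding magnet is aligned with its sensor."""
    for position in ("extended", "retracted", "peeping"):
        if hall_readings.get(position):
            return position
    return "intermediate"  # between detents: no sensor triggered
```

The one or more processors would poll or be interrupted with such readings and use the result to drive the segment remapping and orientation logic described earlier.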
In one or more embodiments, the flexible display 104 is coupled to the blade substrate 125 of the blade assembly 102 within the confines of the silicone boundary 127. By way of illustration, in one or more embodiments, the first end of the flexible display 104 is adhesively coupled to the rigid portion 811 of the blade substrate 125 of the blade assembly 102. The other end of the flexible display 104 may then be rigidly coupled to the tensioner by passing fasteners through the holes 809 of the flexible substrate.
Turning now to fig. 9, illustrated therein are the blade substrate 125 and the silicone boundary 127, shown in an exploded view. As shown, the silicone boundary 127 defines a single, continuous, unitary piece of silicone. In the illustrative embodiment of fig. 9, the silicone boundary 127 surrounds three sides 901, 902, 903 of the blade substrate 125 and extends beyond the minor side 904 to define a receiving recess 905. The receiving recess 905 can house mechanical and electrical components, such as the electronic circuit components that power and control the flexible display (104) positioned within the perimeter defined by the silicone boundary 127, as well as tensioners, flexible circuits, and other components that keep the flexible display (104) flat across the flexible portion 810 of the blade substrate 125.
In this illustrative embodiment, portions 906, 907, 908 of the silicone boundary 127 extending beyond the minor side 904 of the blade substrate 125 surrounding the receiving recess 905 are thicker than other portions of the silicone boundary 127 that would surround the flexible display (104). This allows for placement of the component within the receiving recess 905.
Turning now to fig. 10, there is illustrated a flexible display 104 and a blade assembly 102, the blade assembly 102 having a silicone boundary 127 overmolded on a blade substrate 125. As shown, the silicone boundary 127 surrounds three sides 901, 902, 903 of the blade substrate 125 and extends beyond the minor side 904 to define a receiving recess 905 that can accommodate mechanical and electrical components.
Electronic circuitry 1001 operable to power and control the flexible display 104 has been coupled to the T-shaped tongue 807 of the flexible display layer (803). Additionally, mechanical connector 1002 has been attached to the top of the T on T-tongue 807. In this illustrative embodiment, the flexible substrate 805 extends beyond the distal end of the flexible display layer 803 such that the aperture 809 defined therein may be coupled to a tensioner to ensure that the flexible display 104 remains flat around the flexible portion 810 of the blade substrate 125 as the flexible portion 810 of the blade substrate 125 bypasses the rotor positioned at the end of the single device housing 101.
In one or more embodiments, the blade assembly 102 may be fixedly coupled to the flexible display 104. By way of illustration, where the blade substrate 125 defines a rigid portion 811 and a flexible portion 810, in one or more embodiments, the flexible display 104 is coupled to the rigid portion 811 by an adhesive or other coupling mechanism. The tensioner may then be positioned in the receiving groove 905. In one or more embodiments, the tensioner is rigidly coupled to the aperture 809 of the flexible substrate 805 with fasteners to keep the flexible display 104 flat across the flexible portion 810 regardless of how the flexible portion 810 bends around the minor surface of a single device housing or its corresponding rotor.
Turning now to FIG. 11, the flexible display 104 is illustrated after being coupled to the blade assembly 102. As shown, the silicone boundary 127 surrounds the flexible display 104, wherein the silicone boundary 127 surrounds and abuts three sides of the flexible display layer (803).
The flexible substrate 805 is then connected to the electronic circuit 1001 carried by the T-shaped tongue 807. Additionally, a tensioner may be coupled to the flexible substrate 805. Thereafter, the cover 1101 is attached to the silicone boundary 127 over the electronic circuit 1001 and the other components located on or around the T-shaped tongue 807. The portion of the blade assembly 102 where these components are stored under the cover 1101 is sometimes referred to as a "backpack". Turning to fig. 12, illustrated therein is the blade assembly 102 with its backpack 1201 fully assembled.
In one or more embodiments, the flexible display 104 and the blade assembly 102 are configured to surround a minor surface of a device housing in which the display roller mechanism is located. In one or more embodiments, the display roller mechanism includes a rotor positioned within the curved portion of the flexible display 104 and the blade assembly 102. When placed within the device housing of the electronic device, translation of the translation mechanism translates the blade assembly 102, which in turn results in rotation of the rotor. The result is a linear translation of the flexible display 104 and the blade assembly 102 across the translation surface of the device housing by pulling the flexible display 104 and the blade assembly 102 around the rotor.
The blade substrate (125) of the blade assembly 102 includes a flexible portion (810) that allows the blade assembly 102 and the flexible display 104 to deform around a device housing, one example of which is the single device housing (101) of fig. 1. By way of illustration, turning now to fig. 13-14, illustrated therein are the blade assembly 102 and the flexible display 104 deformed to create a curved segment 1301 and two linear segments 1302, 1303. The flexible display 104 and the blade assembly 102 are shown in the retracted position 300 in fig. 13 and in the extended position 200 in fig. 14. The enlarged view 1401 of fig. 14 shows how the apertures defined by chemically etching the blade substrate 125 easily allow the blade substrate 125 to bend around the curved segment 1301 while maintaining a rigid support structure in the two linear segments 1302, 1303 beneath the flexible display 104.
In one or more embodiments, the first linear segment 1302 and the second linear segment 1303 are configured to slide between the retracted position 300 of fig. 13 and the extended position 200 of fig. 14. The flexible display 104 is coupled to the blade assembly 102 and, thus, translates with the blade assembly 102 along a translation surface defined by the device housing of the electronic device.
In one or more embodiments, the linear segments 1302, 1303 of the blade assembly 102 are positioned between the flexible display 104 and the translation surface. The rotor is then positioned within the curved segment 1301 of the blade assembly 102. As the translation mechanism moves the linear segments 1302, 1303 of the blade assembly 102 across the translation surface defined by the device housing, the rotor rotates as the flexible portion 810 passes around it.
As shown in fig. 13-14, in one or more embodiments, the cross section of the blade assembly 102 and the flexible display 104 defines a J-shape, wherein the curved portion of the J-shape defined by the curved segment 1301 is configured to surround the rotor and the upper portion of the J-shape defined by the linear segment 1302 passes across the translation surface defined by the device housing. As the translator of the translation mechanism drives the blade assembly 102, the upper portion of the J-shape becomes longer as the flexible display 104 translates around the rotor, with the blade assembly 102 extending further from the device housing. This can be seen in fig. 13-14 by comparing the extended position 200 of fig. 14 with the retracted position 300 of fig. 13.
When the translator of the translation mechanism drives the blade assembly 102 in an opposite direction, e.g., driving the blade assembly 102 from the extended position 200 of fig. 14 to the retracted position 300 of fig. 13, the upper portion of the J-shape becomes shorter as the reversal operation occurs. Thus, when the translation mechanism drives the blade assembly 102 carrying the flexible display 104, the flexible display 104 deforms at different locations as it wraps around and bypasses the rotor.
It should be appreciated that the conventional "J-shape" is primarily defined when the blade assembly 102 transitions to the extended position 200 of fig. 14. Depending on the length of the blade assembly 102 and the flexible display 104, in combination with the amount by which the translation mechanism can slide the blade assembly 102 around the rotor, the J-shape may also transition to other shapes, including a U-shape in which the upper and lower portions of the blade assembly 102 and/or the flexible display 104 are substantially symmetrical. Such a U-shape is formed when the blade assembly is in the peeping position, and is substantially formed in the retracted position 300 of fig. 3. In other embodiments, depending on the configuration, the blade assembly 102 may even transition into an inverted J-shape, wherein the upper portion of the blade assembly 102 and/or the flexible display 104 is shorter than the lower portion of the blade assembly 102 and/or the flexible display 104, and so forth.
In one or more embodiments, the translator and rotor of the translation mechanism not only facilitate the "extension" of the flexible display 104 that occurs during an extension or "ascent" operation, but also serve to improve the reliability and usability of the flexible display 104. This is true because the rotor defines a service loop 1304 in the curved segment 1301 that has a relatively large radius compared to the minimum bend radius of the flexible display 104. The service loop 1304 prevents the flexible display 104 from being damaged, or from developing a memory of the curved state, which could otherwise occur when the flexible display 104 defines the curved segment 1301 around the rotor in the extended position 200, the retracted position 300, and the peeping position (500).
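A trivial check illustrates why the service loop protects the panel: the display is never bent tighter than its minimum bend radius as long as the rotor radius, plus roughly half the stack thickness where the neutral axis lies, meets that limit. The rotor radius below is an assumed value; the minimum bend radius echoes the ~5 mm figure mentioned for some panels above, and the stack thickness approximates the layer thicknesses given for fig. 8 (80 + 50 + 130 + 50 + 30 microns).

```python
# Illustrative service-loop safety check. All numeric values are
# assumptions for the sketch, not specifications from this disclosure.

MIN_BEND_RADIUS_MM = 5.0    # assumed panel limit (see fig. 8 discussion)
ROTOR_RADIUS_MM = 8.0       # assumed service-loop rotor radius
STACK_THICKNESS_MM = 0.34   # cover + adhesives + OLED + substrate (approx.)

def service_loop_is_safe(rotor_radius_mm: float,
                         stack_thickness_mm: float,
                         min_bend_radius_mm: float) -> bool:
    """The display's neutral axis wraps the rotor at roughly the rotor
    radius plus half the stack thickness; the loop is safe when that
    radius meets or exceeds the panel's minimum bend radius."""
    neutral_axis_radius = rotor_radius_mm + stack_thickness_mm / 2.0
    return neutral_axis_radius >= min_bend_radius_mm
```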
With such a mechanical assembly, the flexible display 104 maintains a flat upper portion of the J-shape, defined by the first linear segment 1302, when sliding. Additionally, the flexible display 104 tightly surrounds the rotor, with the lower portion of the J-shape, defined by the second linear segment 1303, also remaining flat against the lower surface of the device housing. The blade assembly 102 and the tensioner assembly, which are rigidly fixed to the translation mechanism, prevent the flexible display 104 from creasing or bunching as it slides around the device housing between the extended position 200, the retracted position 300, and the peeping position (500). This rigid coupling, in combination with the moving tensioner, ensures a straight and true translation of the flexible display 104 across the first major surface of the electronic device, around the rotor positioned at the minor surface of the device housing, and across the second major surface of the electronic device.
In one or more embodiments, additional support members may be attached to the blade assembly 102 to provide one or more of additional support to the flexible display 104, facilitate translation of the blade assembly 102 about the device housing, or a combination thereof.
As described above, in one or more embodiments, the blade assembly 102 is coupled to the flexible display 104. In contrast to sliding devices that include multiple device housings, embodiments of the present disclosure provide an electronic device having a sliding display and only a single device housing. The blade assembly 102 is configured as a mechanical chassis that allows the flexible display 104 to translate along a translation surface defined by the major and minor surfaces of the single device housing.
In one or more embodiments, the blade assembly 102 also provides mechanical support for the portion of the flexible display 104 that extends beyond the top edge of the single device housing when the blade assembly 102 and flexible display 104 are in the extended position. The blade assembly 102 may include a blade substrate (125) that is integral yet defines both a flexible portion and a rigid portion. The blade substrate (125) may include a silicone border 127 that surrounds and protects the edges of the flexible display 104.
The low friction dynamic bending lamination stack (128) and the blade (126) may be positioned between the blade assembly 102 and a translation surface defined by the single device housing (101). In one or more embodiments, the blade (126) and the low friction dynamic bending lamination stack (128) are positioned between the blade assembly 102 and the translation surface defined by the device housing to which the blade assembly 102 is attached.
The blade (126) supports the blade assembly 102 and the portion of the flexible display 104 that extends beyond the top edge of the device housing when the blade assembly 102 transitions to the extended position. Since the blade (126) needs to be rigid to support those portions of the blade assembly 102 and the flexible display 104, it cannot bend around the flexible portion of the blade substrate (125) of the blade assembly 102. To prevent gaps or steps from occurring where the blade (126) terminates, in one or more embodiments the low friction dynamic bending lamination stack (128) spans the remainder of the blade assembly 102 and abuts the translation surface defined by the single device housing.
In one or more embodiments, the blade (126) includes a steel layer. In one or more embodiments, the thickness of the blade (126) is greater than the thickness of the blade substrate (125) of the blade assembly 102 or the flexible substrate (805) of the flexible display 104. By way of example, in one or more embodiments, the blade (126) includes a steel layer having a thickness of 500 microns, or 0.5 millimeters.
In one or more embodiments, the blade (126) includes a rigid, substantially planar support layer. By way of example, in one or more embodiments, the blade (126) may be fabricated from aluminum, steel, or stainless steel. In another embodiment, the blade (126) is fabricated from a rigid thermoplastic sheet. Other materials may also be used in manufacturing the blade (126). For example, nitinol may also be used to fabricate the blade (126).
In one or more embodiments, the blade (126) is the hardest layer of the overall assembly. In one or more embodiments, the blade (126) is fabricated from stainless steel having a thickness of about 500 microns. In another embodiment, the blade (126) is made of carbon fiber. Other materials from which the blades (126) may be fabricated will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, the low friction dynamic bending lamination stack (128) includes a plurality of layers. When assembled, the low friction dynamic bending lamination stack (128) adds a layer to the blade assembly 102 that enhances the lubricity of the entire assembly to allow for smooth movement of the blade assembly 102 and flexible display 104 across the translating surface of the device housing. Furthermore, when abutting the blade (126), the low friction dynamic bending lamination stack (128) prevents features on other layers of the assembly from reducing the ability of the blade assembly 102 and flexible display 104 to translate across these translating surfaces.
In one or more embodiments, the low friction dynamic bending lamination stack (128) incorporates the ability to cyclically bend and/or roll around the rotor to allow "low friction" sliding across a stationary surface. In one or more embodiments, the low friction dynamic bending lamination stack (128) interfaces with the blade (126) and abuts the blade (126) to improve lubricity.
In one or more embodiments, the uppermost layer of the low friction dynamic bending lamination stack (128) is a pressure sensitive adhesive layer. The pressure sensitive adhesive layer allows the low friction dynamic bending lamination stack (128) to adhere to the underside of the blade assembly 102.
In one or more embodiments, below the pressure sensitive adhesive layer is a strain resistant foam layer. Examples of strain resistant foams suitable for use as the strain resistant foam layer include silicone, low density polyethylene, or other materials that provide sufficient thickness to allow the low friction dynamic bending lamination stack (128) to match the thickness of the blade (126) while reducing internal stresses and allowing bending.
In one or more embodiments, below the strain resistant foam layer is another pressure sensitive adhesive layer. The pressure sensitive adhesive layer is coupled to a flexible substrate having a strain relief kerf pattern formed therein. The flexible substrate may be made of metal or plastic or other materials. By way of illustration, in one or more embodiments, the flexible substrate includes a steel layer having a thickness of about 30 microns. While thin flexible steel works well in practice, other materials may also be used for the flexible substrate as will be apparent to those of ordinary skill in the art having the benefit of this disclosure. For example, in another embodiment, the flexible substrate is fabricated from a thin layer of thermoplastic material.
Then, in one or more embodiments, another layer of pressure sensitive adhesive couples the flexible substrate to a low friction layer. In one or more embodiments, the low friction layer includes a substrate having a Teflon™ coating attached thereto. In another embodiment, the low friction layer comprises a layer of polytetrafluoroethylene, a synthetic fluoropolymer of tetrafluoroethylene. Such materials are well known for their non-stick properties and add lubricity to the low friction dynamic bending lamination stack (128), which allows the entire assembly to slide smoothly. Furthermore, the low friction layer prevents the strain relief kerf pattern in the flexible substrate from catching on surface imperfections and transitions on the device housing to which the assembly is attached. In short, the low friction layer greatly improves the lubricity of the overall assembly.
Figs. 15-20 illustrate the electronic device 100 of fig. 1 fully assembled in an extended position 200 and a retracted position 300. Embodiments of the present disclosure contemplate that an electronic device configured in accordance with embodiments of the present disclosure has distinctly unique ornamental features in addition to distinctly unique utilitarian features. Many of these ornamental features are visible in figs. 15-20.
Fig. 15 illustrates a front view of the electronic device 100 in the extended position 200, while fig. 16 illustrates a side view of the electronic device 100 in the extended position 200. Fig. 17 then also provides a rear view of the electronic device 100 in the extended position 200.
Fig. 18 illustrates a front view of the electronic device 100 in the retracted position 300, while fig. 19 illustrates a side view of the electronic device 100 in the retracted position 300. Fig. 20 then provides a rear view of the electronic device 100 in the retracted position 300.
As can be seen by comparing these figures, the blade assembly 102 is capable of sliding around the single device housing 101 such that the blade 126 slides away from the single device housing 101 to change the overall length of the flexible display 104 surface visible from the front of the electronic device 100. The blade assembly 102 may also slide in the opposite direction around the single device housing 101 to the retracted position 300, wherein similar amounts of the flexible display 104 are visible on the front side of the electronic device 100 and the rear side of the electronic device 100. Graphics, images, user actuation targets, and other indicia may be presented anywhere on the flexible display 104, including on the front side of the electronic device 100, the rear side of the electronic device 100, or the lower end of the electronic device 100.
While much attention has been paid thus far to the unique translation of the blade assembly and flexible display between the extended and retracted positions, another truly unique feature provided by embodiments of the present disclosure appears when the blade assembly and flexible display transition to the peek position. Turning now to figs. 21-22, the electronic device 100 is illustrated in the peek position 500.
As shown in fig. 21, in one or more embodiments, when the blade assembly 102 and flexible display 104 transition to the peek position 500, the backpack 1201 moves out of the retracted position (300) toward the rear image capture device 108. When this occurs, the upper edge 2101 of the blade assembly 102 moves below the upper edge 2102 of the single device housing 101. In one or more embodiments, this reveals the front image capture device 501, which is located beneath the blade assembly 102 when the blade assembly 102 is in the retracted position (300).
In one or more embodiments, translation of the blade assembly 102 and flexible display 104 to the peek position 500 occurs automatically. By way of illustration, in one or more embodiments, when the front image capture device 501 is actuated, the one or more processors (114) of the electronic device 100 cause the blade assembly 102 to translate to the peek position 500, thereby revealing the image capture device 501. (In the illustrative embodiment of FIGS. 21-22, speaker 502 is also revealed.) Once the image capture operation with the image capture device 501 is completed, the one or more processors (114) may cause the blade assembly 102 to transition back to the retracted position, again covering and concealing the image capture device 501.
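The automatic peek behavior just described can be sketched as a small state machine. This is an illustrative sketch only; the class and position names (`DisplayController`, `Position`) are assumptions, not identifiers from the patent.

```python
from enum import Enum


class Position(Enum):
    RETRACTED = "retracted"  # position 300
    EXTENDED = "extended"    # position 200
    PEEK = "peek"            # position 500


class DisplayController:
    """Hypothetical controller for the blade assembly translation mechanism."""

    def __init__(self) -> None:
        self.position = Position.RETRACTED

    def on_front_camera_actuated(self) -> None:
        # Actuating the front image capture device (501) automatically
        # translates the blade assembly to the peek position, revealing
        # the camera (and, in the illustrated embodiment, speaker 502).
        self.position = Position.PEEK

    def on_image_capture_complete(self) -> None:
        # Completing the capture retracts the blade assembly, mechanically
        # covering the camera again.
        self.position = Position.RETRACTED
```

Actuating the camera moves the sketch's controller to `Position.PEEK`; completing the capture returns it to `Position.RETRACTED`.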
In other embodiments, the transition to the peek position 500 is initiated manually by actuation of a button or other user interface control. For example, a single press of the button 2103 may transition the blade assembly 102 to the extended position (200), while a double press of the button 2103 may return the blade assembly 102 to the retracted position (300). A long press of the button 2103 may cause the blade assembly 102 to transition to the peek position 500 of FIG. 5, and so forth. Other modes of button operation will be apparent to those of ordinary skill in the art having the benefit of this disclosure. In still other embodiments, user input delivered to the flexible display 104 in the form of a swipe gesture may also be used to cause a transition to the peek position 500.
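The example button scheme above reduces to a simple event-to-position mapping. A minimal sketch follows; the event strings are illustrative conventions, since the patent describes only the press patterns themselves.

```python
def position_for_button_event(event: str) -> str:
    """Map an event on button 2103 to a target blade assembly position.

    Single press -> extended (200), double press -> retracted (300),
    long press -> peek (500); unrecognized events leave the position
    unchanged. Event names are assumptions for illustration.
    """
    mapping = {
        "single_press": "extended",
        "double_press": "retracted",
        "long_press": "peek",
    }
    return mapping.get(event, "no_change")
```

A table-driven dispatch like this makes it easy to add the "other modes of button operation" the passage anticipates.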
By positioning the front image capture device 501 beneath the blade assembly 102 and its corresponding opaque blade (126) during normal operation, embodiments of the present disclosure provide privacy assurances to a user of the electronic device 100. In other words, by positioning the image capture device 501 beneath the blade assembly 102 and the flexible display 104 when in the retracted position (300) or the extended position (200), the user of the electronic device 100 is mechanically assured of privacy, since it is physically impossible for the image capture device 501 to perform image capture operations through the opaque blade (126) of the blade assembly 102.
Thus, even if the electronic device 100 is compromised by a hacker or other bad actor, the user can be assured that the image capture device 501 cannot capture images or video while the blade assembly 102 and flexible display 104 are in the retracted position (300), the extended position (200), or a position therebetween. Only when the blade assembly 102 and flexible display 104 transition to the peek position 500, thereby revealing the image capture device 501, can the image capture device 501 capture a front-facing image or video.
Attention is now directed to a method of using the electronic device described above, and more particularly, to automatic movement of the flexible display 104 and blade assembly 102 in response to user input in the form of gestures in accordance with one or more embodiments of the present disclosure. Turning now to fig. 23, illustrated therein is one illustrative method 2300 in accordance with one or more embodiments of the present disclosure. The method 2300 of fig. 23 is intended for use in an electronic device having a device housing, a blade assembly carrying a blade and a flexible display, wherein the blade assembly is slidably coupled to the device housing, a translation mechanism is operable to slide the blade assembly relative to the device housing between at least an extended position and a retracted position, and one or more processors are operable with the translation mechanism.
Decision 2301 determines whether the electronic device is in a locked state or an unlocked state. As will be explained below, in some embodiments it is undesirable to cause translation of a blade assembly coupled to the flexible display in response to user input delivered to the flexible display when the electronic device is in a locked mode of operation. Thus, in some embodiments, any translation in response to user input occurs only when the electronic device is in an unlocked state, as determined by decision 2301, when the user input is detected.
Step 2302 then includes detecting user input with a flexible display carried by a blade assembly slidably coupled to the device housing and movable between at least an extended position and a retracted position. In one or more embodiments, the user input detected at step 2302 includes a swipe gesture.
In one or more embodiments, the swipe gesture has a directional component associated therewith. By way of illustration, in one or more embodiments, an initial contact position of the user input defines the start of the swipe gesture, with movement of a finger, stylus, or other object along the flexible display in a direction away from the initial contact point defining the direction of the swipe gesture. When the finger, stylus, or other object ceases to contact the flexible display, this defines the end of the swipe gesture. Thus, the swipe gesture may be directed toward the top of the electronic device, toward the bottom of the electronic device, from side to side, or in other directions. In one or more embodiments, a translation mechanism operable with the blade assembly translates the blade assembly toward the extended position when the swipe gesture is directed toward the top end of the flexible display, as will be described below. Conversely, when the swipe gesture is directed from the top end toward the curved portion of the flexible display opposite the front portion of the flexible display, the translation mechanism may translate the blade assembly toward the retracted position. In some embodiments, if the flexible display is in the retracted position when a swipe gesture directed toward the curved portion of the flexible display is detected at step 2302, the translation mechanism may cause the flexible display to transition to the peek position, and so forth.
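The direction rules above can be sketched in two small functions: one classifying the swipe from its initial and final contact points, and one applying the direction-to-position rules. This is an illustrative sketch; the coordinate convention (y increasing toward the bottom of the display), the threshold, and the function names are assumptions.

```python
def classify_swipe(start_y: float, end_y: float, threshold: float = 20.0) -> str:
    """Classify a swipe from initial contact to where contact ceases.

    Uses an assumed convention of y increasing toward the bottom of the
    flexible display; movement shorter than `threshold` is not a swipe.
    """
    delta = end_y - start_y
    if delta <= -threshold:
        return "toward_top"      # toward the top end of the display
    if delta >= threshold:
        return "toward_bottom"   # toward the curved portion
    return "none"


def target_position(swipe: str, current: str) -> str:
    """Apply the direction rules described above (position names assumed)."""
    if swipe == "toward_top":
        return "extended"
    if swipe == "toward_bottom":
        # From the retracted position, a swipe toward the curved portion
        # may transition the display to the peek position instead.
        return "peek" if current == "retracted" else "retracted"
    return current
```

For instance, an upward swipe from any position targets the extended position, while a downward swipe from the retracted position targets the peek position.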
If step 2302 includes detecting a user input directed toward the top end of the flexible display, which is opposite the curved portion of the flexible display across the front portion, step 2303 includes translating the blade assembly toward the extended position in response to the swipe gesture by a translation mechanism operable with the blade assembly. As described above, in one or more embodiments, the translation of step 2303 occurs only when the electronic device is in an unlocked state, as determined by decision 2301, when the user input defining the swipe gesture is detected at step 2302.
In one or more embodiments, step 2304 then includes the one or more processors of the electronic device rendering content on the flexible display on the newly exposed front portion of the flexible display revealed by the translation occurring at step 2303. The content may take various forms.
In one or more embodiments, the one or more processors reveal an application tray (an example of which is shown below with reference to fig. 25) on the newly exposed front portion of the flexible display. In other embodiments, the one or more processors present additional application components on the newly exposed front portion of the flexible display, of which user interface controls are one example. By way of illustration, the additional application component may be a text message editor, an email composition user interface control, a media player window, or a game window. One example of the additional application components presented is shown below with reference to fig. 26. Other application components will be apparent to those of ordinary skill in the art having the benefit of this disclosure. Further, as shown in FIG. 27, notifications or other content items may also be presented on the newly exposed front portion of the flexible display.
In one or more embodiments, where the one or more processors operable with the flexible display are presenting a home screen presentation on the front portion of the flexible display when the user input is received at step 2302, step 2304 includes presenting, by the one or more processors, additional content on the front region of the flexible display newly exposed by the translation after the translation of step 2303 occurs. In one or more embodiments, the additional content includes one or more user actuation targets defining an application set 2321.
The new application set 2321 may take a variety of forms. In one or more embodiments, the application set 2321 includes a set of user-defined applications 2313. These user-defined applications 2313 may be configured as a virtual "tray" of applications that opens and is presented on the newly exposed front portion of the flexible display in response to the translation occurring at step 2303. The user-defined applications 2313 may be a set of applications that are preferred by a particular user or that are included in the set for other reasons. Using menus and/or user interface controls of the electronic device, in one or more embodiments the user may select which user-defined applications 2313 to include in the application tray revealed on the newly exposed front portion of the flexible display.
In other embodiments, the selection of the new application set 2321 may occur automatically. By way of illustration, the one or more processors of the electronic device can select an initially selected application set 2314 for presentation in the application tray revealed on the newly exposed front portion of the flexible display. The "initially selected" application set 2314 may be the set of applications with which a user typically engages when initially using the electronic device during the day. For example, if a user reads the Wall Street Journal™, checks email, checks a sleep tracker, and checks their brokerage account balance each morning upon waking, in one or more embodiments the initially selected set 2314 may include the Wall Street Journal™ application, a brokerage account application, a sleep tracking application, an email application, and so forth.
In other embodiments, the newly exposed application set 2321 may include a frequently used subset of applications 2315. For example, if a user spends ninety percent of their productive time each day drafting documents in a word processor, creating figures in an illustration application, and corresponding with people via email, the one or more processors of the electronic device may group these applications together as the frequently used subset of applications 2315 because they are used often, even though they are not necessarily the user's preferences or the applications the user finds most interesting. In one or more embodiments, after translation of the blade assembly, an application tray defined by the frequently used subset of applications 2315 is presented on the newly exposed front portion of the flexible display.
In other embodiments, the newly exposed application set 2321 may include a recently used subset of applications 2316. For example, if the user is working on a very long project with an imminent deadline, after the electronic device is unlocked, as detected at decision 2301, and user input defining a swipe gesture is received at step 2302, the one or more processors may reveal an application tray comprising the recently used subset of applications 2316, since the one or more processors predict that the user will again need to work on the project due to the imminent deadline.
In still other embodiments, the application set 2321 presented on the newly exposed front portion of the flexible display after the translation of step 2303 may include a time-of-day subset of applications 2317. For example, if a person prefers to meditate during lunch and the swipe gesture detected at step 2302 occurs at 11:59 AM, the one or more processors may group the meditation application with other applications that are frequently used during lunch, such as a restaurant mobile pickup application, a nap-time white noise machine application, and a coffee machine control application that causes the coffee machine to brew very strong coffee, and may present them as the time-of-day subset of applications 2317 on the newly exposed front portion of the flexible display in response to the swipe gesture. These examples of the time-of-day subset of applications 2317 are merely illustrative, as other applications will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In still other embodiments, the newly exposed application set 2321 may include a location subset of applications 2318, optionally configured as an application tray that is revealed on the newly exposed front portion of the flexible display in response to the translation of step 2303. For example, if a user listens to jazz music while reading cocktail recipes at home, but listens to heavy metal rock while creating illustrations at the workplace, the jazz music application and the mixology application may be grouped into a first location subset of applications 2318 for home, while the heavy metal rock application and the illustration application are grouped into a second location subset of applications 2318 for the workplace, and so forth.
In still other embodiments, the newly exposed application set 2321 may be a subset of applications related to the currently open application 2319. If the user is an author who writes the vast majority of their text in a word processor, and who also uses an editing application to review content in the word processing document before releasing it to their fans, the subset of applications related to the currently open application 2319 may include the editing application (and other corresponding applications) when the foreground application is the word processor, and so forth.
In still other embodiments, where the newly exposed front portion of the flexible display after the translation of step 2303 provides sufficient space, the newly exposed application set 2321 may include all applications 2320 installed on the electronic device. The above examples of the application set 2321 are merely illustrative, as many other examples will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
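The selection strategies above (user-defined 2313, frequently used 2315, recently used 2316, time-of-day 2317, location 2318, related 2319, all 2320) can be sketched as one dispatch function. This is a minimal illustration under assumed data shapes; the mode strings, dictionary keys, and the four-application tray size are not from the patent.

```python
def select_application_set(mode, apps, now=None, location=None, foreground=None):
    """Choose which subset of installed applications to reveal on the
    newly exposed front portion of the display, keyed by selection mode.

    `apps` is an assumed list of dicts; each selection mode mirrors one
    of the subsets 2313-2320 described above.
    """
    if mode == "user_defined":                        # 2313
        return [a for a in apps if a.get("user_pinned")]
    if mode == "frequently_used":                     # 2315
        return sorted(apps, key=lambda a: a.get("launch_count", 0),
                      reverse=True)[:4]
    if mode == "recently_used":                       # 2316
        return sorted(apps, key=lambda a: a.get("last_used", 0),
                      reverse=True)[:4]
    if mode == "time_of_day" and now is not None:     # 2317
        return [a for a in apps
                if any(start <= now <= end
                       for start, end in a.get("windows", []))]
    if mode == "location" and location is not None:   # 2318
        return [a for a in apps if location in a.get("places", [])]
    if mode == "related" and foreground is not None:  # 2319
        return [a for a in apps if foreground in a.get("related_to", [])]
    return list(apps)                                 # 2320: all applications
```

A meditation application tagged with an 11:00-13:00 window would, for example, be selected by the `"time_of_day"` mode near noon.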
In one or more embodiments, step 2303 further includes determining, by the one or more processors of the electronic device, the additional display area required for presenting the application set 2321. Step 2305 may then include interrupting the translation of step 2303 once the additional display area required for presenting the application set 2321 has been uncovered. It should be noted that the presentation of the application set 2321 at step 2304 and the interruption at step 2305 may occur in either order. The translation mechanism may stop translation at step 2305 before the one or more processors present the application set 2321 at step 2304, or vice versa, with the application set 2321 being revealed as the blade assembly translates.
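The "translate until the required area is uncovered, then interrupt" behavior can be sketched as a short loop. The pixel units, step size, and function name below are illustrative assumptions.

```python
def translate_until(required_px: int, step_px: int, max_travel_px: int) -> int:
    """Advance the translation mechanism in discrete steps and interrupt
    the translation once the display area required for the application
    set has been uncovered, never exceeding the mechanism's full travel.

    Returns the total travel in (assumed) pixels of uncovered display.
    """
    travel = 0
    while travel < required_px and travel < max_travel_px:
        travel = min(travel + step_px, max_travel_px)
    return travel
```

With 50 px steps, a request for 180 px of new display area stops at the first step that satisfies it (200 px), and a request beyond the mechanism's range is capped at full travel.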
Decision 2306 then determines whether additional user input defining another swipe gesture is detected. This may be the case, for example, where the application set 2321 presented at step 2304 includes some applications but not those required by the user; the user may deliver another swipe gesture to reveal additional applications. Decision 2306 detects whether this has occurred. If so, step 2307 includes translating the blade assembly toward the extended position, again by the translation mechanism, in response to the additional swipe gesture. Step 2308 may then include presenting another application set 2322 on another front region of the flexible display that is newly exposed in response to the additional user input. Otherwise, the method 2300 moves to decision 2309.
At decision 2309, the method 2300 determines whether a user actuation target revealed at step 2304 or step 2308 is actuated. If so, step 2310 launches the corresponding application. However, embodiments of the present disclosure contemplate that the user may change their mind and choose not to actuate any user actuation target. The user may instead decide to take a midday nap and want to return the electronic device to the retracted position. Decision 2311 determines whether an additional swipe gesture has been received. In one or more embodiments, the swipe determined at decision 2311 occurs in the direction opposite the swipe detected at step 2302. If so, step 2312 translates the blade assembly back to the position at which the method 2300 began. In one or more embodiments, that position is the retracted position.
While the embodiment of FIG. 23 works well to reveal an application tray or other content presentation of the application set 2321, embodiments of the present disclosure contemplate that this is merely one way to present content on the newly revealed front portion of a flexible display after translation in response to detecting a swipe gesture. Turning now to FIG. 24, illustrated therein is another method 2400 showing another embodiment of the present disclosure. The method 2400 can be used to detect a swipe gesture, translate a blade assembly carrying a flexible display, and then present new content on the front portion of the flexible display that is revealed by the translation. In particular, the method 2400 of FIG. 24 is directed to the case in which a foreground application is operating on the one or more processors of the electronic device. This is in contrast to the home screen presentation described above with reference to the method (2300) of FIG. 23.
As previously described, decision 2401 determines whether the electronic device is in a locked state or an unlocked state. In some embodiments, any translation responsive to user input occurs only when the electronic device is in an unlocked state, as determined by decision 2401, when the user input is detected.
Step 2402 then includes detecting the user input with a flexible display carried by a blade assembly slidably coupled to the device housing and movable between at least an extended position and a retracted position. In one or more embodiments, the user input detected at step 2402 includes a swipe gesture. As described above, the swipe gesture may have a directional component associated therewith. The directional component may be directed toward the top of the electronic device, toward the bottom of the electronic device, from side to side, or in other directions.
If step 2402 includes detecting a user input directed toward the top end of the flexible display, which is opposite the curved portion of the flexible display across the front portion, step 2403 may include determining, by the one or more processors of the electronic device, the additional display area required for additional content by the operational state of an application operating on the one or more processors. While the example described with reference to the method (2300) of FIG. 23 assumes that a home screen presentation is being presented on the front portion of the flexible display when the swipe gesture is received, the method 2400 of FIG. 24 assumes that an active application interface presentation of an active application is being presented on the front portion of the flexible display when the swipe gesture is detected at step 2402.
By way of illustration, if a user is browsing social media feeds in a social media application, a graphical user interface or user interface portal of the social media application may define the active application interface presentation of the social media application when the social media application is operating in the foreground on the one or more processors. Step 2403 may include querying the active application to determine what additional application content is needed based on its operational state.
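Querying the active application for the display area its current state needs (step 2403) can be sketched as a lookup over operational states. The state names, areas, and function name below are illustrative assumptions, not from the patent.

```python
def additional_area_for_state(app_state: dict) -> int:
    """Return the extra display length (assumed pixel units) the active
    application needs for additional content, given its operational state.

    The states and sizes are hypothetical examples: reading a message
    needs room for an editing control, browsing needs room for a new
    tab or window, and a video call needs another communication window.
    """
    needs = {
        "reading_message": 240,   # editing user interface control (2413)
        "browsing_feed": 320,     # new tab 2416 or window 2414
        "video_call": 280,        # additional communication window 2418
    }
    return needs.get(app_state.get("state", ""), 0)
```

A state the table does not recognize requests no additional area, leaving the blade assembly where it is.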
Step 2404 may include translating the blade assembly toward the extended position in response to the swipe gesture by a translation mechanism operable with the blade assembly. As described above, in one or more embodiments, the translation of step 2404 occurs only when the electronic device is in an unlocked state, as determined by decision 2401, when the user input defining the swipe gesture is detected at step 2402.
Step 2404 may also include the one or more processors presenting additional active application content in the additional front portion of the flexible display revealed by the translation after the translation occurs. The additional content presented at step 2404 may take various forms.
While the method (2300) of fig. 23 reveals an application tray (an example of which is shown below with reference to fig. 25) on the newly exposed front portion of the flexible display, the method 2400 of fig. 24 reveals additional application components on the newly exposed front portion of the flexible display, an example of which is a user interface control. By way of illustration, the additional application component may be a text message editor, an email composition user interface control, a media player window, or a game window. One example of the additional application components presented is shown below with reference to fig. 26. Other application components will be apparent to those of ordinary skill in the art having the benefit of this disclosure. It should again be noted that notifications or other content items may also be presented on the newly exposed front portion of the flexible display, as shown in fig. 27.
In one or more embodiments, where the one or more processors operable with the flexible display are presenting an active application interface presentation of the active application on the front portion of the flexible display when the user input is received at step 2402, step 2404 includes presenting, by the one or more processors, additional content on the front region of the flexible display newly exposed by the translation after the translation occurs. In one or more embodiments, the additional content includes additional active application content.
The additional active application content may take a variety of forms. In one or more embodiments, the additional active application content includes a user interface control 2413. By way of example, if the active application is a text messaging application and the user is reading a text message, the user may decide to respond to a particular message. If this occurs, the user may deliver the swipe gesture detected at step 2402 to cause a drawing or editing user interface control to appear on the flexible display. An example of this is shown below with reference to fig. 26.
In other embodiments, the additional active application content may include a new tab 2416 or a new application window 2414. For example, if the user is using a web browsing application to surf the web, the additional active application content may include a new tab 2416 or window 2414 in which further web browsing may occur on additional portions of the flexible display. By contrast, if the user is browsing photos in a photo application, the additional active application content may include new application content 2415 in the form of additional pictures or videos.
If the active application is a video conferencing application, the additional active application content may take the form of an additional communication window 2418. By contrast, if the active application includes search features, as do many word processors, spreadsheets, and other applications, the additional active application content may include a search function 2419. An editing function 2420 may be presented in a similar manner.
If the active application is a gaming application, the additional active application content may be a game window 2421 in which the user may participate in gaming activity. If the active application is a multimedia application, such as a video streaming application, the additional active application content may include a media player 2422. These are merely examples, as many other examples of additional active application content will be apparent to those of ordinary skill in the art having the benefit of this disclosure. An editing window serving as a content generation window 2417 may also be presented.
In one or more embodiments, the translation at step 2404 occurs by an amount defined as a function of the additional display area determined at step 2403. Thus, step 2405 may include interrupting the translation of step 2404 when the additional display area required for presenting the additional active application content has been revealed. It should be noted that, as previously described with reference to the application set (2321) of fig. 23, the presentation of additional active application content at step 2404 and the interruption of translation at step 2405 may occur in either order.
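The "translate only as far as the content requires" behavior of steps 2403 through 2405 reduces to clamping the blade travel to the needed area. A minimal sketch under assumed pixel units follows; the function name and parameters are hypothetical, not from the patent.

```python
def travel_for_content(required_height_px: int, max_travel_px: int) -> int:
    """Return how far to translate the blade assembly: just enough to
    reveal the display area the active application needs (step 2403),
    interrupting there (step 2405), and never past the extended position."""
    if required_height_px <= 0:
        return 0  # nothing to reveal, so no translation
    return min(required_height_px, max_travel_px)

# A reply control needing 300 px on a blade with 800 px of available travel:
assert travel_for_content(300, 800) == 300   # stop once the area is revealed
assert travel_for_content(1000, 800) == 800  # clamp at the extended position
```

The clamp at `max_travel_px` corresponds to the blade assembly reaching the fully extended position.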
Decision 2406 then determines whether additional user input defining another swipe gesture is detected. This may occur when, for example, the additional active application content presented at step 2404 includes a search result list. The user may wish to scroll through the search results without losing the earlier results by causing the blade assembly to continue translating toward the extended position. Decision 2406 detects whether this has occurred. If so, step 2407 again determines from the active application how much area is needed for the current application activity. Step 2408 then includes translating, by the translation mechanism, the blade assembly toward the extended position again in response to the additional swipe gesture. Step 2408 may further include presenting another additional active application content presentation on another newly exposed front area of the flexible display in response to the additional user input. Otherwise, the method 2400 moves to decision 2409.
At decision 2409, the method 2400 determines whether the user has interacted with the additional active application content revealed at step 2404 or step 2408. If so, step 2410 maintains the new position of the blade assembly because the user is actively engaged with the additional active application content. However, embodiments of the present disclosure contemplate that the user may change his mind and choose not to engage with the additional active application content. The user may instead decide to continue interacting with the original active application interface presentation. Decision 2411 determines whether an additional swipe gesture has been received. In one or more embodiments, this additional swipe occurs in the direction opposite the swipe gesture detected at step 2402. If so, step 2412 translates the blade assembly back to the initial position at which the method 2400 began. In one or more embodiments, this position is the retracted position.
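Decisions 2406 through 2412 amount to a small state machine: same-direction swipes extend the blade further, user engagement holds the current position, and an opposite-direction swipe retracts it. A hedged sketch follows; all names and the pixel-offset model are hypothetical.

```python
class BladePositionStateMachine:
    """Illustrative model of decisions 2406-2412: extend on 'up' swipes,
    hold while the user is engaged, retract on an opposite 'down' swipe."""

    def __init__(self) -> None:
        self.offset_px = 0     # 0 = initial (retracted) position
        self.engaged = False   # user interacting with revealed content (step 2410)

    def on_swipe(self, direction: str, reveal_px: int = 0) -> int:
        if direction == "up":
            self.offset_px += reveal_px   # step 2408: translate further
        elif direction == "down" and not self.engaged:
            self.offset_px = 0            # step 2412: back to initial position
        return self.offset_px

sm = BladePositionStateMachine()
sm.on_swipe("up", 300)
assert sm.on_swipe("up", 200) == 500  # another swipe reveals more display
assert sm.on_swipe("down") == 0       # opposite swipe retracts the blade
```

An engaged user (step 2410) suppresses the retraction, matching the "maintain the new position" branch.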
Having now described the general methods of the present disclosure, more specific examples will be described to illustrate how different embodiments operate in different contexts. Turning now to fig. 25, illustrated therein is an illustrative method 2500 for presenting an application set (2321) configured as an application tray 2506 in response to a swipe gesture 2507.
Beginning at step 2501, an electronic device 100 having a device housing 101 and a blade assembly 102 carrying a blade and a flexible display 104 is illustrated. The blade assembly 102 is slidably coupled to the device housing 101. The translation mechanism is operable to slide the blade assembly 102 relative to the device housing 101 between an extended position, a retracted position, and a peek position. At step 2501, the blade assembly 102 is in the retracted position. Further, one or more processors of the electronic device 100 operable with the translation mechanism present a home screen presentation 2508 on a front portion of the flexible display 104. Additionally, the electronic device 100 is in an unlocked state.
At step 2501, a user 2520 delivers a swipe gesture 2507 to the flexible display 104 while the one or more processors of the electronic device 100 are presenting the home screen presentation 2508. In this example, the swipe gesture 2507 is an "up" swipe gesture in that it begins near the curved portion of the flexible display 104 and moves toward the top end of the device housing 101 and blade assembly 102. At step 2502, the one or more processors of the electronic device 100 detect the swipe gesture 2507.
At step 2503, the one or more processors cause the translation mechanism to translate the blade assembly 102 toward the extended position. At step 2503, the one or more processors present additional content on a front portion of the flexible display 104 that is revealed by the translation of the flexible display 104. At step 2504, the one or more processors cause the translation mechanism to interrupt translation of the flexible display 104 when a sufficient amount 2509 of the flexible display 104 is revealed on a front portion of the electronic device 100. Steps 2503 and 2504 may be performed in either order.
In the illustrative embodiment, the additional content presented at step 2503 includes one or more user actuation targets, e.g., user actuation target 2510, corresponding to applications operable on the one or more processors, as shown at step 2505. In the illustrative embodiment, the one or more user actuation targets are configured as an application tray 2506, which is revealed when the flexible display 104 receives the swipe gesture 2507. In one or more embodiments, the application tray 2506 is presented when the flexible display 104 is presenting the home screen presentation 2508 at the time the swipe gesture 2507 is received.
In the illustrative embodiment of fig. 25, only six user actuation targets are included in the application tray 2506. However, many applications are installed on the electronic device 100 and are operable with the one or more processors. Thus, the applications included in the application tray 2506 define a subset of the available applications of the electronic device 100. If the user 2520 wishes to see additional applications, in one or more embodiments the user 2520 may deliver another swipe gesture to cause the blade assembly 102 to translate further toward the extended position, whereupon additional user actuation targets corresponding to additional applications are revealed. This process may continue until the blade assembly 102 reaches the extended position, at which point additional swipe gestures may cause the user actuation targets corresponding to applications operable on the one or more processors to scroll.
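The tray-paging behavior just described, where each further swipe reveals the next subset of user actuation targets until all are shown, can be sketched as simple list slicing. The six-per-page figure follows the fig. 25 example; everything else here is hypothetical.

```python
def tray_page(apps: list[str], page: int, per_page: int = 6) -> list[str]:
    """Return the subset of user actuation targets revealed on a given
    page of the application tray."""
    start = page * per_page
    return apps[start:start + per_page]

installed = [f"app{i}" for i in range(15)]  # 15 installed applications
assert tray_page(installed, 0) == ["app0", "app1", "app2", "app3", "app4", "app5"]
assert tray_page(installed, 2) == ["app12", "app13", "app14"]  # the remainder
assert tray_page(installed, 3) == []  # fully revealed: further swipes scroll
```

An empty page corresponds to the blade assembly having reached the extended position, after which additional swipes scroll rather than reveal.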
Turning now to fig. 26, another method 2600 illustrating a different content presentation is shown. The method 2600 operates largely in the same manner as the method (2500) of fig. 25. However, at step 2601, instead of initially presenting the home screen presentation (2508), the flexible display 104 presents the active application interface 2608 of an active application. In this example, the active application is a text messaging application.
Step 2601 illustrates the electronic device 100 having the device housing 101 and the blade assembly 102 carrying a blade and the flexible display 104. The blade assembly 102 is slidably coupled to the device housing 101. The translation mechanism is operable to slide the blade assembly 102 relative to the device housing 101 between an extended position, a retracted position, and a peek position. At step 2601, the blade assembly 102 is in the retracted position. Further, one or more processors of the electronic device 100 operable with the translation mechanism present the active application interface 2608 of an active application on a front portion of the flexible display 104. Additionally, the electronic device 100 is in an unlocked state.
At step 2601, while the one or more processors of the electronic device 100 are presenting the active application interface 2608 of the active application, the user 2520 again delivers a swipe gesture 2507 to the flexible display 104. In this example, the swipe gesture 2507 is an "up" swipe gesture in that it begins near the curved portion of the flexible display 104 and moves toward the top end of the device housing 101 and blade assembly 102.
At step 2602, the method 2600 detects the swipe gesture 2507 using the flexible display 104 carried by the blade assembly 102. Since the one or more processors are presenting the active application interface 2608 of the active application, at step 2603 the one or more processors determine the additional display area required by the operational state of the active application. In this example, the user 2520 wants to respond to one of the text messages. Thus, additional display area is required to present a user interface control 2606 and an editing window 2607. At step 2603, the one or more processors determine the amount of area required for the user interface control 2606 and the editing window 2607.
Step 2603 also includes translating the blade assembly 102 toward the extended position with the translation mechanism in response to the swipe gesture 2507. Step 2604 includes interrupting the translation when the amount of area required for the user interface control 2606 and the editing window 2607 is revealed on the front portion of the electronic device 100. As shown at step 2605, application content is newly presented on the flexible display 104. In this illustration, the application content includes the user interface control 2606 and the editing window 2607. However, other examples of application content, as well as other applications, will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
While some embodiments of the present disclosure translate the blade assembly 102 in response to the swipe gesture 2507 only when the electronic device 100 is in the unlocked state, this is not necessarily always the case. Embodiments of the present disclosure contemplate that the user 2520 may want to translate the blade assembly 102 in response to the swipe gesture 2507 even when the electronic device 100 is in a locked state. By way of illustration, the user 2520 may want to see additional notifications that have been received without unlocking the electronic device 100. Turning now to fig. 27, illustrated therein is a method 2700 showing how this may occur.
Beginning at step 2701, the electronic device 100 is illustrated having the device housing 101 and the blade assembly 102 carrying a blade and the flexible display 104. The blade assembly 102 is slidably coupled to the device housing 101. The translation mechanism is operable to slide the blade assembly 102 relative to the device housing 101 between an extended position, a retracted position, and a peek position. At step 2701, the blade assembly 102 is in the retracted position. Further, one or more processors of the electronic device 100 operable with the translation mechanism present a lock screen presentation 2708 on a front portion of the flexible display 104. Additionally, the electronic device 100 is in a locked state.
At step 2701, the user 2520 delivers a swipe gesture 2507 to the flexible display 104 while the one or more processors of the electronic device 100 are presenting the lock screen presentation 2708. At step 2702, the one or more processors of the electronic device 100 detect the swipe gesture 2507.
At step 2703, the one or more processors cause the translation mechanism to translate the blade assembly 102 toward the extended position. At step 2703, the one or more processors present additional content on a front portion of the flexible display 104 that is revealed by the translation of the flexible display 104. At step 2704, the one or more processors cause the translation mechanism to interrupt translation of the flexible display 104 when a sufficient amount of the flexible display 104 is revealed on the front portion of the electronic device 100.
In this illustrative embodiment, as shown at step 2705, the additional content presented at step 2703 includes one or more notification messages 2706, 2707 that were hidden from the front portion of the flexible display 104 before the translation occurred at step 2703. Advantageously, the user 2520 may simply deliver the swipe gesture 2507 to the flexible display 104 to extend it, thereby providing additional area on the front portion of the flexible display 104 for more content, whether that content is one or more notification messages 2706, 2707, user interface controls, application content, an application tray, user actuation targets, or other content hidden from the front portion of the flexible display 104 prior to the translation occurring at step 2703.
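The contrast between methods 2400 and 2700, where an unlocked device may reveal full application content but a locked device reveals only pending notifications, can be summarized in one dispatch function. The function name and the content labels are hypothetical illustrations, not terms from the patent.

```python
def content_revealed_by_swipe(unlocked: bool,
                              notifications: list[str]) -> list[str]:
    """Choose what to present on the newly revealed front portion.
    A locked device (fig. 27) still reveals pending notification
    messages; an unlocked device may additionally reveal the
    application tray (fig. 25)."""
    if unlocked:
        return ["application_tray", *notifications]
    return list(notifications)  # locked: notifications only

assert content_revealed_by_swipe(False, ["msg1", "msg2"]) == ["msg1", "msg2"]
assert content_revealed_by_swipe(True, []) == ["application_tray"]
```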
Turning now to fig. 28, various embodiments of the present disclosure are illustrated therein. The embodiments of fig. 28 are shown as labeled boxes because the individual components of these embodiments have been described in detail in figs. 1 to 27. Since these items have been illustrated and described previously, their repeated description is not necessary for a proper understanding of the embodiments.
At 2801, a method in an electronic device includes detecting a user input defining a swipe gesture with a flexible display carried by a blade assembly slidably coupled to a device housing and movable between at least an extended position and a retracted position. At 2801, the method includes translating, by a translation mechanism operable with the blade assembly, the blade assembly toward the extended position in response to the swipe gesture. At 2802, the translating of 2801 occurs only when the electronic device is in an unlocked state when the user input defining the swipe gesture is detected.
At 2803, the method of 2802 further includes presenting, by one or more processors operable with the flexible display, a home screen presentation on a front portion of the flexible display while receiving user input. At 2803, the method includes, after the translating occurs, presenting, by the one or more processors, additional content on a front area of the flexible display exposed by the translating.
At 2804, the additional content of 2803 includes one or more user actuation targets defining a set of applications. At 2805, the set of applications of 2804 includes a set of user-defined applications.
At 2806, the set of applications of 2804 includes a set of frequently used applications. At 2807, the set of applications of 2804 includes all applications available for operation on the one or more processors.
At 2808, the method of 2804 further includes detecting again, with the flexible display, additional user input defining another swipe gesture. At 2808, the method includes translating again, by the translation mechanism, the blade assembly toward the extended position in response to the other swipe gesture. At 2808, the method includes presenting, by the one or more processors, another set of applications on another newly exposed front area of the flexible display in response to the additional user input.
At 2809, the method of 2802 further includes presenting, by one or more processors operable with the flexible display, an active application interface presentation of an active application on a front portion of the flexible display while the user input occurs. At 2809, the method includes determining, by the one or more processors, the additional display area required by the operational state of the active application. At 2809, the method includes, after the translating occurs, presenting, by the one or more processors, additional active application content in the additional display area revealed by the translating.
At 2810, the translating of 2809 occurs by an amount defined by a function of the additional display area. At 2811, the blade assembly of 2801 is further movable to a peek position revealing an image capture device. In one or more embodiments, this may occur in response to a downward-directed swipe gesture.
At 2812, an electronic device includes a device housing, a blade assembly carrying a blade and a flexible display and slidably coupled to the device housing, and a translation mechanism operable to slide the blade assembly relative to the device housing between an extended position, a retracted position, and a peek position. At 2812, the electronic device includes one or more processors operable with the translation mechanism.
At 2812, in response to the flexible display detecting a swipe gesture, the one or more processors cause the translation mechanism to translate the blade assembly toward the extended position. At 2812, the one or more processors also present additional content on a front portion of the flexible display revealed by the translation of the flexible display.
At 2813, the additional content of 2812 includes one or more user actuation targets corresponding to applications operable on the one or more processors when the flexible display is presenting a home screen presentation at the time the swipe gesture is detected at the flexible display. At 2814, the applications of 2813 define a subset of the available applications of the electronic device.
At 2815, the additional content of 2812 includes application content when the flexible display is presenting an active application interface of the active application when the flexible display detects a swipe gesture. At 2816, the application content of 2815 includes user interface controls of the active application. At 2817, the additional content of 2812 includes a notification message that is hidden from the front portion of the flexible display prior to the translation of the flexible display.
At 2818, a method in an electronic device includes detecting a swipe gesture with a flexible display carried by a blade assembly slidably coupled to a device housing and movable between an extended position, a retracted position, and a peek position. At 2818, the method includes translating, with a translation mechanism, the blade assembly toward the extended position in response to the swipe gesture.
At 2819, the translating of 2818 occurs only when the electronic device is in an unlocked state when the swipe gesture is detected. At 2820, the method of 2818 further includes presenting, with one or more processors, additional content on a front portion of the flexible display revealed by the translating.
In the foregoing specification, specific embodiments of the present disclosure have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present disclosure as set forth in the claims below. Thus, while the preferred embodiments of the present disclosure have been illustrated and described, it will be clear that the present disclosure is not so limited. Numerous modifications, changes, variations, substitutions and equivalents will occur to those skilled in the art without departing from the spirit and scope of the present disclosure as defined by the following claims.
The specification and figures are, accordingly, to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present disclosure. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or element of any or all the claims.

Claims (20)

1. A method in an electronic device, the method comprising:
detecting a user input defining a swipe gesture with a flexible display carried by a blade assembly slidably coupled to a device housing and movable between at least an extended position and a retracted position; and
translating, by a translation mechanism operable with the blade assembly, the blade assembly toward the extended position in response to the swipe gesture.
2. The method of claim 1, wherein the translating occurs only when the electronic device is in an unlocked state when the user input defining the swipe gesture is detected.
3. The method of claim 2, further comprising:
presenting, by one or more processors operable with the flexible display, a home screen presentation on a front portion of the flexible display while receiving the user input; and
presenting, by the one or more processors, additional content on a front area of the flexible display exposed by the translating after the translating occurs.
4. A method according to claim 3, wherein the additional content comprises one or more user actuation targets defining a set of applications.
5. The method of claim 4, wherein the set of applications comprises a set of user-defined applications.
6. The method of claim 4, wherein the set of applications comprises a set of frequently used applications.
7. The method of claim 4, wherein the set of applications includes all applications available for operation on the one or more processors.
8. The method of claim 4, further comprising:
detecting again, with the flexible display, additional user input defining another swipe gesture;
translating again, by the translation mechanism, the blade assembly toward the extended position in response to the other swipe gesture; and
presenting, by the one or more processors, another set of applications on another newly exposed front area of the flexible display in response to the additional user input.
9. The method of claim 2, further comprising:
presenting, by one or more processors operable with the flexible display, an active application interface presentation of an active application on a front portion of the flexible display while the user input occurs;
determining, by the one or more processors, additional display area required by an operational state of the active application; and
presenting, by the one or more processors, additional active application content in the additional display area revealed by the translating after the translating occurs.
10. The method of claim 9, wherein the translating occurs by an amount defined by a function of the additional display area.
11. The method of claim 1, wherein the blade assembly is further movable to a peek position revealing an image capture device.
12. An electronic device, comprising:
a device housing;
a blade assembly carrying a blade and a flexible display and slidably coupled to the device housing;
a translation mechanism operable to slide the blade assembly relative to the device housing between an extended position, a retracted position, and a peek position; and
one or more processors operable with the translation mechanism;
wherein, in response to the flexible display detecting a swipe gesture, the one or more processors:
cause the translation mechanism to translate the blade assembly toward the extended position; and
present additional content on a front portion of the flexible display revealed by the translation of the flexible display.
13. The electronic device of claim 12, wherein the additional content includes one or more user actuation targets corresponding to applications operable on the one or more processors when the flexible display is presenting a home screen when the swipe gesture is detected at the flexible display.
14. The electronic device of claim 13, wherein the application defines a subset of available applications of the electronic device.
15. The electronic device of claim 12, wherein the additional content includes application content when the flexible display is presenting an active application interface of an active application when the flexible display detects the swipe gesture.
16. The electronic device of claim 15, wherein the application content comprises a user interface control of the active application.
17. The electronic device of claim 12, wherein the additional content comprises a notification message hidden from a front portion of the flexible display prior to the translation of the flexible display.
18. A method in an electronic device, the method comprising:
detecting a swipe gesture with a flexible display carried by a blade assembly slidably coupled to a device housing and movable between an extended position, a retracted position, and a peek position; and
translating, with a translation mechanism, the blade assembly toward the extended position in response to the swipe gesture.
19. The method of claim 18, wherein the translating occurs only when the electronic device is in an unlocked state when the swipe gesture is detected.
20. The method of claim 18, further comprising presenting, with one or more processors, additional content on a front portion of the flexible display revealed by the translating.
CN202310095484.1A 2022-10-17 2023-01-20 Method and system for controlling a translating flexible display of an electronic device in response to a scrolling user input Pending CN117908746A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/114,663 US20240129392A1 (en) 2022-10-17 2023-02-27 Methods and Systems for Controlling a Translating Flexible Display of an Electronic Device in Response to Scrolling User Input

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US63/416,927 2022-10-17
US202263419994P 2022-10-27 2022-10-27
US63/419,994 2022-10-27

Publications (1)

Publication Number Publication Date
CN117908746A true CN117908746A (en) 2024-04-19


