CN113396379A - Interaction method, flexible electronic device and readable storage medium - Google Patents
Interaction method, flexible electronic device and readable storage medium
- Publication number
- CN113396379A (application number CN201980079779.1A)
- Authority
- CN
- China
- Prior art keywords
- screen
- icon
- floating
- displayed
- window
- Prior art date: 2019-04-11
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses an interaction method applied to a flexible electronic device, wherein the flexible electronic device comprises a flexible display screen; the flexible display screen is provided with a bending area, and when the flexible electronic device is in a folded state, the flexible display screen is divided into a first screen and a second screen by the bending area. The interaction method comprises the following steps: when the flexible electronic device is in the folded state and the second screen receives a target interaction object shared by the first screen, a floating icon is controlled to be displayed on the second screen, and corresponding indication information is controlled to be displayed in the floating icon. The application also discloses a flexible electronic device (100) and a readable storage medium. File sharing and interactive operation between the first screen (12) and the second screen (13) of the flexible electronic device (100) can thus be better achieved.
Description
The present application relates to the field of intelligent terminal technologies, and in particular, to an interaction method, a flexible electronic device, and a readable storage medium.
However, how to realize interactive sharing between different screens of the same electronic device (for example, a dual-screen electronic device) or between different screen areas of the same screen (for example, the split-screen mode of a single-screen electronic device) remains a goal pursued by the industry.
Disclosure of Invention
The embodiment of the application discloses an interaction method, a flexible electronic device and a readable storage medium to solve the problems.
The interaction method disclosed by the embodiment of the application is applied to a flexible electronic device, wherein the flexible electronic device comprises a flexible display screen; the flexible display screen is provided with a bending area, and when the flexible electronic device is in a folding state, the flexible display screen is divided into a first screen and a second screen by the bending area; the interaction method comprises the following steps:
when the flexible electronic device is in a folded state and the second screen receives the target interaction object shared by the first screen, the floating icon is controlled to be displayed on the second screen, and corresponding indication information is controlled to be displayed in the floating icon.
The embodiment of the application discloses a flexible electronic device, which comprises a flexible display screen, a memory and a processor; the flexible display screen is provided with a bending area, and when the flexible electronic device is in a folding state, the flexible display screen is divided into a first screen and a second screen by the bending area; interactive program instructions are stored in the memory; the processor executes the interactive program instructions to perform the interactive method as described above.
The readable storage medium disclosed in the embodiment of the application stores a program instruction of an interaction method, and the program instruction is used for executing the interaction method after being called.
The application discloses an interaction method, a flexible electronic device and a readable storage medium. When the flexible electronic device is in the folded state and the second screen receives the target interaction object shared by the first screen, a floating icon is controlled to be displayed on the second screen, and corresponding indication information is controlled to be displayed in the floating icon. This reminds the user that a target interaction object has been received, while avoiding any influence of the received target interaction object on the interface currently displayed on the second screen; that is, a disturbance-prevention effect is achieved at the same time as the reminding effect. In this way, information or application sharing interaction between the first screen and the second screen can be better achieved.
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic view of a flexible electronic device in a folded state according to an embodiment of the present application.
Fig. 2 is a schematic diagram of a flexible electronic device in a fully folded state according to an embodiment of the present application.
Fig. 3 is a schematic view of a flexible electronic device in an unfolded state according to an embodiment of the present application.
Fig. 4 is a flowchart of an interaction method in the first embodiment of the present application.
Fig. 5 is a diagram illustrating a floating window and a floating icon according to an embodiment of the present application.
Fig. 6 is a diagram illustrating an operation window displayed around a floating icon.
Fig. 7 is a flowchart of an interaction method in the second embodiment of the present application.
Fig. 8 is a flowchart of an interaction method in the third embodiment of the present application.
Fig. 9 is a flowchart of an interaction method in a fourth embodiment of the present application.
Fig. 10 is a schematic diagram illustrating a display position of a floating window or an operation window in an embodiment of the present application.
Fig. 11 is a block diagram of a flexible electronic device according to an embodiment of the present application.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein without making any inventive effort fall within the scope of the present application.
It is to be understood that the terminology used in the embodiments of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1-3, the flexible electronic device 100 disclosed in the present application may be, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, etc., and is not limited herein. The network environment of the flexible electronic device 100 includes, but is not limited to, the Internet, a wide area network, a local area network, a metropolitan area network, a Virtual Private Network (VPN), and the like.
The flexible electronic device 100 refers to an electronic device having a flexible screen. Such an electronic device can be bent, so that when a user uses the flexible electronic device 100, the flexible electronic device can be bent into a desired shape, so as to adapt the shape of the flexible electronic device to the current use requirement of the user. On the other hand, when the user does not need to use the device, the occupied space can be reduced by bending, and the portability of the device is improved.
In some embodiments, the flexible electronic device 100 includes a flexible display screen 10. The flexible display screen 10 may be an Organic Light-Emitting Diode (OLED) flexible display screen, and has the characteristics of being bendable, twistable, foldable, better in color and contrast, and ultra-thin.
In some embodiments, the flexible display screen 10 has a bending region 11, and when the flexible electronic device 100 is in a folded state, the flexible display screen 10 is divided into a first screen 12 and a second screen 13 by the bending region 11. When the flexible electronic device 100 is in an unfolded state, the first screen 12, the bending region 11 and the second screen 13 form a continuous plane. The folded state is that an included angle θ between the first screen 12 and the second screen 13 of the flexible electronic device 100 divided by the bending region 11 is smaller than 180 degrees, and when the included angle between the first screen 12 and the second screen 13 is 180 degrees, the flexible electronic device 100 is in the unfolded state. The included angle θ can be obtained by an angle sensor (not shown) disposed on the bending region 11.
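For readers approaching this from an implementation angle, the angle-based distinction between the folded and unfolded states can be summarized in a minimal sketch; the enum, the function name and the idea of feeding it from an angle sensor in the bending region are illustrative assumptions, not the disclosed implementation.

```kotlin
// Minimal sketch (assumption): classify the fold state from the included
// angle θ reported by an angle sensor disposed on the bending region.
enum class FoldState { FOLDED, UNFOLDED }

fun classifyFoldState(includedAngleDegrees: Float): FoldState =
    // Folded whenever the included angle between the first and second screen
    // is smaller than 180 degrees; exactly 180 degrees means unfolded.
    if (includedAngleDegrees < 180f) FoldState.FOLDED else FoldState.UNFOLDED
```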
In this embodiment, the flexible display screen 10 includes a display surface and a non-display surface opposite to the display surface, and the included angle between the first screen 12 and the second screen 13 is an included angle between the non-display surface of the first screen 12 and the non-display surface of the second screen 13, which is divided by the bending area 11. Therefore, when the flexible electronic device 100 is in a folded state, the display surface of the first screen 12 and the display surface of the second screen 13 may face different users.
In some embodiments, the bending region 11 defines a folding line 101, and the bending region 11 is divided into a first side screen area 111 and a second side screen area 112 by the folding line 101. The first screen 12 is connected to the first side screen area 111, and the second screen 13 is connected to the second side screen area 112. Specifically, the first screen 12 further defines a first display area 102, and the second screen 13 further defines a second display area 103; when the flexible electronic device 100 is in the unfolded state, the first display area 102 and the second display area 103 are respectively located on two sides of the folding line 101. It is understood that when the flexible electronic device 100 is a full-screen electronic device, the area of the first display area 102 is slightly smaller than the area of the first screen 12, and similarly, the area of the second display area 103 is slightly smaller than the area of the second screen 13.
Referring to fig. 4, a flowchart of an interaction method according to an embodiment of the present application is shown. The interaction method is applied to the flexible electronic device 100, and the interaction method in the embodiment of the present application is described in detail below.
And step S41, when the flexible electronic device is in a folded state and the second screen receives the target interactive object shared by the first screen, controlling to display a floating icon on the second screen.
Referring to fig. 5, in some embodiments, the floating icon 105 is substantially circular like a bubble to enhance the visual experience of the user. In other embodiments, the shape of the floating icon 105 may be a star, a square, a polygon, and the like, which is not limited herein.
In step S42, control displays corresponding instruction information in the floating icon.
The indication information comprises at least one of text indication information, number indication information and color indication information. For example, when the indication information is number indication information, it may indicate the number of target interaction objects shared to the second screen 13. As shown in fig. 5, when the number 3 is displayed in the floating icon 105, it may indicate that there are currently 3 target interaction objects shared to the second screen 13. In other embodiments, text indication information may also be used, for example, the letter a is used to indicate the number 1, and the letter b is used to indicate the number 2, which is not limited herein. In addition, the status of the currently received target interaction objects may also be indicated by color indication information. For example, when the indication information is red, it may indicate that a larger number of target interaction objects are currently received, and when the indication information is green, it may indicate that a smaller number of target interaction objects are currently received.
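As a rough illustration of how number-type and color-type indication information might be derived from the pending count, consider the sketch below; the data class, the count threshold and the ARGB values are assumptions made only for the example.

```kotlin
// Illustrative sketch only: map the number of pending target interaction
// objects to number-type and color-type indication information.
data class Indication(val text: String, val colorArgb: Long)

fun indicationFor(pendingCount: Int): Indication {
    // Assumption: a "larger number" is rendered red, a "smaller number" green;
    // the threshold of 5 is arbitrary and not taken from the description.
    val color = if (pendingCount > 5) 0xFFE53935 else 0xFF43A047
    return Indication(text = pendingCount.toString(), colorArgb = color)
}
```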
It should be noted that, when an application interface is displayed on the second screen 13, the floating icon 105 is suspended on the currently displayed application interface. For example, when a video playing application interface is displayed on the second screen 13, the floating icon 105 may be suspended on the current video playing application interface.
The interaction method disclosed by the embodiment of the application, when the second screen 13 receives the target interaction object shared by the first screen, the floating icon 105 is displayed on the second screen 13, and the corresponding indication information is displayed in the floating icon 105, so that a user receiving the target interaction object is reminded, the influence of the received target interaction object on the existing display interface on the second screen 13 can be avoided, and the effect of preventing disturbance is achieved while the user is reminded. In this way, information or application sharing interaction between the first screen 12 and the second screen 13 can be better achieved.
In some embodiments, controlling to display a floating icon 105 on the second screen 13 and controlling to display corresponding indication information within the floating icon includes: judging whether a floating icon 105 is displayed on the second screen 13; when no floating icon 105 is displayed on the second screen 13, creating the floating icon 105, controlling the created floating icon 105 to be displayed on the second screen 13, and controlling corresponding indication information to be displayed in the created floating icon 105; and when a floating icon 105 is already displayed on the second screen 13, controlling to change the indication information displayed in the floating icon 105.
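The create-or-update decision just described can be sketched as follows; `SecondScreen` and `FloatingIcon` are hypothetical stand-ins for the device's real UI objects.

```kotlin
// Sketch under stated assumptions: hypothetical minimal types standing in for
// the real floating-icon view and second-screen controller.
class FloatingIcon(var indicationText: String)

interface SecondScreen {
    fun currentFloatingIcon(): FloatingIcon?   // null if no floating icon is shown
    fun show(icon: FloatingIcon)               // suspend the icon over the current interface
}

fun onTargetObjectReceived(screen: SecondScreen, pendingCount: Int) {
    val existing = screen.currentFloatingIcon()
    if (existing == null) {
        // Not displayed yet: create the floating icon and show the indication in it.
        screen.show(FloatingIcon(indicationText = pendingCount.toString()))
    } else {
        // Already displayed: only change the indication information.
        existing.indicationText = pendingCount.toString()
    }
}
```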
Please refer to fig. 6, which is a diagram illustrating an operation window 106 displayed around the floating icon 105. In some embodiments, when the floating icon 105 receives a touch operation applied to it by a user, an operation window 106 is controlled to be displayed on the second screen 13, and identification information related to the target interaction object, together with a first operation icon 1061 and a second operation icon 1062 for controlling the target interaction object to perform a preset operation, is displayed in the operation window 106. The identification information refers to the type of the target interaction object; for example, the identification information may be one of a link, a picture, and an application.
In some embodiments, when the number of target interaction objects is multiple, each target interaction object corresponds to one first operation icon 1061 and one second operation icon 1062. In other embodiments, all the target interaction objects may share the same first operation icon 1061 and the same second operation icon 1062. When all the target interaction objects share the same first operation icon 1061 and the same second operation icon 1062, a target interaction object to be operated needs to be selected first, and then the first operation icon 1061 or the second operation icon 1062 is selected for operation.
In some embodiments, the operation window 106 is displayed around the floating icon 105.
In some embodiments, after the operation window 106 appears, if the floating icon 105 receives a touch operation applied by the user again, the operation window 106 is controlled not to be displayed on the second screen 13; or, if the operation window 106 does not receive any operation of the user within a predetermined time, the operation window 106 is controlled not to be displayed on the second screen 13. In this way, the hiding of the operation window 106 can be controlled according to the operation of the user or according to time, to reduce the occlusion of the application interface currently displayed on the second screen 13. The operation window 106 not receiving a user operation within the predetermined time means that neither the first operation icon 1061 nor the second operation icon 1062 of any target interaction object receives a user operation within the predetermined time. For example, after the operation window 106 is displayed, if no operation of the user is received within 3 seconds, the operation window 106 is controlled not to be displayed, so as to reduce the occlusion of the current application interface.
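A possible way to realize the two hiding conditions (a second tap on the floating icon, or no operation within the predetermined time) is sketched below; the controller class and its callbacks are hypothetical, and the 3-second timeout is simply the example value from the text.

```kotlin
import java.util.Timer
import kotlin.concurrent.schedule

// Sketch only: hide the operation window either on a second tap of the
// floating icon or after 3 seconds without any user operation inside it.
class OperationWindowController(private val hideWindow: () -> Unit) {
    private val timer = Timer()
    private val pendingAutoHide = timer.schedule(3_000) { hideWindow() }

    fun onUserOperationInsideWindow() {
        // An operation on any first/second operation icon cancels the auto-hide.
        pendingAutoHide.cancel()
    }

    fun onFloatingIconTappedAgain() {
        // A second tap on the floating icon hides the window immediately.
        pendingAutoHide.cancel()
        hideWindow()
    }
}
```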
Referring to fig. 7, fig. 7 is a flowchart illustrating an interaction method according to a second embodiment of the present application. In some embodiments, after the operation window 106 is displayed, the interaction method further includes the following steps.
Step S71, judging whether one of the first operation icon or the second operation icon receives user operation, if yes, entering step S72; if not, the process proceeds to step S73.
Step S72, executing an instruction corresponding to the user operation received by one of the first operation icon or the second operation icon.
Specifically, when the first operation icon 1061 receives a user operation, in response to the operation received by the first operation icon 1061, the detailed content of the target interaction object corresponding to that first operation icon 1061 is displayed on the second screen 13, or the application program corresponding to that target interaction object is run; when the second operation icon 1062 receives a user operation, in response to the operation received by the second operation icon 1062, the identification information, the first operation icon 1061 and the second operation icon 1062 that are displayed in the operation window 106 and are related to the target interaction object corresponding to that second operation icon 1062 are deleted.
In this embodiment, the first operation icon 1061 is used to open the corresponding target interaction object, and the second operation icon 1062 is used to close the corresponding target interaction object. For example, the first operation icon 1061 is an "open" operation icon, and the second operation icon 1062 is a "close" operation icon. When the target interaction object is a link and the "open" operation icon corresponding to the link receives the operation of the user, a browser application program installed in the flexible electronic device 100 is called to open the link; when the target interaction object is a picture and the "open" operation icon corresponding to the picture receives the operation of the user, an album application program installed in the flexible electronic device 100 is called to directly open the picture; when the target interaction object is an application program and the "open" operation icon corresponding to the application program receives the operation of the user, the application program is directly run in the foreground, and the application interface of the application program is displayed on the second screen 13. If the "close" operation icon receives the operation of the user, the target interaction object corresponding to the "close" operation icon is not responded to.
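How the "open" operation icon might dispatch on the three target types named above (link, picture, application program) is sketched here; the `Launchers` interface and its method names are hypothetical placeholders for the browser, album and foreground-launch facilities of the device.

```kotlin
// Sketch only: dispatch the "open" action by target interaction object type.
sealed class TargetObject {
    data class Link(val url: String) : TargetObject()
    data class Picture(val path: String) : TargetObject()
    data class App(val packageName: String) : TargetObject()
}

interface Launchers {
    fun openInBrowser(url: String)     // call the installed browser application
    fun openInAlbum(path: String)      // call the installed album application
    fun runInForeground(pkg: String)   // run the app, showing it on the second screen
}

fun onOpenIconTapped(target: TargetObject, launchers: Launchers) = when (target) {
    is TargetObject.Link -> launchers.openInBrowser(target.url)
    is TargetObject.Picture -> launchers.openInAlbum(target.path)
    is TargetObject.App -> launchers.runInForeground(target.packageName)
}
```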
In other embodiments, when the second operation icon 1062 receives a user operation, feedback information indicating that receipt of the target interaction object is rejected may be controlled to be sent to the first screen 12. For example, when the second operation icon 1062 receives an operation of the user, feedback information of "the other party has rejected" may be displayed on the first screen 12. Of course, when the first operation icon 1061 receives the operation of the user, feedback information indicating that the target interaction object has been received may also be controlled to be sent to the first screen 12. For example, when the first operation icon 1061 receives an operation of the user, feedback information of "the other party has received" may be displayed on the first screen 12.
In addition, in some embodiments, when the first operation icon 1061 or the second operation icon 1062 receives a user operation, only the indication information in the floating icon 105 is controlled to be changed. For example, before any second operation icon 1062 receives a user operation, the indication information displayed in the floating icon 105 is the numeral 3, indicating that 3 target interaction objects are currently received; if one second operation icon 1062 then receives a user operation, the indication information may be changed to the numeral 2, indicating that receipt of one target interaction object has been refused.
Step S73, determining whether the first operation icon and the second operation icon of at least one target interaction object in the operation window have both not received a user operation. If yes, go to step S74; if not, step S75 is executed.
And step S74, controlling the operation window and the floating icon to be continuously displayed on the second screen.
When the first operation icon 1061 and the second operation icon 1062 of at least one target interaction object in the operation window 106 have not received a user operation, it indicates that a received target interaction object has not been processed. Therefore, the operation window 106 needs to remain displayed to prompt the user that the received target interaction object has not been processed and to provide an operation interface for the user.
And step S75, controlling the second screen not to display the operation window and the floating icon.
When one of the first operation icon 1061 and the second operation icon 1062 of every target interaction object in the operation window 106 has received a user operation, indicating that all the received target interaction objects have been processed, the operation window 106 and the floating icon 105 may be controlled to disappear from the second screen 13, so as to avoid affecting the user's continued viewing and use of the second screen 13.
Referring to fig. 8, fig. 8 is a flowchart illustrating an interaction method according to a third embodiment of the present application. In some embodiments, the interaction method further comprises the following steps.
And step S81, when the flexible electronic device is in a folded state, responding to a preset selection operation of a user to select a target interactive object from the first screen.
In one embodiment, the preset selection operation includes, but is not limited to, a clicking operation of touching the area where the interaction object is located, a frame selection operation of drawing a closed frame to enclose the interaction object to be selected, and the like. The number of target interaction objects selected by the user may be one or more, and the specific number is not limited. The target interaction object comprises at least one of a picture, a link and an application program. For example, the target interaction object may be a picture, a web link, or an application program (e.g., the "iQIYI" video application).
And step S82, creating a floating window in response to a specific operation applied on the first screen by a user and controlling the floating window to be displayed on the first screen.
Referring again to fig. 5, the floating window 104 is similar to the floating icon 105, and is also substantially circular, similar to a floating bubble, thereby improving the visual experience of the user. In other embodiments, the floating window 104 may also have other shapes such as a square shape, a heart shape, and the like, which is not limited herein.
The specific operation includes, but is not limited to, double-clicking the first screen 12, applying a specific gesture operation on the first screen 12 (e.g., drawing a pattern of a specific shape on the first screen 12, such as a straight line, a broken line, a curved line, a circle, a square, a star, etc.), long-pressing a certain position of the first display area 102 of the first screen 12, long-pressing a certain physical key (e.g., a volume adjustment key, a Home key) of the flexible electronic device 100, and the like.
In some embodiments, the specific operation is a long-press operation on the target interactive object selected by the user. That is, when the user presses the selected target interactive object for a long time, the floating window 104 can be created and controlled to be displayed on the first screen 12, thereby facilitating the user operation.
In another embodiment, the specific operation is a long-press operation received in an area of the first screen 12 where no interactive object is located. In this way, the operation of triggering the floating window 104 can be distinguished from the operation of touching an interactive object, and misjudgment by the system is avoided.
It should be noted that, when an application interface is displayed on the first screen 12, the floating window 104 is suspended on the currently displayed application interface.
Step S83, sharing the target interaction object to the second screen in response to a drag operation received by the target interaction object, wherein the drag operation drags the target interaction object into the floating window.
The drag operation comprises an operation of touching the current target interaction object with a single finger and moving it, and an operation of touching the current target interaction object with two fingers and moving it.
Step S84, controlling to display a floating icon on the second screen and controlling to display corresponding indication information in the floating icon.
In some embodiments, the target interaction object is received by the second screen 13 when the target interaction object is dragged into the floating window 104.
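The sharing trigger described in this embodiment amounts to a hit test on drag release; the sketch below is an assumption-level illustration that uses a hand-rolled rectangle type rather than any real framework class.

```kotlin
// Sketch only: the target interaction object counts as shared to the second
// screen when the drag ends inside the floating window's bounds.
data class Bounds(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

fun isSharedOnDrop(dropX: Float, dropY: Float, floatingWindowBounds: Bounds): Boolean =
    // true: hand the object over to the second screen (which then shows or
    // updates its floating icon); false: treat it as an ordinary in-screen move.
    floatingWindowBounds.contains(dropX, dropY)
```

Keeping the share path behind this hit test is what separates a sharing gesture from an ordinary file-moving drag, as the next paragraph notes.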
It should be noted that, in some embodiments, the order of step S81 and step S82 may be interchanged.
According to the interaction method disclosed in the application, when the flexible electronic device 100 is in a folded state, a target interaction object is selected from the first screen 12 in response to a preset selection operation of a user, a floating window 104 is then created in response to a specific operation applied to the first screen 12 by the user and controlled to be displayed on the first screen 12, and the target interaction object is then shared to the second screen 13 in response to a drag operation received by the target interaction object. In this way, information or application sharing interaction between the first screen 12 and the second screen 13 can be better achieved. In addition, because the target interaction object is shared to the second screen 13 by dragging it into the floating window 104, the sharing operation can be distinguished from an operation of moving a file within the screen, confusion between moving a file and sharing a file is avoided, and the user experience is improved.
In other embodiments, when the target interactive object displayed on the first screen 12 is dragged onto the floating window 104 on the first screen 12, it is determined that the second screen 13 receives the target interactive object shared from the first screen 12, wherein the target interactive object is selected when the first screen 12 receives a preset selection operation; when the first screen 12 receives a specific operation, the floating window 104 is created and displayed on the first screen 12.
Referring to fig. 9, a flowchart of an interaction method in a fourth embodiment of the present application is shown. In some embodiments, step S83 and step S84 in fig. 8 may be omitted, and the interaction method further includes the steps of:
step S91, determine whether the flexible electronic device is in the tent mode. If yes, go to step S81; if not, the process ends.
The tent mode means that the flexible electronic device 100 is in a folded state and the included angle θ between the first screen 12 and the second screen 13 is greater than a first preset angle and smaller than a second preset angle. In addition, when the flexible electronic device 100 is in the tent mode, operations on the first screen 12 and operations on the second screen 13 do not affect each other.
In order to enable the flexible electronic device 100 to be stably placed on a bearing object (e.g., a desktop) such that the first screen 12 and the second screen 13 face different users without the users affecting each other, the first preset angle is preferably 30 degrees and the second preset angle is preferably 150 degrees. In other embodiments, the first preset angle and the second preset angle may be set according to specific design requirements.
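Expressed as code, the tent-mode test reduces to a bounds check on the included angle; the constant names below are illustrative, and the 30/150-degree values are the preferred values quoted in the text.

```kotlin
// Sketch only: tent mode = folded, with the included angle strictly between
// the first and second preset angles (30° and 150° in the preferred example).
const val FIRST_PRESET_ANGLE_DEG = 30f
const val SECOND_PRESET_ANGLE_DEG = 150f

fun isTentMode(isFolded: Boolean, includedAngleDegrees: Float): Boolean =
    isFolded &&
        includedAngleDegrees > FIRST_PRESET_ANGLE_DEG &&
        includedAngleDegrees < SECOND_PRESET_ANGLE_DEG
```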
Step S92, judging whether at least one target interaction object in the floating window is not shared to the second screen; if yes, go to step S93; if not, step S94 is executed.
In step S93, the floating window 104 is controlled to be continuously displayed on the first screen 12.
When at least one target interactive object in the floating window 104 is not shared to the second screen 13, the floating window 104 is controlled to be continuously displayed on the first screen 12 to remind a user that the current target interactive object is not shared, so that the user can clearly know the sharing process of the target interactive object, and the user experience is improved.
Step S94, controlling the floating window 104 not to be displayed on the first screen 12.
In the present embodiment, when all the target interaction objects dragged into the floating window 104 are shared to the second screen 13, the floating window 104 will affect the viewing and using experience of the user if continuously displayed on the first screen 12. Therefore, when the target interaction objects are all shared to the second screen 13, the floating window 104 can be controlled to disappear, and further, the influence on the viewing and the use of the first screen 12 due to the appearance duration of the floating window 104 can be reduced.
In some embodiments, the interaction method further comprises the steps of: changing the position of the floating window 104 in response to a specific gesture operation applied to the floating window 104; and/or changing the position of the floating icon 105 in response to a specific gesture operation applied to the floating icon 105. Here, "changing the position of the floating window 104" and "changing the position of the floating icon 105" mean that the areas of the floating window 104 and the floating icon 105 are not changed while their positions are changed, thereby moving the floating window 104 and the floating icon 105. The specific gesture operations for moving the floating window 104 and the floating icon 105 include, but are not limited to, touching the floating window 104 or the floating icon 105 with a single finger and moving it, and touching the floating window 104 or the floating icon 105 with two fingers and moving it.
Please refer to fig. 10, which is a diagram illustrating a display position of the floating window 104 or the floating icon 105. As shown in fig. 10, the first screen 12 defines a first display area 102 and a first non-display area 107, and the second screen defines a second display area 103 and a second non-display area 108. The floating window 104 and the floating icon 105 are always displayed at the boundary of the first display area 102 and the second display area 103, so that the visual impact on the user can be reduced. The boundary refers to the area of the first display area 102 close to the first non-display area 107 and the edge area S of the second display area 103 close to the second non-display area 108. The shape and size of the edge area S are not limited herein.
It is understood that, in other embodiments, when the flexible display screen 10 is a bezel-less or full screen, the first screen 12 and the second screen 13 may define only a display area and no non-display area. In that case, the floating window 104 and the floating icon 105 only need to be displayed on the boundary between the first display area 102 and the second display area 103.
It should be noted that, although the floating window 104 and the floating icon 105 may be moved anywhere within the display area according to the operation of the user, they eventually stay at the edge area of the display area. For example, when a specific gesture operation is applied to the floating window 104 or the floating icon 105 by the user, the floating window 104 or the floating icon 105 may be moved to the central position of the display area; however, when the user releases the gesture operation, the floating window 104 or the floating icon 105 automatically moves back to the edge area of the display area.
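The snap-back behaviour can be pictured with a one-dimensional sketch: while the gesture is held the element follows the finger, and on release it returns to whichever edge area is closer. The geometry below is deliberately simplified and purely illustrative.

```kotlin
// Sketch only: after the user releases the gesture, move the floating window
// or floating icon to the nearer horizontal edge of the display area.
fun snappedX(releaseX: Float, displayLeft: Float, displayRight: Float): Float {
    val toLeft = releaseX - displayLeft
    val toRight = displayRight - releaseX
    // Stay in the edge area closest to where the element was released.
    return if (toLeft <= toRight) displayLeft else displayRight
}
```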
Please refer to fig. 11, which is a block diagram illustrating a flexible electronic device 100 according to an embodiment of the present application. The flexible electronic device 100 includes, but is not limited to, a flexible display screen 10, a memory 20, and a processor 30. In particular, the flexible display screen 10, the memory 20 and the processor 30 may be coupled by a communication bus 40. It should be understood by those skilled in the art that fig. 11 is only an example of the flexible electronic device 100 and does not constitute a limitation to the flexible electronic device 100; the flexible electronic device 100 may include more or fewer components than those shown, or some components may be combined, or different components may be used. For example, the flexible electronic device 100 may further include an input and output device, a network access device, and the like.
The memory 20 may be used for storing computer programs and/or modules, and the processor 30 implements various functions of the flexible electronic device 100 by running or executing the computer programs and/or modules stored in the memory 20 and calling data stored in the memory 20. The memory 20 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, application programs (such as a sound playing function, an image playing function, etc.) required for a plurality of functions, and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the flexible electronic device 100, and the like. In addition, the memory 20 may include a high speed random access memory, and may also include a non-volatile memory, such as a hard disk, a memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), a plurality of magnetic disk storage devices, a Flash memory device, or other volatile solid state storage devices.
The Processor 30 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete gate or transistor logic device, discrete hardware component, etc. The general purpose processor may be a microprocessor or the processor may be any conventional processor or the like, the processor being the control center of the flexible electronic device 100, various interfaces and lines connecting the various parts of the entire flexible electronic device 100.
Wherein, the memory 20 stores therein interactive program instructions, and the processor 30 is further configured to execute the interactive program instructions to perform all the steps of the above method when the flexible electronic device 100 is in the folded state.
It should be noted that, for simplicity of description, the above-mentioned embodiments of the method are described as a series of acts or combinations, but those skilled in the art should understand that the present application is not limited by the order of acts described, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
The interaction methods provided herein may be implemented in hardware, firmware, or as software or computer code that may be stored in a readable storage medium such as a CD, ROM, RAM, floppy disk, hard disk, or magneto-optical disk, or as computer code that is originally stored on a remote recording medium or a non-transitory machine-readable medium, downloaded over a network, and stored in a local recording medium, so that the methods described herein may be presented in software stored on the recording medium using a general purpose computer or special purpose processor, or in programmable or special purpose hardware such as an ASIC or FPGA. As can be appreciated in the art, a computer, processor, microprocessor, controller or programmable hardware includes a memory component, e.g., RAM, ROM, flash memory, etc., which can store or receive software or computer code when the computer, processor or hardware accesses and executes the software or computer code implementing the processing methods described herein. In addition, when a general-purpose computer accesses code for implementing the processing shown herein, execution of the code transforms the general-purpose computer into a special-purpose computer for performing the processing shown herein.
The readable storage medium may be a solid-state memory, a memory card, an optical disc, etc. The readable storage medium stores program instructions for the computer to call and then execute the interaction method.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and embodiments of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.
Claims (20)
- An interaction method is applied to a flexible electronic device, and the flexible electronic device comprises a flexible display screen; the flexible display screen is provided with a bending area, and when the flexible electronic device is in a folded state, the flexible display screen is divided into a first screen and a second screen by the bending area; the interaction method is characterized by comprising the following steps: when the flexible electronic device is in the folded state and the second screen receives the target interaction object shared by the first screen, a floating icon is controlled to be displayed on the second screen, and corresponding indication information is controlled to be displayed in the floating icon.
- The interaction method of claim 1, wherein said controlling to display a floating icon on said second screen and controlling to display corresponding indication information within the floating icon comprises: judging whether the floating icon is displayed on the second screen; and when the floating icon is not displayed on the second screen, creating the floating icon, and controlling the created floating icon to be displayed on the second screen and corresponding indication information to be displayed in the created floating icon.
- The interaction method of claim 1, wherein said controlling to display a floating icon on said second screen and controlling to display corresponding indication information within the floating icon comprises: judging whether the floating icon is displayed on the second screen; and when the floating icon is displayed on the second screen, controlling to change the indication information displayed on the floating icon.
- The interaction method of claim 1, wherein the indication information comprises at least one of textual indication information, numerical indication information, and color indication information.
- The interaction method according to claim 1, wherein when the floating icon receives a touch operation applied to the floating icon, an operation window is controlled to be displayed on the second screen, and identification information related to the target interaction object, a first operation icon and a second operation icon for controlling the target interaction object to perform a preset operation are displayed within the operation window.
- The interaction method of claim 5, wherein the interaction method further comprises: judging whether one of the first operation icon or the second operation icon in the operation window receives an operation of a user; and when the first operation icon receives the operation of the user, in response to the received operation, controlling to display on the second screen the detailed content of the target interaction object corresponding to the first operation icon that receives the operation of the user, or to run the application program corresponding to the target interaction object corresponding to the first operation icon that receives the operation of the user.
- The interaction method of claim 6, wherein the interaction method further comprises: when the second operation icon receives the operation of the user, in response to the received operation, deleting the identification information, the first operation icon and the second operation icon that are displayed in the operation window and are related to the target interaction object corresponding to the second operation icon that receives the operation of the user.
- The interaction method according to claim 7, wherein when the first operation icon or the second operation icon receives an operation by a user, the indication information in the floating icon is controlled to be changed.
- The interaction method of claim 5, wherein the interaction method further comprises: judging whether the first operation icon and the second operation icon of at least one target interaction object in the operation window have not received a user operation; and when the first operation icon and the second operation icon of at least one target interaction object in the operation window have not received a user operation, controlling the operation window and the floating icon to be continuously displayed on the second screen.
- The interaction method of claim 9, wherein the interaction method further comprises: when one of the first operation icon or the second operation icon of each of the target interaction objects in the operation window receives a user operation, controlling the second screen not to display the operation window and the floating icon.
- The interaction method of claim 1, wherein the interaction method further comprises: judging whether the flexible electronic device is in a tent mode or not; when the included angle between the first screen and the second screen is larger than a first preset angle and smaller than a second preset angle, the flexible electronic device is determined to be in the tent mode, and the operation of the first screen and the operation of the second screen do not influence each other in the tent mode.
- The interaction method of claim 1, wherein the target interaction object comprises at least one of a picture, a link, and an application.
- The interaction method according to claim 5, wherein when the operation window does not receive a user operation within a preset time, the operation window is controlled not to be displayed on the second screen; or, when the floating icon receives the touch operation applied by the user again, the operation window is controlled not to be displayed on the second screen.
- The interaction method according to claim 1, wherein it is determined that the second screen receives the shared target interaction object from the first screen when the target interaction object displayed on the first screen is dragged onto a floating window on the first screen, wherein the target interaction object is selected when the first screen receives a preset selection operation, and the floating window is created and displayed on the first screen when the first screen receives a specific operation.
- The interaction method of claim 14, wherein the interaction method further comprises: judging whether all the target interaction objects in the floating window are shared to the second screen or not; and when all the target interaction objects in the floating window are shared to the second screen, controlling the floating window not to be displayed on the first screen.
- The interactive method of claim 15, wherein when at least one of the target interactive objects in the floating window is not shared to the second screen, the floating window is controlled to be continuously displayed on the first screen.
- The interaction method of claim 14, wherein the interaction method further comprises: changing the position of the floating window in response to a specific gesture operation applied to the floating window; or, changing a position of the floating icon in response to a specific gesture operation applied to the floating icon.
- The interaction method according to claim 14 or claim 17, wherein said first screen defines a first display area and said second screen defines a second display area; the floating window and the floating icon are displayed on the boundary of the first display area and the second display area.
- A flexible electronic device comprises a flexible display screen, a memory and a processor; the flexible display screen is provided with a bending area, and when the flexible electronic device is in a folding state, the flexible display screen is divided into a first screen and a second screen by the bending area; characterized in that interaction program instructions are stored in the memory, and the processor executes the interaction program instructions to perform the interaction method of any one of claims 1-18.
- A readable storage medium having interaction program instructions stored thereon, the interaction program instructions being called by a processor to execute the interaction method according to any one of claims 1-18.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2019/082259 WO2020206652A1 (en) | 2019-04-11 | 2019-04-11 | Interaction method, flexible electronic device and readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113396379A true CN113396379A (en) | 2021-09-14 |
Family
ID=72751815
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201980079779.1A Pending CN113396379A (en) | 2019-04-11 | 2019-04-11 | Interaction method, flexible electronic device and readable storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113396379A (en) |
WO (1) | WO2020206652A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023155836A1 (en) * | 2022-02-18 | 2023-08-24 | 维沃移动通信有限公司 | Display method and apparatus, and electronic device and readable storage medium |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115113834A (en) * | 2021-03-22 | 2022-09-27 | Oppo广东移动通信有限公司 | Display method, electronic equipment and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104881208A (en) * | 2015-06-05 | 2015-09-02 | 广东欧珀移动通信有限公司 | To-be-processed message display control method and device |
CN107643912A (en) * | 2017-08-31 | 2018-01-30 | 维沃移动通信有限公司 | A kind of information sharing method and mobile terminal |
CN108803964A (en) * | 2018-06-08 | 2018-11-13 | Oppo广东移动通信有限公司 | Buoy display methods, device, terminal and storage medium |
CN108958580A (en) * | 2018-06-28 | 2018-12-07 | 维沃移动通信有限公司 | A kind of display control method and terminal device |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109525697A (en) * | 2017-09-19 | 2019-03-26 | 阿里巴巴集团控股有限公司 | Contact person shares and the method, apparatus and terminal of display |
2019
- 2019-04-11 WO PCT/CN2019/082259 patent/WO2020206652A1/en active Application Filing
- 2019-04-11 CN CN201980079779.1A patent/CN113396379A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2020206652A1 (en) | 2020-10-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11809702B2 (en) | Modeless augmentations to a virtual trackpad on a multiple screen computing device | |
US9250729B2 (en) | Method for manipulating a plurality of non-selected graphical user elements | |
US9317190B2 (en) | User terminal device for displaying contents and methods thereof | |
EP2815299B1 (en) | Thumbnail-image selection of applications | |
US8413075B2 (en) | Gesture movies | |
US9389777B2 (en) | Gestures for manipulating tables, charts, and graphs | |
US9645733B2 (en) | Mechanism for switching between document viewing windows | |
JP5684291B2 (en) | Combination of on and offscreen gestures | |
US9035887B1 (en) | Interactive user interface | |
US9639238B2 (en) | Modification of a characteristic of a user interface object | |
US20140237378A1 (en) | Systems and method for implementing multiple personas on mobile technology platforms | |
US20120084681A1 (en) | Application launch | |
JP2013520727A (en) | Off-screen gestures for creating on-screen input | |
US20210200391A1 (en) | Interacting Method for Sidebar Menu, Apparatus and Computer-readable Storage Medium | |
US11366579B2 (en) | Controlling window using touch-sensitive edge | |
WO2021067048A1 (en) | Transitions and optimizations for a foldable computing device operating in a productivity mode | |
CN113396379A (en) | Interaction method, flexible electronic device and readable storage medium | |
CN113330407A (en) | Interaction method, flexible electronic device and readable storage medium | |
KR102301903B1 (en) | Method and apparatus for providing contents associated with side bar using user terminal including a plurality of displays | |
CN112130741A (en) | Control method of mobile terminal and mobile terminal | |
WO2022166516A1 (en) | Screen splitting method and apparatus, electronic device, and storage medium | |
CN113348434A (en) | Interaction method, flexible electronic device and readable storage medium | |
CN112689818A (en) | Anti-disturbance method, electronic device and computer readable storage medium | |
CN113330415A (en) | Interactive method, display device and computer-readable storage medium | |
KR102138531B1 (en) | Mobile terminal and method for controlling thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | |
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20210914 |