WO2021057699A1 - Control method for electronic device having flexible screen, and electronic device - Google Patents

Control method for electronic device having flexible screen, and electronic device

Info

Publication number
WO2021057699A1
WO2021057699A1 (PCT/CN2020/116714)
Authority
WO
WIPO (PCT)
Prior art keywords
screen
electronic device
control
double
flexible
Prior art date
Application number
PCT/CN2020/116714
Other languages
English (en)
French (fr)
Inventor
梁树为 (Liang Shuwei)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2021057699A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text

Definitions

  • The embodiments of the present application relate to the field of terminal technology, and in particular to a method for controlling an electronic device with a flexible screen, and to such an electronic device.
  • Flexible screens, also called flexible OLEDs (organic light-emitting diodes), are lighter and thinner than traditional screens, and because they can bend, they are also far more durable. At present, some equipment manufacturers have applied flexible screens to electronic devices such as mobile phones and tablet computers. With the flexible screen in the folded or unfolded state, a user can trigger a screenshot of the screen (a single screen or the entire flexible screen) in the bright-screen state by using a function key combination (such as the volume key plus the power key) or a system-level button operation. However, key combinations usually require two-handed operation and precise press timing, while system-level buttons require more steps to trigger a screenshot, making the operation more complicated. The current screenshot methods of foldable mobile phones are therefore not convenient enough, and the efficiency of human-computer interaction is low. Similar problems exist for operations such as multi-selection of controls, icon movement, and control deletion.
  • The present application provides a method for controlling an electronic device with a flexible screen, and such an electronic device.
  • The electronic device can perform screenshots, multi-selection of controls, icon movement, and control deletion in response to a double-sided screen gesture operation, thereby improving the efficiency of human-computer interaction.
  • The present application provides a method for controlling an electronic device with a flexible screen.
  • The physical form of the flexible screen can include an unfolded state and a folded state.
  • The flexible screen is divided into a first screen and a second screen.
  • The control method includes: when the flexible screen is in the bright-screen state, the electronic device detects a double-sided screen gesture operation for triggering a screenshot. The double-sided screen gesture operation includes a first operation on the first screen and a second operation on the second screen.
  • The operation time difference between the first operation and the second operation is less than a preset time difference, and the touch position of the first operation corresponds to the touch position of the second operation. In response to the double-sided screen gesture operation, the electronic device takes a screenshot of the flexible screen in the bright-screen state.
  • For example, the electronic device may calculate the included angle between the first screen and the second screen from the data detected by the acceleration sensor and the gyroscope, and then determine from that angle whether the flexible screen is in the unfolded state or the folded state.
  • The electronic device can also determine the current physical form of the flexible screen in other ways.
  • For example, the electronic device can use a distance sensor to detect the distance between the main screen and the secondary screen, and then determine the current physical form of the flexible screen based on that distance. The embodiments of this application do not impose any restriction on this.
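As an illustration of the angle-based determination described above, the following sketch computes the included angle between the two screen halves from the gravity vectors reported by an accelerometer on each half. The threshold value and the two-accelerometer layout are assumptions for illustration only; the patent does not specify them.

```python
import math

# Hypothetical threshold: an included angle near 180 degrees counts as unfolded.
UNFOLDED_MIN_ANGLE = 150.0

def included_angle(accel_a, accel_b):
    """Included angle (degrees) between the two screen halves, from the
    gravity vectors measured by an accelerometer on each half."""
    dot = sum(x * y for x, y in zip(accel_a, accel_b))
    norm_a = math.sqrt(sum(x * x for x in accel_a))
    norm_b = math.sqrt(sum(x * x for x in accel_b))
    cos_theta = max(-1.0, min(1.0, dot / (norm_a * norm_b)))
    # When the device is laid flat, the two gravity vectors coincide
    # (0 degrees apart) and the screens form a 180-degree included angle.
    return 180.0 - math.degrees(math.acos(cos_theta))

def physical_form(accel_a, accel_b):
    """Classify the flexible screen as unfolded or folded from the angle."""
    return "unfolded" if included_angle(accel_a, accel_b) >= UNFOLDED_MIN_ANGLE else "folded"
```

For example, identical gravity readings on both halves indicate a flat (unfolded) device, while opposite readings indicate the device is folded shut.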
  • The operation time of an operation the user triggers on the screen refers to the initial moment when the user's finger or palm touches the screen.
  • The operation time difference between the first operation and the second operation therefore refers to the difference between the moment the user's finger or palm touches the first screen and the moment it touches the second screen.
  • For example, the preset time difference can be set to 500 ms or less.
  • The touch position of the first operation corresponds to the touch position of the second operation when the position difference between them is smaller than a preset position difference. Because the touch positions of the first operation and the second operation are on different screens, the mobile phone needs to convert their touch points into the same screen coordinate system, such as the first screen's coordinate system, the second screen's coordinate system, or a coordinate system different from both.
  • For example, the preset position difference can be set to 10 dp.
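The matching test described above can be sketched as follows. The 500 ms and 10 dp thresholds come from the text; the mirror-about-the-fold coordinate mapping is a simplifying assumption, since the real mapping depends on the device's geometry.

```python
import math

PRESET_TIME_DIFF_MS = 500  # from the text: 500 ms or less
PRESET_POS_DIFF_DP = 10    # from the text: 10 dp

def to_first_screen_coords(x, y, screen_width_dp):
    """Map a touch point on the second screen into the first screen's
    coordinate system, assuming the screens mirror each other about the
    fold (hypothetical geometry)."""
    return screen_width_dp - x, y

def is_double_sided_gesture(op1, op2, screen_width_dp):
    """op1/op2: dicts with touch-down time 't' (ms) and position 'x', 'y' (dp)."""
    if abs(op1["t"] - op2["t"]) >= PRESET_TIME_DIFF_MS:
        return False  # the two touches were not close enough in time
    x2, y2 = to_first_screen_coords(op2["x"], op2["y"], screen_width_dp)
    # the two touches must land (nearly) back to back
    return math.hypot(op1["x"] - x2, op1["y"] - y2) < PRESET_POS_DIFF_DP
```

For a 360 dp wide screen, a touch at (100, 200) on the front and (260, 200) on the back map to the same point and, if triggered within 500 ms, count as one double-sided gesture.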
  • The terms "first screen" and "second screen" are only used to distinguish display areas of the flexible screen of the electronic device, and do not indicate the importance or priority of a screen.
  • Based on the above method, the electronic device can take a screenshot of the corresponding screen according to the user's double-sided screen gesture operation on the flexible screen and the current display state of each screen (bright-screen state or black-screen state).
  • The user does not need to call up the notification bar or operating controls (such as virtual buttons) on the flexible screen and then click the corresponding screenshot control to capture the screen content.
  • The double-sided screen gesture operation is therefore more convenient and faster, and provides a better user experience.
  • In response to the double-sided screen gesture operation, the electronic device takes a screenshot of the flexible screen in the bright-screen state as follows: if the first screen is in the bright-screen state and the second screen is in the black-screen state, the electronic device takes a screenshot of the first screen; if the first screen is in the black-screen state and the second screen is in the bright-screen state, the electronic device takes a screenshot of the second screen; if the first screen and the second screen are both in the bright-screen state, the electronic device takes a screenshot of the entire flexible screen.
  • For example, the electronic device can obtain the display parameters of the first screen and the second screen through the display driver, so as to determine whether each screen is in the bright-screen state or the black-screen state before the user triggers the double-sided screen gesture operation.
  • In this way, the electronic device can take a screenshot of the single screen, or of the entire flexible screen, that is in the bright-screen state when the user triggers the double-sided screen gesture operation.
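The three cases above reduce to a simple decision on the two screens' display states; a minimal sketch:

```python
def screenshot_target(first_bright, second_bright):
    """Decide what to capture from the two screens' display states,
    following the three cases described above."""
    if first_bright and second_bright:
        return "entire flexible screen"
    if first_bright:
        return "first screen"
    if second_bright:
        return "second screen"
    return None  # both screens dark: no bright-screen content to capture
```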
  • After the screenshot is taken, the method further includes: displaying a screenshot preview interface.
  • The screenshot preview interface may include at least one of the following processing controls: a save control, an edit control, a share control, or a cancel control.
  • Displaying the screenshot preview interface includes: if the first screen is in the bright-screen state and the second screen is in the black-screen state, displaying the screenshot preview interface on the first screen; if the first screen is in the black-screen state and the second screen is in the bright-screen state, displaying the screenshot preview interface on the second screen; if the first screen and the second screen are both in the bright-screen state, displaying the screenshot preview interface on the user-facing screen, where the user-facing screen is the first screen or the second screen.
  • That is, after the electronic device completes the screenshot, it can display the screenshot preview interface on the screen in the bright-screen state.
  • If both screens are in the bright-screen state, the electronic device needs to determine the screen facing the user, and display the screenshot preview interface on that screen.
  • The electronic device determines the screen facing the user through detection data reported by at least one of an infrared sensor, a camera, a proximity light sensor, or the touch device of the flexible screen.
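As one of the sensor options named above, a proximity sensor on each side could drive a minimal facing-screen decision. The sensor model ("covered" flags) and the fallback choice are assumptions; the text allows several sensors and does not fix the logic.

```python
def user_facing_screen(first_covered, second_covered):
    """Pick the screen assumed to face the user. 'Covered' means a
    proximity sensor on that side reports an obstruction (palm, table)."""
    if first_covered and not second_covered:
        return "second screen"
    if second_covered and not first_covered:
        return "first screen"
    return "first screen"  # ambiguous readings: fall back to a default
```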
  • After the screenshot preview interface is displayed, the method further includes: the electronic device detects a third operation for triggering selection of a processing control on the screenshot preview interface; in response to the third operation, the electronic device performs, on the screenshot, the processing corresponding to the selected processing control.
  • In this way, once the screenshot preview interface is displayed, the user can click the corresponding processing controls on it to save, edit, share, or cancel the screenshot.
  • For example, the user can click the edit control on the screenshot preview interface to add text, stickers, or filters to the screenshot.
  • The user can also click the share control on the screenshot preview interface to send the screenshot to friends or family through an application.
  • The operation types of the first operation and the second operation are the same, and the operation type includes any one of the following: a press operation, a click operation, a double-click operation, or a long-press operation.
  • The press operation, also called a pressure-sensitive operation, should satisfy the following conditions: the duration of the user's touch operation on the screen is greater than or equal to a preset duration, the coordinate position of the touch point does not change, and the pressure value of the touch operation is greater than or equal to a preset pressure value.
  • The click operation should satisfy: the duration of the user's touch operation on the screen is less than the preset duration.
  • The double-click operation should satisfy: the user triggers two click operations on the screen, and the time interval between the two click operations is less than a preset time interval.
  • The long-press operation should satisfy: the duration of the user's touch operation on the screen is greater than or equal to the preset duration, and the pressure value of the touch operation is less than the preset pressure value.
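The conditions above can be sketched as a small classifier. The threshold values are hypothetical; the text only speaks of "preset" duration, pressure, and interval values.

```python
PRESET_DURATION_MS = 400        # hypothetical preset duration
PRESET_PRESSURE = 0.6           # hypothetical preset pressure (normalized)
PRESET_CLICK_INTERVAL_MS = 300  # hypothetical preset click interval

def classify_touch(duration_ms, pressure, moved):
    """Classify a single touch per the conditions above; 'moved' is True
    if the touch point's coordinates changed during the touch."""
    if duration_ms < PRESET_DURATION_MS:
        return "click"
    if not moved and pressure >= PRESET_PRESSURE:
        return "press"       # pressure-sensitive operation
    if pressure < PRESET_PRESSURE:
        return "long-press"
    return None              # long, high-pressure touch that moved

def is_double_click(t_first_click_ms, t_second_click_ms):
    """Two clicks count as a double-click if their interval is short enough."""
    return abs(t_second_click_ms - t_first_click_ms) < PRESET_CLICK_INTERVAL_MS
```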
  • The present application further provides a method for controlling an electronic device with a flexible screen.
  • The physical form of the flexible screen includes an unfolded state and a folded state.
  • The flexible screen is divided into a first screen and a second screen.
  • Specifically, the control method includes: when at least two selectable controls are displayed on the display interface of the first screen, the electronic device detects a double-sided screen gesture operation for triggering multi-selection. The double-sided screen gesture operation includes a first operation on the first screen and a second operation on the second screen, where the operation time difference between the first operation and the second operation is less than a preset time difference and the touch position of the first operation corresponds to the touch position of the second operation. In response to the double-sided screen gesture operation, the electronic device controls the display interface of the first screen to enter a multi-selection mode, in which at least two selectable controls on the display interface of the first screen can be selected.
  • The above selectable controls are controls that can be selected and processed on the display interface of the screen.
  • For example, a selectable control can be a picture control on the picture browsing interface in an album.
  • The user can select multiple picture controls at the same time to splice multiple pictures together (picture editing), send multiple pictures to friends in a batch (picture sharing), or add multiple pictures to a new album folder (picture organization).
  • The multi-selection mode is an operation mode in which the user can select at least two selectable controls on the display interface of the screen and process them in a batch.
  • Based on the above method, the electronic device can control the display interface of the corresponding screen to quickly enter the multi-selection mode according to the user's double-sided screen gesture operation on the flexible screen. In this way, the user does not need to enter the multi-selection mode by clicking a "select" control at a designated position (for example, the upper left corner of the screen).
  • The double-sided screen gesture operation is therefore more convenient and faster, and provides a better user experience.
  • When the flexible screen is in the folded state, at least two selectable controls are displayed on the display interface of the first screen, and the second screen is in the black-screen state, or the second screen is in the bright-screen state but no selectable controls are displayed on it.
  • In this case, when the electronic device detects a double-sided screen gesture operation for triggering multi-selection, the electronic device can, in response, control the display interface of the first screen to enter the multi-selection mode.
  • In response to the double-sided screen gesture operation, the electronic device controls the display interface of the first screen to enter the multi-selection mode as follows: if the first screen is the user-facing screen, the electronic device controls the display interface of the first screen to enter the multi-selection mode.
  • That is, the electronic device also needs to detect its current orientation, and then determine from that orientation whether to control the display interface of the first screen to enter the multi-selection mode. If the screen to be controlled faces away from the user, the electronic device does not trigger the corresponding control, thereby avoiding misoperation.
  • Alternatively, the electronic device controls the display interface of the user-facing screen to enter the multi-selection mode, where the user-facing screen is the first screen or the second screen.
  • This applies when the display interfaces of both screens of the electronic device have selectable controls.
  • In this case, the electronic device determines which screen's display interface enters the multi-selection mode according to its current orientation. If the user-facing screen is the first screen, the electronic device controls the display interface of the first screen to enter the multi-selection mode; if the user-facing screen is the second screen, the electronic device controls the display interface of the second screen to enter the multi-selection mode.
  • The electronic device controls the display interface of the first screen to enter the multi-selection mode as follows: if the touch position of the first operation on the first screen corresponds to a first selectable control, the first selectable control being one of the selectable controls displayed on the first screen, the electronic device controls the display interface of the first screen to enter the multi-selection mode and displays the first selectable control as selected.
  • In this way, the user can directly select a selectable control on the first screen while triggering the display interface of the first screen to enter the multi-selection mode.
  • After the display interface of the first screen enters the multi-selection mode, the method further includes: the electronic device detects a third operation for triggering selection of a selectable control on the display interface of the first screen, the operation type of the third operation being a click operation or a slide operation; in response to the third operation, the electronic device displays the one or more controls corresponding to the third operation as selected.
  • In this way, the user can add one or more selectable controls to complete the multi-selection operation.
  • The operation types of the first operation and the second operation are the same, and the operation type includes any one of the following: a press operation, a click operation, a double-click operation, or a long-press operation.
  • The present application further provides a method for controlling an electronic device with a flexible screen.
  • The physical form of the flexible screen includes an unfolded state and a folded state.
  • The flexible screen is divided into a first screen and a second screen.
  • Specifically, the control method includes: when the display interfaces of the first screen and the second screen are both interfaces that can accommodate icon controls, and icon controls are displayed on the display interface of the first screen, the electronic device detects a double-sided screen gesture operation for triggering icon movement. The double-sided screen gesture operation includes a first operation on the first screen and a second operation on the second screen, where the operation time difference between the first operation and the second operation is less than a preset time difference, the touch position of the first operation corresponds to the touch position of the second operation, and the first operation corresponds to a first icon control on the first screen. In response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to the second screen.
  • The aforementioned icon controls are icon controls that can be moved; the icon controls on a display interface can be arranged in the form of a nine-square grid, a sixteen-square grid, and the like.
  • An icon control may be an icon control of an application program on the main interface of the electronic device, or may be an icon control inside an application, which is not limited in this application.
  • Based on the above method, when both screens' display interfaces can accommodate icon controls, the electronic device can, according to the user's double-sided screen gesture operation on the flexible screen, move the corresponding icon control across screens. In this way, the user can move the icon control corresponding to the operation to the other screen without unfolding the electronic device.
  • The double-sided screen gesture operation is therefore more convenient and faster, and provides a better user experience.
  • When the electronic device detects the double-sided screen gesture operation for triggering icon movement, the electronic device moves the first icon control from the first screen to the second screen in response to that operation.
  • In response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to the second screen as follows: if the first screen is the user-facing screen, the electronic device moves the first icon control from the first screen to the second screen.
  • That is, the electronic device also needs to detect its current orientation, and then determine from that orientation whether to move the first icon control. If the electronic device determines that the screen on which the first icon control to be moved is located faces away from the user, the electronic device does not trigger the icon movement, thereby avoiding misoperation.
  • Alternatively, the following conditions hold: the display interfaces of the first screen and the second screen are both interfaces that can accommodate icon controls;
  • at least one icon control is displayed on the display interface of the first screen;
  • at least one icon control is displayed on the display interface of the second screen;
  • the first operation corresponds to the first icon control on the display interface of the first screen;
  • and the second operation corresponds to a second icon control on the display interface of the second screen.
  • That is, the display interfaces of both screens of the electronic device have icon controls, and the touch operations on the two screens each correspond to an icon control.
  • In this case, the electronic device can further determine, according to its current orientation, which screen's icon control should be moved. If the user-facing screen is the first screen, the electronic device moves the icon control on the first screen to the second screen; if the user-facing screen is the second screen, the electronic device moves the icon control on the second screen to the first screen.
  • Alternatively, the display interfaces of the first screen and the second screen are both interfaces that can accommodate icon controls;
  • at least one icon control is displayed on the display interface of the first screen;
  • and at least one icon control is displayed on the display interface of the second screen.
  • In response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to a position in front of the at least one icon control on the second screen, and the at least one icon control on the second screen moves backward in sequence; or, in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to a position behind the at least one icon control on the second screen.
  • This implementation limits the position to which the first icon control moves on the second screen. If there are multiple icon controls on the second screen, the first icon control can be moved to the front or to the back of those icon controls. For example, suppose the second screen originally holds two icon controls A and B arranged in sequence, with A in front of B. The electronic device can move an icon control C from the first screen to A's position, with A and B each moving one position backward; or C can move to a position behind B, with the positions of A and B on the second screen unchanged.
  • Alternatively, the electronic device moves the first icon control from the first screen to the second screen as follows: the electronic device moves the first icon control from the first screen to the touch position of the second operation on the second screen.
  • Moving the first icon control to the touch position of the second operation includes: the electronic device moves the first icon control from the first screen to the touch position of the second operation on the second screen and moves a third icon control at that position backward; or the electronic device moves the first icon control from the first screen to the touch position of the second operation on the second screen and merges the third icon control with the first icon control.
  • That is, if there is already an icon control at the touch position, the electronic device either moves all the icon controls at and behind the touch position backward in sequence, or directly merges the moved first icon control with the icon control at the touch position into the same folder.
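The two behaviors just described — shifting the following icons backward versus merging into a folder — can be sketched over plain lists standing in for the second screen's icon grid. The list representation (and a nested list as a folder) is illustrative only.

```python
def move_with_shift(second_screen, icon, index):
    """Insert the moved icon at the touch position; the icon controls at
    and after that position shift one place backward."""
    return second_screen[:index] + [icon] + second_screen[index:]

def move_with_merge(second_screen, icon, index):
    """Merge the moved icon with the icon at the touch position into a
    folder, represented here as a nested list."""
    merged = list(second_screen)
    merged[index] = [merged[index], icon]
    return merged
```

For example, moving icon C onto a second screen holding A and B either shifts A and B backward or folds C into a folder with the icon under the touch point.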
  • The operation types of the first operation and the second operation are the same, and the operation type includes any one of the following: a press operation, a click operation, a double-click operation, or a long-press operation.
  • The present application further provides a method for controlling an electronic device with a flexible screen.
  • The physical form of the flexible screen includes an unfolded state and a folded state.
  • The flexible screen is divided into a first screen and a second screen.
  • Specifically, the control method includes: when a selectable control is displayed on the display interface of the first screen, the electronic device detects a double-sided screen gesture operation for triggering deletion of the control. The double-sided screen gesture operation includes a first operation on the first screen and a second operation on the second screen, where the operation time difference between the first operation and the second operation is less than a preset time difference, the touch position of the first operation corresponds to the touch position of the second operation, the first operation corresponds to a first selectable control on the first screen, and the operation types of the first operation and the second operation are both sliding operations with the same sliding direction. In response to the double-sided screen gesture operation, the electronic device deletes the first selectable control on the first screen.
  • the above selectable controls are controls that can be deleted from the display interface of the screen; for example, a selectable control may be a conversation box control in a chat list interface, or a folder control in a file list interface, or a similar control on the mobile phone.
  • the aforementioned sliding direction may be horizontal, vertical, or any other direction at an angle to the horizontal or vertical direction, which is not limited in the present application.
  • the electronic device can delete the corresponding selectable control according to the user's double-sided screen gesture operation on the flexible screen.
  • the user can use a double-sided screen gesture to slide a selectable control to the edge of the screen or to a designated area on the screen, thereby quickly deleting the selectable control, while also avoiding accidental deletion caused by a single-screen slide.
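The recognition conditions above (operation time difference below a preset value, corresponding touch positions, same sliding direction) can be sketched as a simple predicate. The threshold values and all names here are assumptions for illustration, not values taken from the application.

```python
# Illustrative check for the double-sided slide-to-delete gesture: two slides,
# one per screen, must start within a preset time difference, at corresponding
# touch positions, and move in the same direction.

from dataclasses import dataclass

MAX_TIME_DIFF_MS = 300     # preset time difference (assumed value)
MAX_POSITION_DIST = 50.0   # how closely the two touch points must align (assumed)

@dataclass
class Slide:
    t_ms: float            # start time of the slide
    x: float               # touch position in a shared coordinate system
    y: float
    direction: str         # e.g. "left", "right", "up", "down"

def is_double_sided_delete(first: Slide, second: Slide) -> bool:
    time_ok = abs(first.t_ms - second.t_ms) < MAX_TIME_DIFF_MS
    # positions "correspond" when the front and back touch points overlap
    dist = ((first.x - second.x) ** 2 + (first.y - second.y) ** 2) ** 0.5
    pos_ok = dist < MAX_POSITION_DIST
    dir_ok = first.direction == second.direction
    return time_ok and pos_ok and dir_ok
```

Only when all three conditions hold would the device treat the pair of slides as one double-sided gesture and delete the control; a slide on a single screen never satisfies the predicate, which is what prevents accidental single-screen deletion.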
  • when the display interface of the first screen displays selectable controls, and the second screen is either in the black screen state or in the bright screen state without displaying selectable controls, the electronic device detects a double-sided screen gesture operation for triggering deletion of the control, and in response to the double-sided screen gesture operation, the electronic device deletes the first selectable control on the first screen.
  • the electronic device, in response to the double-sided screen gesture operation, deleting the first selectable control on the first screen includes: in response to the double-sided screen gesture operation, if the first screen is the screen facing the user, the electronic device deletes the first selectable control on the first screen.
  • the electronic device also needs to detect its current orientation and, according to that orientation, further determine whether to delete the first selectable control on the first screen. That is, if the electronic device determines that the screen on which the selectable control to be deleted is located faces away from the user, the electronic device does not trigger the deletion, thereby avoiding user misoperation.
  • when selectable controls are displayed on the display interface of the first screen and on the display interface of the second screen, if the first operation corresponds to the first selectable control on the first screen and the second operation corresponds to the second selectable control on the second screen, then in response to the double-sided screen gesture operation: if the first screen faces the user, the electronic device deletes the first selectable control on the first screen; if the second screen faces the user, the electronic device deletes the second selectable control on the second screen.
  • when the display interfaces of both screens of the electronic device contain selectable controls and the touch operations on the two screens each correspond to a selectable control, the electronic device can further determine, according to its current orientation, on which screen the selectable control should be deleted: if the user-facing screen is the first screen, the electronic device deletes the selectable control on the first screen; if the user-facing screen is the second screen, the electronic device deletes the selectable control on the second screen.
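The orientation-gated decision above can be sketched as follows. The function and parameter names are illustrative assumptions; how the device actually senses which screen faces the user is left abstract.

```python
# Minimal sketch: when both screens show selectable controls, only the control
# on the screen currently facing the user is deleted; if the targeted screen
# faces away from the user, nothing is deleted (avoiding misoperation).

def delete_on_facing_screen(facing, first_controls, second_controls,
                            first_ctrl, second_ctrl):
    """`facing` is "first" or "second", as reported by the device's sensors."""
    if facing == "first" and first_ctrl in first_controls:
        first_controls.remove(first_ctrl)
    elif facing == "second" and second_ctrl in second_controls:
        second_controls.remove(second_ctrl)
```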
  • the method further includes: if there are one or more third selectable controls below the first selectable control, the electronic device moves the one or more third selectable controls upward in sequence.
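The shift-up behavior above, sketched with a simple list model (an assumed representation, for illustration only): deleting the selected control makes every control below it move up by one position.

```python
# Model the column of controls as a list ordered top to bottom; removing one
# entry compacts the list, i.e. all controls below it shift up in order.

def delete_and_shift_up(controls, index):
    """Remove the control at `index`; controls after it move up in sequence."""
    deleted = controls.pop(index)
    return deleted  # remaining items in `controls` have shifted up by one
```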
  • the present application provides an electronic device, which includes: a flexible screen, one or more processors, one or more memories, and one or more sensors. The flexible screen includes a display and a touch device, and the physical form of the flexible screen includes an unfolded state and a folded state.
  • the flexible screen is divided into a primary screen and a secondary screen; the above-mentioned memory stores one or more application programs and one or more programs, where the one or more programs include instructions, and when the instructions are executed by the electronic device, the electronic device executes the control method described in any one of the above.
  • the present application provides a computer-readable storage medium storing instructions that, when run on an electronic device, cause the electronic device to execute the control method described in any one of the above.
  • the present application provides a computer program product containing instructions that, when the computer program product runs on an electronic device, cause the electronic device to execute the control method described in any one of the above.
  • the electronic device described in the fifth aspect, the computer-readable storage medium described in the sixth aspect, and the computer program product described in the seventh aspect are all used to execute the corresponding control methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding control methods provided above, which will not be repeated here.
  • FIG. 1 is a first structural diagram of an electronic device provided by an embodiment of the application
  • FIG. 2 is a second structural diagram of an electronic device provided by an embodiment of the application.
  • FIG. 3a is a schematic diagram 1 of the architecture of an operating system in an electronic device provided by an embodiment of the application;
  • FIG. 3b is a schematic diagram 2 of the architecture of an operating system in an electronic device provided by an embodiment of the application;
  • FIG. 4 is a schematic diagram of a holding state when a user uses a double-sided screen gesture operation according to an embodiment of the application
  • FIG. 5 is a third structural diagram of an electronic device provided by an embodiment of this application.
  • FIG. 6 is a first flowchart of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application
  • FIG. 7 is a schematic diagram 1 of a scene of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application;
  • FIG. 8 is a schematic diagram of a second scene of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application;
  • FIG. 9 is a second schematic flowchart of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application.
  • FIG. 10 is a schematic diagram of the third scene of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application;
  • FIG. 11 is a schematic diagram 4 of a scene of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application;
  • FIG. 12 is a third schematic flowchart of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application;
  • FIG. 13a is a schematic diagram 5 of a scene of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application;
  • FIG. 13b is a sixth scenario diagram of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application.
  • FIG. 13c is a schematic diagram 7 of a scene of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application;
  • FIG. 14 is a fourth schematic flowchart of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application;
  • FIG. 15a is a schematic diagram eight of a scene of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application;
  • 15b is a schematic diagram 9 of a scene of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application;
  • 16 is a schematic diagram ten of a scene of a method for controlling an electronic device with a flexible screen provided by an embodiment of the application;
  • FIG. 17 is a fourth structural diagram of an electronic device provided by an embodiment of this application.
  • the embodiments of the application provide a method for controlling an electronic device with a flexible screen, which can be applied to electronic devices with flexible screens such as mobile phones, tablet computers, notebook computers, ultra-mobile personal computers (UMPCs), handheld computers, netbooks, personal digital assistants (PDAs), wearable devices, and virtual reality devices; the embodiments of the present application do not impose any limitation on this.
  • FIG. 1 shows a schematic diagram of the structure of the mobile phone.
  • the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, RF module 150, communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone interface 170D, sensor module 180, buttons 190, motor 191, indicator 192, camera 193, flexible screen 301, and user identification module (subscriber identification module, SIM) card interface 195, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the mobile phone 100.
  • the mobile phone 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the mobile phone 100.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory can store instructions or data that the processor 110 has just used or uses cyclically; if the processor 110 needs the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, which includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may couple the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the mobile phone 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 may transmit audio signals to the communication module 160 through an I2S interface, so as to realize the function of answering a call through a Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 170 and the communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 may also transmit audio signals to the communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the communication module 160.
  • the processor 110 communicates with the Bluetooth module in the communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 may transmit audio signals to the communication module 160 through a UART interface, so as to realize the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the flexible screen 301 and the camera 193.
  • the MIPI interface includes a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and so on.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the mobile phone 100.
  • the processor 110 and the flexible screen 301 communicate through a DSI interface to realize the display function of the mobile phone 100.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the flexible screen 301, the communication module 160, the audio module 170, the sensor module 180, and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the mobile phone 100, and can also be used to transfer data between the mobile phone 100 and peripheral devices. It can also be used to connect headphones and play audio through the headphones. This interface can also be used to connect to other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present application is merely a schematic description, and does not constitute a structural limitation of the mobile phone 100.
  • the mobile phone 100 may also adopt different interface connection modes in the above-mentioned embodiments, or a combination of multiple interface connection modes.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the mobile phone 100. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the flexible screen 301, the camera 193, and the communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the mobile phone 100 can be implemented by the antenna 1, the antenna 2, the radio frequency module 150, the communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the mobile phone 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the radio frequency module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied on the mobile phone 100.
  • the radio frequency module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • the radio frequency module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the radio frequency module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation by the antenna 1.
  • at least part of the functional modules of the radio frequency module 150 may be provided in the processor 110.
  • at least part of the functional modules of the radio frequency module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After the low-frequency baseband signal is processed by the baseband processor, it is passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the flexible screen 301.
  • the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and be provided in the same device as the radio frequency module 150 or other functional modules.
  • the communication module 160 can provide applications on the mobile phone 100 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), and global navigation satellite systems ( Global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (infrared, IR) and other wireless communication solutions.
  • the communication module 160 may be one or more devices integrating at least one communication processing module.
  • the communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the communication module 160 may also receive the signal to be sent from the processor 110, perform frequency modulation, amplify, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the mobile phone 100 is coupled with the radio frequency module 150, and the antenna 2 is coupled with the communication module 160, so that the mobile phone 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), Beidou navigation satellite system (BDS), quasi-zenith satellite system (quasi -zenith satellite system, QZSS) and/or satellite-based augmentation systems (SBAS).
  • the mobile phone 100 implements a display function through a GPU, a flexible screen 301, and an application processor.
  • the GPU is an image processing microprocessor, which connects the flexible screen 301 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the flexible screen 301 may include a display and a touch device.
  • the display is used to output display content to the user, and the touch device is used to receive touch events input by the user on the flexible screen 301. It should be understood that the touch device involved herein may be integrated with the display in the flexible screen, with the touch device connected to the display; or the touch device may be provided independently and connected to the display of the flexible screen.
  • the physical form of the flexible screen 301 can be divided into a folded state, a bracket state, and an unfolded state. In other embodiments, the physical form of the flexible screen 301 can be divided into only the folded state and the unfolded state.
  • the flexible screen 301 can be displayed as a complete display area in the unfolded state, and the user can fold the screen along one or more folding lines in the flexible screen 301.
  • the position of the folding line may be preset, or may be arbitrarily selected by the user in the flexible screen 301.
  • the flexible screen 301 can be divided into two display areas along the AB fold line.
  • the two folded display areas can be displayed as two independent display areas.
  • the display area on the right side of the fold line AB can be referred to as the main screen 11 of the mobile phone 100
  • the display area on the left side of the fold line AB can be referred to as the secondary screen 12 of the mobile phone 100.
  • the display areas of the main screen 11 and the sub screen 12 may be the same or different.
  • the terms "main screen" and "secondary screen" here merely distinguish the display areas on the two sides and do not indicate the importance or primacy of either screen; the main screen and the secondary screen may also be called the first screen and the second screen, respectively.
  • the embodiment of the present invention does not limit this.
  • the mobile phone 100 can calculate the angle between the main screen and the secondary screen using data detected by one or more sensors (for example, a gyroscope and an acceleration sensor). It can be understood that the included angle α between the main screen 11 and the secondary screen 12 lies in the closed interval [0°, 180°]. When the included angle α between the main screen 11 and the secondary screen 12 is greater than the first threshold (for example, α is 170°), the mobile phone 100 can determine that the flexible screen 301 is in the unfolded state, as shown in (a) in FIG. 2.
  • when the included angle α between the main screen 11 and the secondary screen 12 is within a preset interval (for example, α is between 40° and 60°), the mobile phone 100 can determine that the flexible screen 301 is in the bracket state, as shown in (b) in FIG. 2. Or, when the included angle α between the main screen 11 and the secondary screen 12 is less than the second threshold (for example, α is 20°), the mobile phone 100 may determine that the flexible screen 301 is in the folded state, as shown in (c) in FIG. 2.
  • when the included angle α between the main screen 11 and the secondary screen 12 is greater than the third threshold (for example, 45° or 60°), the mobile phone 100 can determine that the flexible screen 301 is in the unfolded state; when the included angle α between the main screen 11 and the secondary screen 12 is less than the third threshold, the mobile phone 100 can determine that the flexible screen 301 is in the folded state.
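The angle-to-state classification described above can be sketched as a small function. The threshold values mirror the examples in the text; the label for angles outside all ranges and the function name are assumptions for illustration.

```python
# Classify the flexible screen's physical form from the included angle between
# the two screens (assumed to be computed from gyroscope/accelerometer data).

FIRST_THRESHOLD = 170.0        # above this: unfolded (example value from text)
SECOND_THRESHOLD = 20.0        # below this: folded
BRACKET_RANGE = (40.0, 60.0)   # within this: bracket (stand) state

def screen_state(angle_deg: float) -> str:
    if angle_deg > FIRST_THRESHOLD:
        return "unfolded"
    if angle_deg < SECOND_THRESHOLD:
        return "folded"
    if BRACKET_RANGE[0] <= angle_deg <= BRACKET_RANGE[1]:
        return "bracket"
    return "transitional"      # assumed label for angles between the ranges
```

A two-state variant (as in the alternative embodiment with a single third threshold) would simply compare the angle against one threshold such as 45° or 60°.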
  • the main screen and the secondary screen can be arranged facing each other, or the main screen and the secondary screen can face away from each other.
  • when the main screen and the secondary screen face away from each other, both screens are exposed to the external environment, and the user can choose either the main screen or the secondary screen for display.
  • the bent part of the screen (also called the side screen) can also be used as an independent display area.
  • the flexible screen 301 is divided into three independent display areas: the main screen, the secondary screen, and the side screen.
  • the mobile phone 100 can determine, according to the physical form and display state of the flexible screen 301 and the user's touch operations on the main screen 11 and the secondary screen 12, whether to trigger a screenshot, multi-selection of controls, icon movement, control deletion, or other responses.
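One way to picture this decision is a dispatch table keyed by the screen's physical form and the recognized double-sided gesture. The specific pairings below are hypothetical placeholders, not the mapping claimed in the application.

```python
# Hypothetical dispatch: the response to a double-sided gesture depends on the
# flexible screen's physical form and the gesture type. Action names are
# illustrative only.

ACTIONS = {
    ("folded", "double_press"): "screenshot",
    ("folded", "double_long_press"): "multi_select_controls",
    ("folded", "double_drag"): "move_icon",
    ("folded", "double_slide"): "delete_control",
}

def respond(physical_form: str, gesture: str) -> str:
    # unrecognized (form, gesture) pairs trigger no special response
    return ACTIONS.get((physical_form, gesture), "no_action")
```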
  • the above-mentioned sensor module 180 may include a gyroscope, an acceleration sensor, a pressure sensor, an air pressure sensor, a magnetic sensor (such as a Hall sensor), a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, a pyroelectric infrared sensor, an ambient light sensor, and the like.
  • the mobile phone 100 can implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a flexible screen 301, and an application processor.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the mobile phone 100 may include one or N cameras 193, and N is a positive integer greater than one.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the mobile phone 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the mobile phone 100 may support one or more video codecs. In this way, the mobile phone 100 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • NPU is a neural-network (NN) computing processor.
  • through the NPU, intelligent cognition applications of the mobile phone 100, such as image recognition, face recognition, voice recognition, and text understanding, can be realized.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone 100 by running instructions stored in the internal memory 121.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, at least one application program (such as a sound playback function, an image playback function, etc.) required by at least one function.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the mobile phone 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the mobile phone 100 answers a call or plays a voice message, the receiver 170B can be brought close to the human ear to receive the voice.
  • the microphone 170C, also called a "mic" or "mouthpiece", is used to convert sound signals into electrical signals.
  • the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C.
  • the mobile phone 100 may be provided with at least one microphone 170C. In other embodiments, the mobile phone 100 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In some other embodiments, the mobile phone 100 may also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • the earphone interface 170D may be a USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the button 190 includes a power-on button, a volume button, and so on.
  • the button 190 may be a mechanical button or a touch button.
  • the mobile phone 100 can receive key input, and generate key signal input related to user settings and function control of the mobile phone 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations that act on different applications can correspond to different vibration feedback effects.
  • touch operations acting on different areas of the flexible screen 301 can also correspond to different vibration feedback effects of the motor 191.
  • different application scenarios (for example, time reminders, receiving information, alarm clock, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be connected to and separated from the mobile phone 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195.
  • the mobile phone 100 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 195 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 may also be compatible with external memory cards.
  • the mobile phone 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the mobile phone 100 uses an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the mobile phone 100 and cannot be separated from the mobile phone 100.
  • the above-mentioned software system of the mobile phone 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture.
  • the embodiment of the present application takes an Android system with a layered architecture as an example to illustrate the software structure of the mobile phone 100 by way of example.
  • Fig. 3a is a block diagram of the software structure of the mobile phone 100 according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of application packages. As shown in Figure 3a, applications such as camera, gallery, calendar, call, map, navigation, Bluetooth, music, video, short message, etc. can be installed in the application layer.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include an input manager service (IMS).
  • the application framework layer can also include a display policy service, a power manager service (PMS), a display manager service (DMS), an activity manager, a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and so on.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: condition monitoring service, surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the status monitoring service is used to determine the specific orientation of the mobile phone and the physical status of the flexible screen according to the monitoring data reported by the kernel layer.
  • the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a sensor driver, a TP driver, a camera driver, an audio driver, etc., which are not limited in the embodiment of the present application.
  • the system library and kernel layer below the application framework layer can be referred to as the underlying system.
  • the underlying system includes the underlying display system for providing display services.
  • the underlying display system includes the display driver in the kernel layer.
  • the underlying system in this application also includes a status monitoring service for identifying changes in the physical form of the flexible screen.
  • the status monitoring service can be independently set in the underlying display system, or in the system library and/or kernel layer.
  • the condition monitoring service may call a sensor service to start sensors such as a gyroscope and an acceleration sensor for detection.
  • the condition monitoring service can calculate the angle between the current main screen and the secondary screen based on the detection data reported by each sensor. In this way, through the angle between the main screen and the secondary screen, the state monitoring service can determine whether the flexible screen is in the unfolded state, the folded state, the stand state, or another physical form.
  • the status monitoring service can report the determined physical form to the aforementioned input management service.
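As a concrete illustration of the angle-based decision above, the form classification could be sketched as follows. The class name, method name, and the 150°/60° thresholds are hypothetical assumptions for illustration; the application itself does not fix specific angle thresholds.

```java
// Sketch of the status monitoring service's fold-state decision.
// All names and threshold values below are illustrative assumptions.
public class FoldStateClassifier {
    public enum FoldState { UNFOLDED, STAND, FOLDED }

    // angleDegrees: angle between the main screen and the secondary screen,
    // computed from gyroscope/acceleration sensor data
    // (0 = fully folded, 180 = fully unfolded).
    public static FoldState classify(double angleDegrees) {
        if (angleDegrees >= 150.0) {        // nearly flat: unfolded state
            return FoldState.UNFOLDED;
        } else if (angleDegrees >= 60.0) {  // partially open: stand state
            return FoldState.STAND;
        } else {                            // nearly closed: folded state
            return FoldState.FOLDED;
        }
    }
}
```

The classified form would then be reported to the input management service, as described above.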
  • when the status monitoring service determines that the current mobile phone is in the folded state or the stand state, the status monitoring service can also activate sensors such as a camera, an infrared sensor, a proximity sensor, or a touch panel (TP) to recognize the specific orientation of the mobile phone. For example, the specific orientation of the mobile phone may include the main screen facing the user or the secondary screen facing the user.
  • the main screen, the secondary screen, or the entire flexible screen facing the user, as referred to herein, includes the case where the screen faces the user at an angle substantially parallel to the user's face, and also includes the case where the screen faces the user at a certain oblique angle.
  • the input management service can be used to obtain the physical form, display state, screen touch data, specific orientation of the mobile phone, etc. from the underlying system when a touch event on the screen is received.
  • the screen touch data includes the coordinate position of the touch point on the primary or secondary screen, touch time, and pressure value.
  • the input management service can determine whether to trigger custom double-sided screen gesture events, such as double-sided screen screenshot, control multi-selection, icon movement, and control deletion events, based on the physical form, display state, and screen touch data of the flexible screen.
  • the input management service can call the system interface to report the double-sided screen gesture event to the upper-layer application, so that the upper-layer application completes the corresponding operation.
  • the above-mentioned upper-layer application refers to a registered application preset by the user that can respond to gesture events of the double-sided screen.
  • the upper-level application may be an application that comes with the mobile phone system, such as an album application, an address book application, a browser application, etc., or any third-party application, which is not limited in this application. Users can set functions such as double-sided screen gestures to trigger screenshots, multiple selection of controls, icon movement, or deletion of controls for any of the above applications according to their own needs.
  • Fig. 3b is a schematic diagram of the data flow inside the Android operating system.
  • the gyroscope and acceleration sensor of the hardware layer can report the detected data to the sensor driver.
  • the sensor driver reports the data detected by the gyroscope and acceleration sensor to the status monitoring service through the sensor service.
  • the status monitoring service can determine the angle between the main screen and the secondary screen according to the data detected by the gyroscope and the acceleration sensor, and then determine the physical form of the flexible screen.
  • the touch device at the hardware layer can report the detected screen touch data to the status monitoring service through the TP driver; the camera at the hardware layer can report the detected data to the status monitoring service through the camera driver; and the infrared sensor at the hardware layer can report the detected data to the status monitoring service through the infrared driver.
  • the condition monitoring service can determine the specific orientation of the mobile phone based on the data reported by the touch device, camera or infrared sensor.
  • the status monitoring service can also obtain the display parameters of the flexible screen through the display driver, so as to determine the display status of the flexible screen.
  • the state monitoring service can report the determined physical form, display state, specific orientation of the mobile phone, and screen touch data to the input management service, and the input management service finally determines whether to trigger a custom double-sided screen gesture event.
  • the flexible screen of the mobile phone is in a folded state.
  • FIG. 4 shows the holding state when the user performs a double-sided screen gesture operation.
  • the user holds the mobile phone in a folded state with one hand (left or right hand), and the user can use the thumb and index finger of one hand to cooperate to achieve a double-sided screen gesture operation.
  • for example, the user uses the right thumb to trigger a first operation on the main screen 11 and uses the right index finger to trigger a second operation on the secondary screen 12. If the operation time difference between the first operation and the second operation is less than a preset time difference and the touch position of the first operation corresponds to the touch position of the second operation, the mobile phone can determine that the gesture operation is a double-sided screen gesture operation.
  • the operation time of the user's triggering operation on the screen refers to the initial moment when the user's finger or palm touches the screen.
  • the operation time difference between the first operation and the second operation mentioned above refers to the time difference between the initial moment when the user's finger or palm touches the main screen and the initial moment when the user's finger or palm touches the secondary screen.
  • the preset time difference can be set to 500 ms or less.
  • the correspondence between the touch position of the first operation and the touch position of the second operation can be understood as meaning that the position difference between the touch positions of the first operation and the second operation is smaller than a preset position difference. This can be judged in any of the following ways:
  • the mobile phone can determine the positional relationship between the coordinate origins of the primary screen coordinate system and the secondary screen coordinate system according to the relative positional relationship between the main screen and the secondary screen; according to the positional relationship of the coordinate origins, the coordinate position of the touch point of the second operation on the secondary screen is converted into the main screen coordinate system; the position difference between the touch point of the first operation on the main screen and the converted touch point of the second operation is then calculated, and it is judged whether the position difference is smaller than the preset position difference.
  • the preset position difference can be set to 10dp.
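Combining the two conditions above (operation time difference below the example preset of 500 ms, position difference below the example preset of 10 dp), the double-sided screen gesture decision might be sketched as follows. The class and method names are hypothetical, and both touch points are assumed to already be expressed in the main screen coordinate system (the secondary-screen point having been converted as described above), with dp treated as an already-resolved distance unit.

```java
// Sketch of the double-sided screen gesture check; names are hypothetical.
public class DoubleSidedGestureDetector {
    static final long MAX_TIME_DIFF_MS = 500;   // example preset time difference
    static final double MAX_POS_DIFF_DP = 10.0; // example preset position difference

    // mainTouchMs/secTouchMs: initial moments the finger touched each screen.
    // (mainX, mainY) and (secX, secY): touch points, both in the main screen
    // coordinate system.
    public static boolean isDoubleSidedGesture(long mainTouchMs, double mainX, double mainY,
                                               long secTouchMs, double secX, double secY) {
        long timeDiff = Math.abs(mainTouchMs - secTouchMs);
        double dx = mainX - secX;
        double dy = mainY - secY;
        double posDiff = Math.hypot(dx, dy);    // Euclidean position difference
        return timeDiff < MAX_TIME_DIFF_MS && posDiff < MAX_POS_DIFF_DP;
    }
}
```

A check of the operation types being identical (described further below for the screenshot case) would be layered on top of this in the same way.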
  • Figure 5 shows the flexible screen in a folded state.
  • the mobile phone detects the user's first operation on the main screen 11 and a second operation on the secondary screen 12.
  • the touch point of the first operation on the main screen 11 is touch Point 1
  • the touch point for the second operation on the secondary screen 12 is touch point 2.
  • the mobile phone can convert touch point 2 on the secondary screen 12 onto the primary screen 11 according to the relative positional relationship between the primary screen 11 and the secondary screen 12; touch point 2 corresponds to touch point 2'.
  • the mobile phone can determine the position difference r according to the positions of touch point 1 and touch point 2' on the main screen 11, and then determine whether the position difference r is less than the preset position difference. If the position difference r is less than the preset position difference, the mobile phone can determine that the touch positions of the first operation and the second operation correspond to each other.
  • the mobile phone can determine the position relationship between the coordinate origin of the primary screen coordinate system and the secondary screen coordinate system according to the relative position relationship between the main screen and the secondary screen. According to the position relationship of the coordinate origin, the first operation on the main screen The coordinate position of the touch point is converted to the coordinate system of the secondary screen, and the position difference between the touch point converted from the first operation to the secondary screen and the touch point of the second operation on the secondary screen is calculated to determine whether the position difference is Less than the preset position difference.
  • the mobile phone can also convert the coordinate position of the touch point of the first operation on the main screen and the coordinate position of the touch point of the second operation on the secondary screen into a designated coordinate system that is different from both the primary screen coordinate system and the secondary screen coordinate system. The mobile phone then calculates the position difference between the coordinate position of the first operation converted into the designated coordinate system and the coordinate position of the second operation converted into the designated coordinate system, and judges whether the position difference is less than the preset position difference.
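One way the secondary-screen touch point could be converted into the main screen coordinate system is sketched below. It assumes the two screens are equal-width halves of one panel folded about a vertical hinge, so the secondary screen's x-axis is mirrored relative to the main screen's; the actual mapping depends on the device geometry, and the class and method names are hypothetical.

```java
// Hypothetical sketch of the touch-point coordinate conversion.
public class TouchPointMapper {
    // Convert a touch point (x, y) on the secondary screen into the main
    // screen coordinate system. Assumption: folding mirrors the x-axis
    // about the screen width, while y is shared along the hinge axis.
    public static double[] secondaryToMain(double x, double y, double screenWidth) {
        return new double[] { screenWidth - x, y };
    }
}
```

With this mapping, a thumb and index finger pinching the two faces of the folded phone at the same physical spot yield nearly identical converted coordinates, which is what the position-difference check above relies on.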
  • a mobile phone is taken as an example of an electronic device, and a method for controlling a screenshot operation of a double-sided screen provided by an embodiment of the present application will be described in detail with reference to the drawings.
  • when the flexible screen of the mobile phone is in the folded state and the flexible screen is in the on-screen state, the mobile phone detects a double-sided screen gesture operation for triggering a screenshot.
  • the double-sided screen gesture operation includes a first operation on the main screen and a second operation on the secondary screen.
  • the mobile phone takes a screenshot of the flexible screen in the on-screen state.
  • the above-mentioned flexible screen in the bright screen state includes the following three situations: the main screen is in the bright screen state and the secondary screen is in the black screen state; the main screen is in the black screen state and the secondary screen is in the bright screen state; both the main screen and the secondary screen are in the bright screen state.
  • after detecting the first operation on the main screen and the second operation on the secondary screen, the mobile phone determines whether the touch operations on the main screen and the secondary screen meet at least the following two conditions:
  • the operation time difference between the first operation and the second operation is less than the preset time difference
  • the touch positions of the first operation and the second operation correspond to each other.
  • if the conditions are met, the mobile phone triggers a screenshot of the corresponding screen.
  • the touch operation on the main screen and the secondary screen should also satisfy that the operation types of the first operation and the second operation are the same, as shown in FIG. 6.
  • the mobile phone needs to determine that the above three conditions are met before triggering the screenshot, otherwise it will not trigger the screenshot.
  • the above three conditions can be judged simultaneously or sequentially; when they are judged sequentially, the execution order is not limited to the above order, and this embodiment does not limit the execution order of the above three conditions.
  • the operation type includes any one of the following: pressing operation, clicking operation, double-clicking operation, and long-pressing operation.
  • the above-mentioned pressing operation, also called a pressure-sensitive operation, should meet the following conditions: the operation duration of the user's touch operation on the screen is greater than or equal to a preset duration, the coordinate position of the touch point of the touch operation does not change, and the pressing value of the touch operation is greater than or equal to a preset pressure value.
  • the above-mentioned click operation should satisfy: the operation duration of the user's touch operation on the screen is less than the preset duration.
  • the above-mentioned double-click operation should satisfy: the user triggers two click operations on the screen, and the operation time interval of the two click operations is less than the preset time interval.
  • the aforementioned long press operation should satisfy the following conditions: the operation duration of the user's touch operation on the screen is greater than or equal to the preset duration, and the pressing value of the touch operation is less than the preset pressure value.
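The four operation types defined above could be distinguished, for example, as follows. The preset duration, preset pressure value, double-click interval, and all names are illustrative assumptions, since the application leaves the concrete presets to the implementation.

```java
// Hypothetical sketch of classifying a touch operation by the rules above.
public class OperationClassifier {
    public enum OpType { PRESS, CLICK, DOUBLE_CLICK, LONG_PRESS, UNKNOWN }

    static final long PRESET_DURATION_MS = 400;        // assumed preset duration
    static final double PRESET_PRESSURE = 0.6;         // assumed preset pressure value
    static final long DOUBLE_CLICK_INTERVAL_MS = 300;  // assumed preset click interval

    // msSincePrevClick: time since the previous click, or a negative value
    // if there was no recent click.
    public static OpType classify(long durationMs, double pressure,
                                  boolean positionChanged, long msSincePrevClick) {
        if (durationMs < PRESET_DURATION_MS) {
            // Short touch: a click; two clicks within the interval form a double click.
            return (msSincePrevClick >= 0 && msSincePrevClick < DOUBLE_CLICK_INTERVAL_MS)
                    ? OpType.DOUBLE_CLICK : OpType.CLICK;
        }
        if (!positionChanged && pressure >= PRESET_PRESSURE) {
            return OpType.PRESS;       // long, hard, stationary: pressing operation
        }
        if (pressure < PRESET_PRESSURE) {
            return OpType.LONG_PRESS;  // long but light: long-press operation
        }
        return OpType.UNKNOWN;         // e.g. a hard touch that moved
    }
}
```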
  • when the mobile phone determines to trigger a screenshot, it may further determine the on-screen state of the flexible screen before the user triggered the double-sided screen gesture operation, and take a screenshot of the one screen or the entire screen that is in the on-screen state.
  • when the main screen is in the bright screen state and the secondary screen is in the black screen state, the mobile phone can take a screenshot of the main screen in response to the double-sided screen gesture operation; when the main screen is in the black screen state and the secondary screen is in the bright screen state, the mobile phone can take a screenshot of the secondary screen in response to the double-sided screen gesture operation; when both the main screen and the secondary screen are in the bright screen state, the mobile phone can take a screenshot of the entire flexible screen in response to the double-sided screen gesture operation.
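The screen-state-to-screenshot-target mapping described above can be sketched as follows (the class and enum names are hypothetical):

```java
// Sketch of choosing which part of the flexible screen to capture,
// based on which screens were bright before the gesture. Names are hypothetical.
public class ScreenshotTargetSelector {
    public enum Target { MAIN_SCREEN, SECONDARY_SCREEN, WHOLE_SCREEN, NONE }

    public static Target select(boolean mainBright, boolean secondaryBright) {
        if (mainBright && secondaryBright) return Target.WHOLE_SCREEN;
        if (mainBright) return Target.MAIN_SCREEN;
        if (secondaryBright) return Target.SECONDARY_SCREEN;
        return Target.NONE; // both screens black: nothing to capture
    }
}
```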
  • the user does not need to call up the notification bar or operation controls (such as virtual buttons) on the phone screen and then tap the corresponding screenshot control to capture the screen content; the user can directly use the above-mentioned double-sided screen gesture operation to take a screenshot of the flexible screen.
  • the mobile phone can intelligently recognize the current physical form of the flexible screen (folded or unfolded), display status (bright screen or black screen), and double-sided screen gesture operations, and take corresponding screenshots to provide users with a better experience.
  • the flexible screen of the mobile phone is in the folded state, the main screen is in the bright screen state, and the secondary screen is in the black screen state.
  • the user can take a screenshot of the main screen in the bright screen state through a double-sided screen gesture operation.
  • the mobile phone includes a flexible screen
  • the flexible screen may be divided into a main screen 11 and a sub screen 12, and the flexible screen is currently in a folded state.
  • the main screen 11 faces the user and is in a bright screen state
  • the main screen 11 currently displays the main interface of the mobile phone
  • six application icons are displayed on the main interface
  • the secondary screen 12 faces the user and is in a black screen state.
  • after the condition monitoring service receives the touch event, it can call the sensor service to start the sensor corresponding to the main screen and the sensor corresponding to the secondary screen for detection.
  • the condition monitoring service can determine the current physical form of the flexible screen as the folded state based on the detection data.
  • after the status monitoring service receives the touch event, it can also obtain the display parameters of the main screen and the secondary screen through the display driver to determine the display state of the flexible screen before the touch event occurred. For example, as shown in Figure 7, the main screen 11 is in the bright screen state and the secondary screen 12 is in the black screen state.
  • the status monitoring service can obtain the touch data corresponding to the touch event through the touch device. The touch data includes the specific positions of the touch points on the main screen and the secondary screen, the operation time (the starting moment of the operation), the operation duration, the pressing force, and so on.
  • the status monitoring service can report to the input management service in the application framework layer that the physical form of the current flexible screen is the folded state and that, before the touch event, the main screen was in the bright screen state and the secondary screen was in the black screen state, together with the touch data on the main screen and the secondary screen.
  • the input management service can determine whether to take a corresponding screenshot according to the physical form, display state, and touch data of the flexible screen. For example, when the flexible screen is in the folded state, the main screen is in the bright screen state, the secondary screen is in the black screen state, the operation time difference between the first operation on the main screen and the second operation on the secondary screen is less than the preset time difference, the touch positions of the first operation and the second operation correspond to each other, and the operation type of the first operation and the second operation is a pressing operation, the input management service can determine that a screenshot of the main screen is currently needed. Subsequently, the input management service sends a screenshot event for taking a screenshot of the main screen to the upper-layer application.
  • the upper-layer application takes a screenshot of the main interface currently displayed on the main screen. Subsequently, the main screen 11 displays a screenshot preview interface. The screenshot preview interface includes processing controls for operating on the screenshot, such as the "Save", "Edit", and "Cancel" controls shown in Figure 7, so that the user can save, edit, or delete the screenshot of the main screen on the main screen 11.
  • the above processing control is only an example, and may also include other processing controls, such as a sharing control, etc., which is not limited in the embodiment of the present application.
  • the mobile phone can automatically take screenshots of the main screen, which is convenient for users to perform quick screenshot operations on the foldable flexible screen, and enhance the user experience when using a folding mobile phone.
  • the flexible screen of the mobile phone is in the folded state, the main screen is in the black screen state, and the secondary screen is in the bright screen state.
  • the user can take a screenshot of the secondary screen in the bright screen state through gesture operations on the double-sided screen.
  • the implementation principle and technical effect are similar to the first scenario of this embodiment, and will not be repeated here.
  • the flexible screen of the mobile phone is in the folded state, and the main screen and the secondary screen are both in the bright screen state.
  • the user can take a screenshot of the entire flexible screen in the bright screen state through gesture operations on the double-sided screen.
  • the flexible screen is currently in a folded state
  • the main screen 11 of the flexible screen faces the user and is in a bright screen state
  • the main screen 11 currently displays a mobile phone message list interface
  • the secondary screen 12 faces the user and is in a bright screen state.
  • the secondary screen 12 currently displays the chat interface with the contact Bob in the short message list of the mobile phone.
  • the input management service can determine that a screenshot of the entire flexible screen is currently needed. Subsequently, the input management service sends a screenshot event for taking a screenshot of the entire screen to the upper-layer application.
  • the upper-layer application takes a screenshot of the entire flexible screen display interface (the display interfaces of the main screen 11 and the secondary screen 12). Similar to the first scenario, the main screen 11 displays a screenshot preview interface that includes processing controls for operating on the screenshot, and the user can save, edit, or delete the screenshot of the entire flexible screen on the main screen 11.
  • the mobile phone can display the screenshot preview interface on the screen in the bright screen state (main screen or secondary screen).
  • the screenshot preview interface may include at least one of the following processing controls: save control, edit control, share control, or cancel control.
  • processing corresponding to the selected processing control can then be performed on the screenshot, such as a save, edit, share, or cancel operation.
  • the mobile phone can determine whether to trigger a screenshot according to the physical form, display state, and touch data of the current flexible screen, together with the specific orientation of the mobile phone. Specifically, when the mobile phone determines that the flexible screen is in the folded state, the main screen and the secondary screen are both in the bright screen state, and the first operation on the main screen and the second operation on the secondary screen constitute a double-sided screen gesture operation, the mobile phone can further take a screenshot of the corresponding screen according to the current orientation of the mobile phone. For example, if the current user-facing screen is the main screen, then even if the entire flexible screen is in the bright screen state, the mobile phone only takes a screenshot of the user-facing main screen.
  • the mobile phone can also call a camera, an infrared sensor, a proximity light sensor, a touch device, or other sensors to identify the specific orientation of the mobile phone. For example, cameras can be installed on both the main screen and the secondary screen of the mobile phone. If the camera on the main screen captures facial information but the camera on the secondary screen does not, the status monitoring service can determine that the flexible screen in the current folded state is in a state where the main screen faces the user. For another example, infrared sensors can be installed on the main screen and the secondary screen of the mobile phone; if the infrared sensor on the secondary screen detects the user while the infrared sensor on the main screen does not, the status monitoring service can determine that the flexible screen in the current folded state is in a state where the secondary screen faces the user.
  • the mobile phone may also use a preset grasping algorithm to determine the user's current grasping posture of the mobile phone according to the user's touch position on the main screen and/or the secondary screen reported by the touch device. Then, combined with the gripping posture of the mobile phone, the state monitoring service can also determine the specific orientation of the mobile phone.
  • the touch device may report the coordinates of the touch point to the state monitoring service.
  • the state monitoring service determines the gripping posture of the mobile phone by counting the positions and number of touch points reported by the touch device. For example, if the number of touch points falling on the main screen is detected to be greater than a preset value, it indicates that the user's fingers and palm are resting on the main screen, so the screen facing the user at this time is the secondary screen. Correspondingly, if the number of touch points falling on the secondary screen is detected to be greater than the preset value, it indicates that the user's fingers and palm are resting on the secondary screen, and the screen facing the user at this time is the main screen.
  • the mobile phone can simultaneously use one or more of the above-mentioned sensors to identify the specific orientation of the mobile phone.
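As a hedged sketch of how such orientation signals might be combined (the function name, the priority order of camera evidence over the grip heuristic, and the touch-count threshold are illustrative assumptions, not part of this disclosure):

```python
# Sketch of combining the sensor signals described above to decide which
# screen of the folded phone faces the user. The inputs are assumed to be
# provided by the status monitoring service; threshold and priority order
# are illustrative.

def facing_screen(main_cam_sees_face: bool,
                  sub_cam_sees_face: bool,
                  main_touch_points: int,
                  sub_touch_points: int,
                  grip_threshold: int = 3) -> str:
    # Camera evidence is treated as the strongest signal: the screen whose
    # camera captures facial information faces the user.
    if main_cam_sees_face and not sub_cam_sees_face:
        return "main"
    if sub_cam_sees_face and not main_cam_sees_face:
        return "secondary"
    # Grip heuristic: many touch points on one screen mean fingers and
    # palm rest there, so the *other* screen faces the user.
    if main_touch_points > grip_threshold >= sub_touch_points:
        return "secondary"
    if sub_touch_points > grip_threshold >= main_touch_points:
        return "main"
    return "unknown"
```

In a real device the signals would be fused with more care (and the proximity light sensor could break ties); this sketch only mirrors the two examples given in the text.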
  • the mobile phone may also trigger a screenshot of the corresponding screen in response to any of the following operations: the user double-clicks the power button, the user double-clicks a virtual button on the screen, the user simultaneously presses the power button and a volume button, and so on.
  • the mobile phone takes a screenshot of the display interface of the screen (the main screen, the secondary screen, or the entire flexible screen) in the bright screen state.
  • The physical state of the flexible screen, that is, the physical form of the flexible screen, can be an unfolded state, a folded state, or a bracket (stand) state.
  • Table 1 shows how the mobile phone determines whether to trigger a screenshot based on the physical form of the flexible screen, the display state, the touch data on the main screen and the secondary screen, and the specific orientation of the mobile phone.
  • Not triggering a screenshot in Table 1 can be understood as meaning that the mobile phone does not respond (that is, the mobile phone does not perform any processing), or that the mobile phone performs processing according to a single-screen gesture operation.
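The trigger decision described above can be sketched as follows; the function name, parameter names, the 500 ms threshold, and the string return values are illustrative assumptions, not part of the disclosure:

```python
# Illustrative sketch of the screenshot trigger decision: a double-sided
# gesture requires the folded form, corresponding touch positions, and an
# operation time difference below a preset threshold; the display state
# and the user-facing screen then decide which screen is captured.

def screenshot_target(folded, main_bright, sub_bright,
                      time_diff_ms, positions_correspond,
                      facing, preset_diff_ms=500):
    is_gesture = (folded and positions_correspond
                  and time_diff_ms < preset_diff_ms)
    if not is_gesture:
        return None                 # no screenshot triggered
    if main_bright and sub_bright:
        # Both screens bright: capture only the user-facing screen
        # (or the whole flexible screen, in the variant described first).
        return facing
    if main_bright:
        return "main"
    if sub_bright:
        return "secondary"
    return None                     # both screens black: no response
```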
  • When it is determined that the flexible screen of the mobile phone is in the folded state, the main screen is bright, the secondary screen is black, the touch positions of the first operation on the main screen and the second operation on the secondary screen do not correspond, and both are pressing operations, then if the operation time difference between the first operation and the second operation is greater than or equal to the preset time difference, the mobile phone can perform corresponding event processing separately according to the user's operations on the main screen and the secondary screen.
  • For example, the user presses an application icon on the main interface of the main screen, and the mobile phone can display the shortcut functions of that application icon in response to the pressing operation on the main screen; the user presses the secondary screen in the black screen state, and the mobile phone can wake up the secondary screen in response to the pressing operation on the secondary screen.
  • the display interface of the screen may be a lock screen interface, a specific interface in an application (such as WeChat, photo album, etc.), the main interface, and so on; the embodiments of this application do not impose any restrictions on this.
  • a mobile phone is taken as an example of an electronic device, and a method for controlling multiple selection operations of a double-sided screen provided by an embodiment of the present application will be described in detail with reference to the accompanying drawings.
  • When the flexible screen of the mobile phone is in the folded state and the display interface of the main screen of the flexible screen displays at least two selectable controls, the mobile phone detects a double-sided screen gesture operation for triggering multiple selection.
  • the gesture operation includes a first operation on the main screen and a second operation on the secondary screen.
  • the operation time difference between the first operation and the second operation is less than the preset time difference, and the touch position of the first operation corresponds to the touch position of the second operation;
  • In response to the double-sided screen gesture operation, the mobile phone controls the display interface of the main screen to enter the multi-select mode.
  • In the multi-select mode, the at least two selectable controls on the display interface of the main screen can all be selected.
  • the above-mentioned optional controls refer to controls that can be selected and processed on the display interface of the screen.
  • the optional control can be a picture control on the picture browsing interface in the photo album. The user can select multiple picture controls at the same time to splice multiple pictures (i.e., picture editing), send multiple pictures to friends in batches (i.e., picture sharing), or add multiple pictures to a new album folder (i.e., picture organization).
  • the above-mentioned multi-selection mode refers to an operation mode in which the user can select at least two optional controls on the display interface of the screen and perform batch processing.
  • after the mobile phone detects the first operation on the main screen and the second operation on the secondary screen, it can judge, according to the touch positions and operation times of the first operation and the second operation on the screens, whether the touch operations on the main screen and the secondary screen meet at least the following two conditions:
  • the operation time difference between the first operation and the second operation is less than the preset time difference
  • the touch positions of the first operation and the second operation correspond to each other.
  • If both of the above conditions are met, the mobile phone controls the display interface of the main screen to enter the multi-select mode.
  • the touch operation on the main screen and the secondary screen should also satisfy that the operation types of the first operation and the second operation are the same, as shown in FIG. 9.
  • the mobile phone needs to determine that the above three conditions are all met before triggering the display interface of the main screen to enter the multi-selection mode, otherwise the display interface of the main screen does not enter the multi-selection mode.
  • The above three conditions can be judged simultaneously or sequentially; when they are judged sequentially, the order is not limited to the one described above. This embodiment does not impose any limitation on the order in which the above three conditions are judged.
  • the operation type includes any one of the following: pressing operation, clicking operation, double-clicking operation, and long-pressing operation.
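The condition check described above can be sketched as follows; the dict-based operation records, the millisecond threshold, and the distance tolerance used for "corresponding" touch positions are illustrative assumptions only:

```python
# Sketch of the double-sided multi-select trigger check: operation time
# difference below a preset threshold, corresponding touch positions on
# the two screens, and (the optional third condition shown in FIG. 9)
# identical operation types.

def triggers_multiselect(op1, op2, preset_diff_ms=500, pos_tol=50):
    """op1/op2: dicts with 'time_ms', 'pos' (x, y), and 'type' keys."""
    time_ok = abs(op1["time_ms"] - op2["time_ms"]) < preset_diff_ms
    # "Corresponding" positions: the touch points on the two screens
    # roughly overlap when the folded screens are back to back.
    dx = op1["pos"][0] - op2["pos"][0]
    dy = op1["pos"][1] - op2["pos"][1]
    pos_ok = (dx * dx + dy * dy) ** 0.5 <= pos_tol
    type_ok = op1["type"] == op2["type"]   # optional third condition
    return time_ok and pos_ok and type_ok
```

The three checks are independent, which matches the text's note that they can be evaluated simultaneously or in any sequential order.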
  • when the mobile phone determines to trigger the display interface of the main screen to enter the multi-select mode, it may further determine whether the touch position of the user's first operation on the main screen corresponds to a first selectable control. As shown in the accompanying figure, the touch position of the first operation corresponds to the first selectable control.
  • the first selectable control is one of the selectable controls displayed on the main screen.
  • In this case, the mobile phone can control the display interface of the main screen to enter the multi-select mode and display the first selectable control as selected. In this way, the user can directly select an optional control on the main screen while triggering the display interface of the main screen to enter the multi-select mode through the above method.
  • the display interface of the main screen of the flexible screen displays at least two optional controls, including the following situations:
  • In the first case, the display interface of the main screen of the flexible screen displays at least two optional controls (i.e., the main screen is in the bright screen state), and the secondary screen of the flexible screen is in the black screen state.
  • the display interface of the main screen of the flexible screen displays at least two optional controls, and the secondary screen of the flexible screen is in the on-screen state, but the display interface of the secondary screen has no optional controls.
  • at least two optional controls are displayed on the display interfaces of the main screen and the secondary screen of the flexible screen.
  • In the first two cases, when the folded mobile phone detects a double-sided screen gesture operation for triggering multiple selection, the mobile phone can control the display interface of the main screen to enter the multi-select mode.
  • In the third case, when the folded mobile phone detects a double-sided screen gesture operation for triggering multiple selection, the mobile phone can control the display interface of the user-facing screen to enter the multi-select mode; the user-facing screen may be either the main screen or the secondary screen.
  • Specifically, the mobile phone can determine which screen's display interface enters the multi-select mode according to the current orientation of the mobile phone: if the user-facing screen is the main screen, the mobile phone controls the display interface of the main screen to enter the multi-select mode; if the user-facing screen is the secondary screen, the mobile phone controls the display interface of the secondary screen to enter the multi-select mode.
  • In one implementation, the mobile phone, in response to the double-sided screen gesture operation, controls the display interface of the main screen to enter the multi-select mode as follows: in response to the double-sided screen gesture operation, if the main screen is the user-facing screen, the mobile phone controls the display interface of the main screen to enter the multi-select mode.
  • That is to say, the mobile phone also needs to detect its current orientation and further determine, according to that orientation, whether to control the display interface of the main screen to enter the multi-select mode. In other words, if the screen to be controlled faces away from the user, the mobile phone does not trigger the corresponding control, thereby avoiding accidental operations.
  • If the mobile phone is in the folded state, at least two optional controls are displayed on the display interface of the main screen, and the secondary screen is in the black screen state (or in the bright screen state but with no optional controls on its display interface), then when the mobile phone detects the double-sided screen gesture operation, it needs to confirm the current orientation of the phone. If the user-facing screen is not the main screen that displays the optional controls, the phone does not trigger the display interface of the main screen to enter the multi-select mode.
  • In one implementation, after the mobile phone, in response to the double-sided screen gesture operation, controls the display interface of the main screen to enter the multi-select mode, the method further includes: the mobile phone detects a third operation for triggering selection of selectable controls on the display interface of the main screen, where the operation type of the third operation is a click operation or a sliding operation; in response to the third operation, the mobile phone displays the one or more controls corresponding to the third operation as selected.
  • the user can add one or more optional controls through this implementation method to complete multiple selection operations.
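A minimal sketch of this third-operation behavior, assuming the hit-tested controls are given as indices (the data shapes and names are illustrative assumptions, not the disclosed implementation):

```python
# Sketch of handling the third operation once the display interface is
# already in multi-select mode: a click adds the control under the touch
# point, a slide adds every control swept over; other operation types
# leave the selection unchanged.

def apply_third_operation(selected, op_type, hit_indices):
    """selected: set of control indices; hit_indices: controls touched."""
    if op_type not in ("click", "slide"):
        return selected          # other operations do not change selection
    return selected | set(hit_indices)
```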
  • When the user needs to perform batch processing on at least two optional controls on the display interface of the folded flexible screen, the mobile phone can, according to the detected double-sided screen gesture operation, control the display interface of the flexible screen to quickly enter the multi-select mode. In this way, the user does not need to enter the multi-select mode by clicking a "select" control at a designated position (for example, the upper left corner of the screen).
  • the above-mentioned double-sided screen gesture operation is more convenient and faster, providing users with a better experience.
  • When the flexible screen of the mobile phone is in the folded state, at least two optional controls are displayed on the display interface of the main screen (the main screen is in the bright screen state), and the secondary screen is in the black screen state, the user can use a double-sided screen gesture operation to trigger the display interface of the main screen to enter the multi-select mode, thereby completing the multiple selection operation.
  • the flexible screen of the mobile phone is divided into a main screen 11 and a sub screen 12, and the flexible screen is currently in a folded state.
  • the main screen 11 faces the user and is in a bright screen state
  • the current display interface of the main screen 11 is a picture browsing interface, on which 9 picture controls are displayed
  • the secondary screen 12 faces away from the user and is in a black screen state.
  • When the touch device on the flexible screen detects the user's first operation on the main screen and second operation on the secondary screen, similar to the above-mentioned embodiment, the status monitoring service obtains the physical form and display status of the current flexible screen and sends them, together with the touch data on the primary and secondary screens, to the input management service in the application framework layer.
  • the input management service can determine whether to trigger the display interface of the main screen to enter the multi-select mode according to the physical form, display state, and touch data of the flexible screen. For example, when the flexible screen is in the folded state, at least two optional controls are displayed on the display interface of the main screen, the secondary screen is in the black screen state, the operation time difference between the first operation on the main screen and the second operation on the secondary screen is less than the preset time difference, the touch positions of the first operation and the second operation correspond, and the operation types of the first operation and the second operation are click operations, the input management service can determine that the display interface of the current main screen needs to enter the multi-select mode.
  • the input management service sends an event for instructing the display interface of the main screen to enter the multi-selection mode to the upper application.
  • the input management service may further determine whether the first operation on the main screen corresponds to an optional control on the display interface of the main screen. If the first operation corresponds to an optional control (as shown in FIG. 10, the first operation corresponds to the picture control in the second row and first column of the picture browsing interface), the upper-layer application triggers the display interface of the main screen to enter the multi-select mode according to the event instruction of the input management service, and checks the picture selected by the user on the main screen.
  • In this way, the display interface of the main screen can quickly enter the multi-select mode. If the user's first operation on the main screen also corresponds to an optional control, the mobile phone checks the optional control selected by the user on the main screen at the same time, so that users can quickly perform multiple selection operations on the folded flexible screen, improving the user experience of using a folding mobile phone.
  • When the flexible screen of the mobile phone is in the folded state, at least two optional controls are displayed on the display interface of the secondary screen (the secondary screen is in the bright screen state), and the main screen is in the black screen state, the user can use double-sided screen gesture operations to perform multiple selection operations on the optional controls on the secondary screen in the bright screen state.
  • the implementation principle and technical effect are similar to the first scenario of this embodiment, and will not be repeated here.
  • When the flexible screen of the mobile phone is in the folded state and at least two optional controls are displayed on the display interfaces of both the main screen and the secondary screen, the mobile phone responds to the double-sided screen gesture operation by controlling the display interface of the user-facing screen to enter the multi-select mode, and the user can then perform multiple selection operations in the multi-select mode.
  • the flexible screen is currently in a folded state
  • the main screen 11 of the flexible screen faces the user and the main screen 11 currently displays the main interface.
  • the secondary screen 12 faces away from the user; the secondary screen 12 currently displays the album browsing interface, on which 12 picture controls are displayed.
  • the input management service can determine which screen's display interface enters the multi-select mode according to the physical form, display state, touch data of the flexible screen, and the specific orientation of the mobile phone.
  • the current user-oriented screen is the main screen 11.
  • the mobile phone controls the display interface of the main screen 11 to enter the multi-select mode. Since the first operation on the main screen 11 corresponds to application icon 4, application icon 4 is selected at the same time as the display interface of the main screen 11 enters the multi-select mode. At this time, the secondary screen 12 facing away from the user does not respond (its display interface remains unchanged).
  • Table 2 shows how the mobile phone determines whether to trigger the interface to enter the multi-select mode, and how it responds, according to the physical form of the flexible screen, the display state, the touch data on the main screen and the secondary screen, and the specific orientation of the mobile phone.
  • It should be noted that the user's touch operations on the main screen and the secondary screen are not required to correspond to optional controls on the interface; that is, the touch operations can be at a blank position on the display interface. If a touch operation on the main screen or the secondary screen does correspond to an optional control on the interface, then when the display interface of that screen enters the multi-select mode, the optional control can be directly displayed as selected.
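A minimal sketch of this behavior, assuming hit-testing is reduced to a position lookup (the data shapes and names are illustrative, not the disclosed implementation):

```python
# Sketch of entering multi-select mode: the gesture triggers the mode
# whether or not the touch lands on a control; a control under the touch
# point is preselected, a blank position yields an empty selection.

def enter_multiselect(controls_at, touch_pos):
    """controls_at: mapping of touch position -> control id (or absent)."""
    preselected = controls_at.get(touch_pos)      # None for blank areas
    selected = {preselected} if preselected is not None else set()
    return {"mode": "multiselect", "selected": selected}
```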
  • The prerequisite for triggering the display interface of a screen to enter the multi-select mode in the embodiment of the present application is that at least two selectable controls are displayed on the display interface of the screen (main screen and/or secondary screen). The display interface of the screen can be a specific interface of a certain application, such as the picture browsing interface of the photo album application, which displays multiple picture controls, or the main interface of the mobile phone, on which multiple application icon controls are displayed.
  • a mobile phone is taken as an example of an electronic device, and a method for controlling the movement operation of icons on a double-sided screen provided by an embodiment of the present application will be described in detail with reference to the accompanying drawings.
  • The display interfaces of the main screen and the secondary screen of the flexible screen are both interfaces that can accommodate icon controls (both the main screen and the secondary screen are in the bright screen state), and the display interface of the main screen displays icon controls; the mobile phone detects a double-sided screen gesture operation for triggering the movement of an icon.
  • the double-sided screen gesture operation includes a first operation on the main screen and a second operation on the secondary screen.
  • the operation time difference between the first operation and the second operation is less than the preset time difference, the touch position of the first operation corresponds to the touch position of the second operation, and the first operation corresponds to the first icon control on the main screen.
  • the mobile phone moves the first icon control from the main screen to the secondary screen.
  • the aforementioned icon controls are icon controls that can be moved, and the icon controls on the display interface can be arranged in the form of a nine-square (3×3) grid, a sixteen-square (4×4) grid, and the like.
  • the aforementioned icon control may be an icon control of an application program on the main interface of the mobile phone, or an icon control in a certain application, which is not limited in this application.
  • when the flexible screen is in the folded state, the display interfaces of the main screen and the secondary screen are both interfaces that can accommodate icon controls, and the display interface of the main screen displays icon controls, then after detecting the first operation on the main screen and the second operation on the secondary screen, the mobile phone can judge, according to the touch positions and operation times of the two operations, whether the touch operations on the main screen and the secondary screen meet at least the following three conditions:
  • the operation time difference between the first operation and the second operation is less than the preset time difference;
  • the touch position of the first operation corresponds to the touch position of the second operation;
  • the first operation corresponds to the first icon control on the main screen.
  • If these conditions are met, the mobile phone moves the first icon control from the main screen to the secondary screen.
  • the touch operation on the main screen and the secondary screen should also satisfy that the operation types of the first operation and the second operation are the same, as shown in FIG. 12. That is to say, the mobile phone needs to determine that the above four conditions are all met before triggering the movement of the first icon control from the main screen to the secondary screen.
  • The above four conditions can be judged simultaneously or sequentially; when they are judged sequentially, the order is not limited to the one described above. This embodiment does not impose any limitation on the order in which the above four conditions are judged.
  • the operation type includes any one of the following: pressing operation, clicking operation, double-clicking operation, and long-pressing operation.
  • In one implementation, in response to the double-sided screen gesture operation, the mobile phone moves the first icon control from the main screen to the secondary screen as follows: in response to the double-sided screen gesture operation, if the main screen is the user-facing screen, the mobile phone moves the first icon control from the main screen to the secondary screen.
  • That is to say, the mobile phone also needs to detect its current orientation and further determine, according to that orientation, whether to move the first icon control. In other words, if the mobile phone determines that the screen on which the first icon control to be moved is located faces away from the user, the mobile phone does not trigger the icon movement, thereby avoiding accidental operations.
  • the display interfaces of the main screen and the secondary screen of the flexible screen are both interfaces that can accommodate icon controls.
  • The display interface of the main screen of the above flexible screen displays icon controls, covering the following two cases. In the first case, the display interface of the main screen of the flexible screen displays icon controls (that is, the main screen is in the bright screen state), and the display interface of the secondary screen of the flexible screen is an empty screen (that is, it has no icon controls).
  • the mobile phone detects a double-sided screen gesture operation for triggering icon movement, and in response to the double-sided screen gesture operation, the mobile phone can move the first icon control from the main screen to the secondary screen.
  • icon controls are displayed on the display interface of the main screen of the flexible screen, and icon controls are displayed on the display interface of the secondary screen of the flexible screen.
  • In the second case, the mobile phone detects the double-sided screen gesture operation used to trigger icon movement. If only the first operation on the main screen corresponds to the first icon control, the mobile phone moves the first icon control from the main screen to the secondary screen. If the first operation corresponds to the first icon control and the second operation on the secondary screen corresponds to the second icon control, the mobile phone can control the icon control on the user-facing screen to move to the screen facing away from the user; that is, the specific orientation of the mobile phone determines which screen's icon control moves across screens. If the user-facing screen is the main screen, the mobile phone moves the first icon control from the main screen to the secondary screen; if the user-facing screen is the secondary screen, the mobile phone moves the second icon control from the secondary screen to the main screen.
  • In one implementation, the display interfaces of the main screen and the secondary screen are both interfaces that can accommodate icon controls, at least one icon control is displayed on the display interface of the main screen, and at least one icon control is displayed on the display interface of the secondary screen. In response to the double-sided screen gesture operation, the mobile phone moves the first icon control from the main screen to the front of the at least one icon control on the secondary screen and moves the at least one icon control on the secondary screen backward in sequence; or, in response to the double-sided screen gesture operation, the mobile phone moves the first icon control from the main screen to behind the at least one icon control on the secondary screen.
  • This implementation limits the position to which the first icon control moves on the secondary screen. If there are multiple icon controls on the secondary screen, the first icon control can be moved to the front or the back of those icon controls. For example, suppose the secondary screen originally has two icon controls A and B arranged in order, with A in front of B. The mobile phone can move the icon control C on the main screen to the position of A, with A and B moved backward in sequence; or it can move C behind B, with the positions of A and B on the secondary screen unchanged.
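The A/B/C example above can be sketched with plain lists (an illustrative model and illustrative names, not the disclosed implementation):

```python
# Sketch of the two placements: the moved icon control goes either in
# front of the secondary screen's icons (shifting them back) or after
# the last one; the source screen loses the icon either way.

def move_icon(main_icons, sub_icons, icon, to_front=True):
    main_icons = [i for i in main_icons if i != icon]
    sub_icons = ([icon] + sub_icons) if to_front else (sub_icons + [icon])
    return main_icons, sub_icons
```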
  • In one implementation, the mobile phone moves the first icon control from the main screen to the touch position of the second operation on the secondary screen. It should be noted that there may or may not be an icon control at the touch position of the second operation on the secondary screen. If there is no icon control there, the mobile phone can directly move the first icon control to the touch position of the second operation on the secondary screen; if there is an icon control, refer to the following scheme.
  • In one implementation, the mobile phone moves the first icon control from the main screen to the touch position of the second operation on the secondary screen and moves the third icon control on the secondary screen backward; or, the mobile phone moves the first icon control from the main screen to the touch position of the second operation on the secondary screen and merges the third icon control on the secondary screen with the first icon control. That is to say, if a third icon control is already located at the touch position of the second operation, the mobile phone needs to either move all the icon controls from the touch position onward backward in sequence, or merge the moved first icon control with the icon control at the touch position into the same folder.
  • In the above scheme, when an icon control on the display interface of the flexible screen needs to be moved across screens, if the display interfaces of both screens can accommodate icon controls, the mobile phone can, according to the user's double-sided screen gesture operation on the flexible screen, move the icon control corresponding to the operation across screens. In this way, the user can move the icon control to the other screen without unfolding the mobile phone. The double-sided screen gesture operation is more convenient and faster, and provides a better user experience.
  • When the flexible screen of the mobile phone is in the folded state and the display interfaces of the main screen and the secondary screen are both interfaces that can accommodate icon controls, the user can use a double-sided screen gesture operation to move an icon control from the main screen to the secondary screen.
  • The flexible screen of the mobile phone is divided into a main screen 11 and a secondary screen 12, and the flexible screen is currently in the folded state. The main screen 11 faces the user and is in the bright-screen state; it currently displays the first main interface of the mobile phone, which includes 4 application icons. The secondary screen 12 also faces the user and is in the bright-screen state; it displays the second main interface of the mobile phone, which includes 3 application icons.
  • The status monitoring service acquires the current physical form and display status of the flexible screen, and sends them, together with the touch data on the main screen 11 and the secondary screen 12, to the input management service in the application framework layer.
  • The input management service can determine whether to trigger a cross-screen icon move according to the physical form, display state, and touch data of the flexible screen. For example, when the flexible screen is in the folded state, icon controls are displayed on the display interfaces of both the main screen and the secondary screen, the operation time difference between the first operation on the main screen and the second operation on the secondary screen is less than the preset time difference, the touch position of the first operation corresponds to that of the second operation, the first operation on the main screen corresponds to an icon control (such as application icon 4 on the main screen in Fig. 13a), and the operation types of the first operation and the second operation are the same.
  • In this case, the input management service can determine that the icon control corresponding to the first operation on the main screen needs to be moved to the secondary screen, and can send an event of moving the main-screen icon to the upper-layer application.
  • The mobile phone can then move application icon 4 on the main screen 11 to the corresponding position on the secondary screen 12. If application icon 8 already occupies the corresponding position on the secondary screen 12, then, as shown in Figure 13b, the mobile phone can merge application icon 4 from the main screen 11 and application icon 8 into one folder, or, as shown in Figure 13c, the mobile phone can move application icon 4 from the main screen 11 to the position of application icon 8, with application icon 8 moved backward in sequence.
  • When the mobile phone detects the user's first and second operations on the main and secondary screens, and the first operation corresponds to an icon control on the main screen, that icon control can be quickly moved to the corresponding position on the secondary screen. This makes it convenient for the user to move icon controls across screens while the mobile phone is in the folded state, improving the user experience of a foldable mobile phone.
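The trigger decision described above — folded state, icon-bearing interfaces on both screens, time and position correspondence, matching operation types, and a first operation that lands on an icon control — can be sketched as follows. The `TouchEvent` shape is a hypothetical model for illustration, and the 500 ms / 10 dp thresholds are the example values given later in the description, not mandated values:

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    screen: str        # "main" or "secondary"
    x: float           # touch position in a shared coordinate system (dp)
    y: float
    t_ms: int          # timestamp of initial contact with the screen
    op_type: str       # "tap", "double_tap", "long_press", "press"
    hits_icon: bool    # whether the touch lands on an icon control

MAX_TIME_DIFF_MS = 500   # example preset time difference from the description
MAX_POS_DIFF_DP = 10.0   # example preset position difference from the description

def triggers_cross_screen_move(first: TouchEvent, second: TouchEvent,
                               folded: bool, both_screens_hold_icons: bool) -> bool:
    """Mirror of the input management service's checks described above."""
    if not (folded and both_screens_hold_icons):
        return False
    if abs(first.t_ms - second.t_ms) >= MAX_TIME_DIFF_MS:
        return False
    # the touch positions must correspond after mapping to one coordinate system
    if (abs(first.x - second.x) >= MAX_POS_DIFF_DP or
            abs(first.y - second.y) >= MAX_POS_DIFF_DP):
        return False
    # the first operation must select an icon, and both operations share a type
    return first.hits_icon and first.op_type == second.op_type
```

If all conditions hold, the service would emit the "move main-screen icon" event to the upper-layer application, as the text describes.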
  • When the flexible screen of the mobile phone is in the folded state and the display interfaces of the main screen and the secondary screen are both interfaces that can accommodate icon controls, the user can use a double-sided screen gesture operation to move an icon control from the secondary screen to the main screen.
  • When the flexible screen of the mobile phone is in the folded state and the display interfaces of the main screen and the secondary screen are both interfaces that can accommodate icon controls, the user can use a double-sided screen gesture operation to move an icon control from the main screen (or secondary screen) to the secondary screen (or main screen).
  • When the status monitoring service receives the user's touch operations on the main screen and the secondary screen, in addition to sending the current physical form, display state, and touch data of the flexible screen to the input management service in the application framework layer, it also sends the specific orientation of the mobile phone. For how the status monitoring service determines the specific orientation of the mobile phone, refer to the embodiment described above.
  • The input management service can determine whether to trigger a cross-screen icon move according to the physical form, display state, and touch data of the flexible screen, and the specific orientation of the mobile phone.
  • When the display interfaces of the main screen and the secondary screen are both interfaces that can accommodate icon controls, the touch operations on the main and secondary screens both correspond to icon controls, the operation time difference between the first operation on the main screen and the second operation on the secondary screen is less than the preset time difference, and the touch position of the first operation corresponds to the touch position of the second operation, the input management service can further determine, based on the specific orientation of the mobile phone, whether to move the icon control on the main screen to the secondary screen or the icon control on the secondary screen to the main screen. If the main screen currently faces the user, the mobile phone moves the icon control on the main screen across screens; if the secondary screen currently faces the user, the mobile phone moves the icon control on the secondary screen across screens.
  • Table 3 shows how the mobile phone determines whether to trigger an icon-move response according to the physical form of the flexible screen, the display state, the touch data on the main screen and the secondary screen, and the specific orientation of the mobile phone.
  • In the following, a mobile phone is taken as an example of the electronic device, and a control method for a double-sided screen deletion operation provided in an embodiment of this application is described in detail with reference to the accompanying drawings.
  • The mobile phone detects a double-sided screen gesture operation that triggers deletion of a control. The double-sided screen gesture operation includes a first operation on the main screen and a second operation on the secondary screen; the operation time difference between the first operation and the second operation is less than the preset time difference; the touch position of the first operation corresponds to the touch position of the second operation; the first operation corresponds to a first selectable control on the main screen; and the operation types of the first operation and the second operation are both sliding operations with the same sliding direction. In response to the double-sided screen gesture operation, the mobile phone deletes the first selectable control on the main screen.
  • The above selectable controls refer to controls that can be deleted on the display interface of a screen. For example, a selectable control can be a conversation box control in a chat list interface, a folder control in a file list interface, or an application icon control on the mobile phone's main interface. The aforementioned sliding direction may be the horizontal direction, the vertical direction, or another direction at a certain angle to the horizontal or vertical direction, which is not limited in this application.
  • After the mobile phone detects the first operation on the main screen and the second operation on the secondary screen, it can judge, according to the touch positions and operation times of the first and second operations on the screens, whether the touch operations on the main and secondary screens meet the following four conditions: the operation time difference between the first operation and the second operation is less than the preset time difference; the touch position of the first operation corresponds to the touch position of the second operation; the first operation corresponds to the first selectable control on the main screen; and the operation types of the first operation and the second operation are both sliding operations with the same sliding direction.
  • If all four conditions are met, the mobile phone deletes the first selectable control on the main screen; otherwise, the mobile phone does not trigger deletion of the first selectable control.
  • The above four conditions can be judged simultaneously or sequentially. When they are judged sequentially, the execution order is not limited to the order given above; this embodiment does not limit the execution order of these conditions.
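The four-condition check for the double-sided delete gesture can be sketched as a single predicate. The dictionary shape of the touch records and the concrete threshold defaults are illustrative assumptions; only the four conditions themselves come from the description:

```python
def triggers_double_sided_delete(first: dict, second: dict,
                                 max_dt_ms: int = 500,
                                 max_dpos_dp: float = 10.0) -> bool:
    """Return True when all four conditions described above hold.

    `first`/`second` are assumed dicts with keys: t_ms, x, y, op_type,
    direction, on_selectable_control. The conditions may be evaluated in
    any order (or simultaneously); a boolean AND makes that explicit.
    """
    cond_time = abs(first["t_ms"] - second["t_ms"]) < max_dt_ms
    cond_pos = (abs(first["x"] - second["x"]) < max_dpos_dp and
                abs(first["y"] - second["y"]) < max_dpos_dp)
    cond_target = first["on_selectable_control"]
    cond_slide = (first["op_type"] == "slide" == second["op_type"] and
                  first["direction"] == second["direction"])
    return cond_time and cond_pos and cond_target and cond_slide
```

Because the result is a conjunction, short-circuiting in any order gives the same answer, matching the statement that the execution order of the conditions is not limited.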
  • The mobile phone deleting the first selectable control on the main screen in response to the double-sided screen gesture operation includes: in response to the double-sided screen gesture operation, if the main screen is the user-facing screen, the mobile phone deletes the first selectable control on the main screen.
  • In this implementation, the mobile phone also needs to detect its current specific orientation, and further determine, according to that orientation, whether to delete the first selectable control on the main screen. In other words, if the mobile phone determines that the screen holding the first selectable control to be deleted faces away from the user, the mobile phone does not trigger the deletion, thereby avoiding an erroneous operation by the user.
  • Selectable controls being displayed on the display interface of the main screen of the flexible screen includes the following situations. In the first situation, selectable controls are displayed on the display interface of the main screen (that is, the main screen is in the bright-screen state), and the secondary screen of the flexible screen is in the black-screen state. The mobile phone detects the double-sided screen gesture operation used to trigger deletion of a control, and in response to that gesture operation, the mobile phone can delete the selectable control on the main screen.
  • In the second situation, selectable controls are displayed on the display interface of the main screen, and the secondary screen is in the bright-screen state, but the display interface of the secondary screen has no selectable controls.
  • In the third situation, selectable controls are displayed on the display interfaces of both the main screen and the secondary screen. The mobile phone detects the double-sided screen gesture operation used to trigger deletion of a control. If the first operation on the main screen corresponds to a first selectable control and the second operation on the secondary screen corresponds to a second selectable control, the mobile phone deletes the selectable control corresponding to the operation on the user-facing screen. In other words, the mobile phone needs to determine, according to its current specific orientation, which screen's selectable control should be deleted. If the user-facing screen is the main screen, the mobile phone deletes the first selectable control on the main screen; if the user-facing screen is the secondary screen, the mobile phone deletes the second selectable control on the secondary screen.
  • In a possible implementation, after the mobile phone deletes the first selectable control on the main screen in response to the double-sided screen gesture operation, the method further includes: if there are one or more third selectable controls below the first selectable control, the mobile phone moves the one or more third selectable controls up in sequence.
  • In the above scheme, when a selectable control on the display interface of the flexible screen needs to be deleted, if the current physical form of the flexible screen is the folded state, the mobile phone can, according to the user's double-sided screen gesture operation on the flexible screen, delete the selectable control corresponding to the operation. In this way, the user can use the double-sided screen gesture to slide a selectable control to the edge of the screen or to a designated area on the screen to quickly delete it, and erroneous deletion of controls by a single-screen slide can also be avoided.
  • When the flexible screen of the mobile phone is in the folded state, selectable controls are displayed on the display interface of the main screen (the main screen is in the bright-screen state), and the secondary screen is in the black-screen state, the selectable controls on the main screen can be deleted by a double-sided screen gesture operation.
  • The flexible screen of the mobile phone is divided into a main screen 11 and a secondary screen 12, and the flexible screen is currently in the folded state. The main screen 11 faces the user and is in the bright-screen state; it currently displays a WeChat chat list interface that includes 3 conversation box controls. The secondary screen 12 faces away from the user and is in the black-screen state.
  • When the touch device on the flexible screen detects the user's first operation on the main screen 11 and second operation on the secondary screen 12, similarly to the above embodiment, the status monitoring service acquires the current physical form and display status of the flexible screen and sends them, together with the touch data on the main screen 11 and the secondary screen 12, to the input management service in the application framework layer.
  • The input management service can determine whether to trigger a delete operation according to the physical form, display state, and touch data of the flexible screen. For example, when the flexible screen is in the folded state, selectable controls are displayed on the display interface of the main screen, the secondary screen is in the black-screen state, the operation time difference between the first operation on the main screen and the second operation on the secondary screen is less than the preset time difference, the touch position of the first operation corresponds to that of the second operation, the first operation corresponds to the first selectable control on the main screen, and the operation types of the first and second operations are sliding operations with the same sliding direction, the input management service can determine that the first selectable control on the main screen needs to be deleted, and sends an event instructing deletion of the first selectable control on the main screen to the upper-layer application.
  • The user uses a double-sided screen gesture operation to slide conversation box 2 on the main screen a first distance from left to right, or to the right edge of the screen; or, as shown in Figure 15b, the user slides conversation box 2 from top to bottom a second distance, or to the bottom edge of the screen. The mobile phone can, in response to the double-sided screen gesture operation, delete the conversation box 2 selected by the user from the display interface of the main screen, and the other conversation boxes (such as conversation box 3) move up in sequence.
  • When the flexible screen is in the folded state, selectable controls are displayed on the display interface of the main screen (the main screen is in the bright-screen state), and the secondary screen is in the black-screen state, if the mobile phone detects the user's first operation on the main screen and second operation on the secondary screen, and the first operation corresponds to a selectable control on the main screen, then once the mobile phone determines that the above operations constitute a double-sided screen gesture operation, the selectable control selected by the user on the main screen can be deleted. This makes it convenient for the user to perform a quick delete operation on the foldable flexible screen, improving the user experience of a foldable mobile phone.
  • When the flexible screen of the mobile phone is in the folded state, selectable controls are displayed on the display interface of the secondary screen (the secondary screen is in the bright-screen state), and the main screen is in the black-screen state, the user can delete the selectable controls on the secondary screen in the bright-screen state by a double-sided screen gesture operation. The implementation principle and technical effect are similar to the first scenario of this embodiment, and are not repeated here.
  • When the flexible screen of the mobile phone is in the folded state and selectable controls are displayed on the display interfaces of both the main screen and the secondary screen, the mobile phone deletes the selectable controls on the user-facing screen in response to the double-sided screen gesture operation.
  • The flexible screen is currently in the folded state. The main screen 11 of the flexible screen faces the user and currently displays a file list interface that includes 2 folder controls. The secondary screen 12 of the flexible screen faces away from the user and currently displays the WeChat chat list interface, which includes 3 conversation box controls.
  • The input management service can determine which screen's selectable control to delete according to the physical form, display state, and touch data of the flexible screen, and the specific orientation of the mobile phone. As shown in Figure 16, the current user-facing screen is the main screen 11, so the mobile phone deletes folder 2 on the main screen 11.
  • Table 4 shows how the mobile phone determines whether to trigger a control-deletion response according to the physical form of the flexible screen, the display state, the touch data on the main screen and the secondary screen, and the specific orientation of the mobile phone.
  • An embodiment of this application discloses an electronic device including a processor, and a memory, an input device, and an output device connected to the processor. The input device and the output device can be integrated into one device; for example, the touch device of the flexible screen can be used as the input device, and the display of the flexible screen can be used as the output device.
  • The above electronic device may include: a flexible screen 1701, the flexible screen 1701 including a touch device 1706 and a display 1707; one or more processors 1702; one or more memories 1703; and one or more sensors 1708. The memory 1703 stores one or more application programs (not shown) and one or more programs 1704, and the above devices can communicate through one or more communication buses 1705.
  • the one or more programs 1704 are stored in the aforementioned memory 1703 and are configured to be executed by the one or more processors 1702, so that the electronic device executes the steps in the aforementioned embodiments.
  • All the relevant content of the steps involved in the above method embodiments can be incorporated by reference into the functional description of the corresponding physical device, and is not repeated here.
  • For example, the above processor 1702 may specifically be the processor 110 shown in FIG. 1; the above memory 1703 may specifically be the internal memory 121 and/or the external memory 120 shown in FIG. 1; the above flexible screen 1701 may specifically be the flexible screen 301 shown in FIG. 1; and the above sensor 1708 may specifically be the gyroscope sensor 180B, the acceleration sensor 180E, or the proximity sensor 180G in the sensor module 180 shown in FIG. 1, which is not limited in this embodiment of this application.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • The above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • Based on this understanding, the technical solutions of the embodiments of this application essentially, or the part contributing to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of this application.
  • The aforementioned storage media include: flash memory, removable hard disks, read-only memory, random access memory, magnetic disks, optical discs, and other media that can store program code.

Abstract

A control method for an electronic device having a flexible screen, and an electronic device. According to the physical form of the flexible screen, its display state, and the user's touch operations on the double-sided screen, the method determines whether to trigger a response of screenshot capture, multi-selection of controls, icon movement, or control deletion, improving the human-computer interaction efficiency of the electronic device and providing the user with a better experience. The physical form of the flexible screen includes an unfolded state and a folded state; when the flexible screen is in the folded state, it is divided into a main screen and a secondary screen. The method includes: when the flexible screen is in the bright-screen state, the electronic device detects a first operation by the user on the main screen and a second operation on the secondary screen, where the operation time difference between the first operation and the second operation is less than a preset time difference and the touch positions of the two operations correspond; and in response to these operations on the main screen and the secondary screen, the electronic device takes a screenshot of the flexible screen in the bright-screen state.

Description

Control method for an electronic device having a flexible screen, and electronic device
This application claims priority to Chinese Patent Application No. 201910935961.4, filed with the China National Intellectual Property Administration on September 29, 2019 and entitled "Control Method for an Electronic Device Having a Flexible Screen, and Electronic Device", which is incorporated herein by reference in its entirety.
Technical Field
Embodiments of this application relate to the field of terminal technologies, and in particular, to a control method for an electronic device having a flexible screen, and an electronic device.
Background
With the continuous development of electronic devices, devices with display screens (such as mobile phones and tablet computers) have become an indispensable part of people's daily life and work. To provide users with richer displayed information, the display screens of electronic devices have grown larger and larger, but an oversized screen harms portability. Therefore, electronic devices with flexible screens have become the development direction of future electronic devices.
A flexible screen, also called a flexible OLED (organic light-emitting diode) screen, is not only thinner and lighter than a traditional screen but also, owing to its bendability and good pliability, far more durable. At present, some device vendors have applied flexible screens to electronic devices such as mobile phones and tablet computers. With the flexible screen in the folded or unfolded state, the user can trigger a screenshot of the screen in the bright-screen state (a single screen or the entire flexible screen) by operating a function key combination of the phone (for example, the combination of the volume key and the power key) or a system-level button. However, a function key combination usually requires both hands and demands precise press timing, while a system-level button requires multiple steps to trigger a screenshot, making the operation cumbersome. It can be seen that the current screenshot methods of foldable phones are not convenient enough, and their human-computer interaction efficiency is low. Similar problems exist for operations such as multi-selection of controls, icon movement, and control deletion.
Summary
This application provides a control method for an electronic device having a flexible screen, and an electronic device. The electronic device can perform screenshot capture, multi-selection of controls, icon movement, and control deletion in response to a double-sided screen gesture operation, improving human-computer interaction efficiency.
In a first aspect, this application provides a control method for an electronic device having a flexible screen. The physical form of the flexible screen may include an unfolded state and a folded state; when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen. Specifically, the control method includes: when the flexible screen is in the bright-screen state, the electronic device detects a double-sided screen gesture operation used to trigger a screenshot, the double-sided screen gesture operation including a first operation on the first screen and a second operation on the second screen, where the operation time difference between the first operation and the second operation is less than a preset time difference and the touch position of the first operation corresponds to the touch position of the second operation; and in response to the double-sided screen gesture operation, the electronic device takes a screenshot of the flexible screen in the bright-screen state.
For example, the electronic device may calculate the angle between the first screen and the second screen based on data detected by the accelerometer and the gyroscope, and then determine, based on the angle, whether the flexible screen is in the unfolded state or the folded state. Of course, the electronic device may also determine the current physical form of the flexible screen in other ways; for example, the electronic device may use a distance sensor to detect the distance between the main screen and the secondary screen and determine the current physical form of the flexible screen based on that distance. This is not limited in the embodiments of this application.
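One plausible way to turn the sensor data into a folded/unfolded decision is to estimate a normal vector for each half-screen from the fused accelerometer/gyroscope data and classify by the angle between them. The convention (0° = normals parallel = fully unfolded, 180° = antiparallel = fully folded outward) and the 120° threshold below are illustrative assumptions, not values from the patent:

```python
import math

def angle_between(n1, n2) -> float:
    """Angle in degrees between the normal vectors of the two half-screens,
    as estimated from accelerometer/gyroscope fusion."""
    dot = sum(a * b for a, b in zip(n1, n2))
    norm = math.sqrt(sum(a * a for a in n1)) * math.sqrt(sum(b * b for b in n2))
    # clamp to guard against floating-point drift outside [-1, 1]
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def physical_form(n_main, n_secondary, threshold_deg: float = 120.0) -> str:
    """Classify the flexible screen's physical form from the two normals."""
    return "folded" if angle_between(n_main, n_secondary) > threshold_deg else "unfolded"
```

A real device would instead read a calibrated hinge angle from its sensor stack; this sketch only shows the geometric idea behind the angle-based classification the text describes.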
It should be understood that the operation time of an operation triggered by the user on a screen (the first screen or the second screen) refers to the moment at which the user's finger or palm first touches the screen. Accordingly, the operation time difference between the first operation and the second operation refers to the time difference between the moment the user's finger or palm first touches the first screen and the moment it first touches the second screen. For example, the preset time difference may be set to 500 ms or less. That the touch position of the first operation corresponds to the touch position of the second operation may be understood as the position difference between the touch positions of the two operations being less than a preset position difference. Because the touch positions of the first and second operations lie on different screens, the mobile phone needs to convert the touch points of the two operations into the same screen coordinate system through a coordinate transformation, for example, the first-screen coordinate system, the second-screen coordinate system, or another coordinate system distinct from both. For example, the preset position difference may be set to 10 dp.
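The time-difference and position-correspondence checks above can be sketched as follows, using the example values of 500 ms and 10 dp. The mirrored mapping in `secondary_to_main` is an illustrative assumption for back-to-back folded screens of equal width; an actual device would apply its own calibrated coordinate transformation:

```python
PRESET_TIME_DIFF_MS = 500   # example value given in the description
PRESET_POS_DIFF_DP = 10.0   # example value given in the description

def secondary_to_main(x_dp: float, y_dp: float, screen_width_dp: float):
    """Map a second-screen touch point into the first screen's coordinate
    system. Assumption: the screens are folded back-to-back, so the
    horizontal axis is mirrored while the vertical axis is shared."""
    return screen_width_dp - x_dp, y_dp

def is_double_sided_gesture(t1_ms, p1, t2_ms, p2, screen_width_dp) -> bool:
    """Check the time-difference and position-correspondence conditions
    for touches p1 (first screen) and p2 (second screen), each (x, y) in dp."""
    if abs(t1_ms - t2_ms) >= PRESET_TIME_DIFF_MS:
        return False
    mx, my = secondary_to_main(p2[0], p2[1], screen_width_dp)
    return (abs(p1[0] - mx) < PRESET_POS_DIFF_DP and
            abs(p1[1] - my) < PRESET_POS_DIFF_DP)
```

With a 360 dp wide screen, a touch at (350, 200) on the first screen corresponds to a touch at (10, 200) on the second screen, since the two points sit back-to-back through the fold.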
It should be noted that the above "first screen" and "second screen" are merely used to distinguish display areas of the flexible screen of the electronic device, and do not indicate the importance or primacy of either screen.
In the above scheme, when a screenshot of the flexible screen needs to be taken, if the current physical form of the flexible screen is the folded state, the electronic device can take a screenshot of the corresponding screen according to the user's double-sided screen gesture operation on the flexible screen and the current display state of the flexible screen (bright-screen state or black-screen state). In this way, the user does not need to pull up the notification bar or an operation control (such as a virtual button) on the flexible screen and then tap the corresponding screenshot control to capture the screen content. The double-sided screen gesture operation is more convenient and faster, and provides a better user experience.
In a possible implementation, the electronic device taking a screenshot of the flexible screen in the bright-screen state in response to the double-sided screen gesture operation includes: if the first screen is in the bright-screen state and the second screen is in the black-screen state, the electronic device takes a screenshot of the first screen in response to the double-sided screen gesture operation; if the first screen is in the black-screen state and the second screen is in the bright-screen state, the electronic device takes a screenshot of the second screen in response to the double-sided screen gesture operation; and if both the first screen and the second screen are in the bright-screen state, the electronic device takes a screenshot of the entire flexible screen in response to the double-sided screen gesture operation. For example, the electronic device may obtain the display parameters of the first screen and the second screen through the display driver, so as to determine the display state of the flexible screen before the user triggers the double-sided screen gesture operation, for example, the first screen being in the bright-screen state and the second screen being in the black-screen state. In this implementation, the electronic device can take a screenshot of one screen in the bright-screen state, or of the entire screen, according to the screen display state before the user triggers the double-sided screen gesture operation.
In a possible implementation, after the electronic device takes a screenshot of the flexible screen in the bright-screen state, the method further includes: displaying a screenshot preview interface. For example, the screenshot preview interface may include at least one of the following processing controls: a save control, an edit control, a share control, or a cancel control.
In a possible implementation, displaying the screenshot preview interface includes: if the first screen is in the bright-screen state and the second screen is in the black-screen state, displaying the screenshot preview interface on the first screen; if the first screen is in the black-screen state and the second screen is in the bright-screen state, displaying the screenshot preview interface on the second screen; and if both the first screen and the second screen are in the bright-screen state, displaying the screenshot preview interface on the user-facing screen, the user-facing screen being the first screen or the second screen. In this implementation, after completing the screenshot, the electronic device can display the screenshot preview interface on the screen in the bright-screen state; if both screens of the flexible screen are in the bright-screen state, the electronic device needs to determine the screen facing the user and display the screenshot preview interface on it. For example, the electronic device determines the user-facing screen based on detection data reported by at least one of an infrared sensor, a camera, a proximity light sensor, or the touch device of the flexible screen.
In a possible implementation, after the screenshot preview interface is displayed, the method further includes: the electronic device detects a third operation used to trigger selection of one processing control on the screenshot preview interface; and in response to the third operation, the electronic device performs, on the screenshot, the processing corresponding to the selected processing control.
It can be seen that after completing the screenshot, the electronic device displays the screenshot preview interface on the screen, and the user can tap the corresponding processing control on the preview interface to save, edit, share, or cancel the screenshot. For example, the user may tap the edit control on the screenshot preview interface to add text, stickers, or filters to the screenshot; or the user may tap the share control on the screenshot preview interface to send the screenshot to friends or family through an application.
In a possible implementation, the operation types of the first operation and the second operation are the same, and the operation type includes any one of the following: a press operation, a tap operation, a double-tap operation, or a long-press operation. The press operation, also called a pressure-sensitive operation, should satisfy the following conditions: the duration of the user's touch operation on the screen is greater than or equal to a preset duration, the coordinate position of the touch point does not change, and the pressure value of the touch operation is greater than or equal to a preset pressure value. A tap operation should satisfy: the duration of the user's touch operation on the screen is less than the preset duration. A double-tap operation should satisfy: the user triggers two tap operations on the screen, and the time interval between the two tap operations is less than a preset time interval. A long-press operation should satisfy the following conditions: the duration of the user's touch operation on the screen is greater than or equal to the preset duration, and the pressure value of the touch operation is less than the preset pressure value.
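The operation-type definitions above translate directly into a small classifier. The concrete threshold values are assumptions for illustration; the text only states that they are "preset":

```python
PRESET_DURATION_MS = 400      # assumed preset duration
PRESET_PRESSURE = 0.5         # assumed normalized preset pressure value
PRESET_TAP_INTERVAL_MS = 300  # assumed preset double-tap interval

def classify_touch(duration_ms: int, pressure: float, moved: bool) -> str:
    """Classify one completed touch per the definitions above:
    tap        -> duration below the preset duration
    press      -> duration at/above preset, touch point unmoved, pressure
                  at/above the preset pressure value
    long_press -> duration at/above preset, pressure below the preset value
    """
    if duration_ms < PRESET_DURATION_MS:
        return "tap"
    if pressure >= PRESET_PRESSURE and not moved:
        return "press"
    return "long_press"

def is_double_tap(t_first_tap_ms: int, t_second_tap_ms: int) -> bool:
    """Two taps whose interval is below the preset interval form a double tap."""
    return abs(t_second_tap_ms - t_first_tap_ms) < PRESET_TAP_INTERVAL_MS
```

The double tap is detected one level up, from two consecutive taps, rather than from a single touch, which matches the definition given in the text.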
In a second aspect, this application provides a control method for an electronic device having a flexible screen. The physical form of the flexible screen includes an unfolded state and a folded state; when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen. Specifically, the control method includes: when at least two selectable controls are displayed on the display interface of the first screen, the electronic device detects a double-sided screen gesture operation used to trigger multi-selection, the double-sided screen gesture operation including a first operation on the first screen and a second operation on the second screen, where the operation time difference between the first operation and the second operation is less than a preset time difference and the touch position of the first operation corresponds to the touch position of the second operation; and in response to the double-sided screen gesture operation, the electronic device controls the display interface of the first screen to enter a multi-selection mode, where in the multi-selection mode, the at least two selectable controls on the display interface of the first screen can all be selected.
It should be noted that the above selectable controls refer to controls on the display interface of a screen that can be selected and processed. For example, a selectable control may be a picture control on the picture browsing interface of an album; the user can select multiple picture controls at the same time and stitch the pictures together (picture editing), send them to friends in a batch (picture sharing), or add them to a new album folder (picture organizing). The multi-selection mode refers to an operation mode in which the user can select at least two selectable controls on the display interface of a screen and process them in a batch.
In the above scheme, when at least two selectable controls on the display interface of the flexible screen need to be processed in a batch, if the current physical form of the flexible screen is the folded state, the electronic device can, according to the user's double-sided screen gesture operation on the flexible screen, control the display interface of the corresponding screen to quickly enter the multi-selection mode. In this way, the user does not need to tap a "Select" control at a designated position (for example, the upper-left corner of the screen) to enter the multi-selection mode. The double-sided screen gesture operation is more convenient and faster, and provides a better user experience.
In a possible implementation, when the flexible screen is in the folded state, at least two selectable controls are displayed on the display interface of the first screen, and the second screen is in the black-screen state, or the second screen is in the bright-screen state but there is no selectable control on its display interface, the electronic device detects the double-sided screen gesture operation used to trigger multi-selection, and in response to the double-sided screen gesture operation, the electronic device can control the display interface of the first screen to enter the multi-selection mode.
In a possible implementation, the electronic device controlling the display interface of the first screen to enter the multi-selection mode in response to the double-sided screen gesture operation includes: in response to the double-sided screen gesture operation, if the first screen is the user-facing screen, the electronic device controls the display interface of the first screen to enter the multi-selection mode. In this implementation, the electronic device also needs to detect its current specific orientation, and further determine, according to that orientation, whether to control the display interface of the first screen to enter the multi-selection mode. In other words, if the screen to be controlled faces away from the user, the electronic device does not trigger the corresponding control, thereby avoiding an erroneous operation by the user.
In a possible implementation, when at least two selectable controls are displayed on the display interface of the first screen and at least two selectable controls are displayed on the display interface of the second screen, in response to the double-sided screen gesture operation, the electronic device controls the display interface of the user-facing screen to enter the multi-selection mode, the user-facing screen being the first screen or the second screen.
In this implementation, the display interfaces of both screens of the electronic device have selectable controls, and the electronic device determines, according to its current specific orientation, which screen's display interface enters the multi-selection mode. If the user-facing screen is the first screen, the electronic device controls the display interface of the first screen to enter the multi-selection mode; if the user-facing screen is the second screen, the electronic device controls the display interface of the second screen to enter the multi-selection mode.
In a possible implementation, the electronic device controlling the display interface of the first screen to enter the multi-selection mode in response to the double-sided screen gesture operation includes: if the touch position of the first operation on the first screen corresponds to a first selectable control, the first selectable control being one of the selectable controls displayed on the first screen, then in response to the double-sided screen gesture operation, the electronic device controls the display interface of the first screen to enter the multi-selection mode and displays the first selectable control as selected. Through this implementation, the user can directly select one selectable control on the first screen while triggering the display interface of the first screen to enter the multi-selection mode.
In a possible implementation, after the electronic device controls the display interface of the first screen to enter the multi-selection mode in response to the double-sided screen gesture operation, the method further includes: the electronic device detects a third operation used to trigger selection of a selectable control on the display interface of the first screen, the operation type of the third operation being a tap operation or a sliding operation; and in response to the third operation, the electronic device displays the one or more controls corresponding to the third operation as selected. Through this implementation, the user can add one or more selectable controls to complete the multi-selection operation.
In a possible implementation, the operation types of the first operation and the second operation are the same, and the operation type includes any one of the following: a press operation, a tap operation, a double-tap operation, or a long-press operation.
In a third aspect, this application provides a control method for an electronic device having a flexible screen. The physical form of the flexible screen includes an unfolded state and a folded state; when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen. Specifically, the control method includes: when the display interfaces of the first screen and the second screen are both interfaces that can accommodate icon controls, and an icon control is displayed on the display interface of the first screen, the electronic device detects a double-sided screen gesture operation used to trigger an icon move, the double-sided screen gesture operation including a first operation on the first screen and a second operation on the second screen, where the operation time difference between the first operation and the second operation is less than a preset time difference, the touch position of the first operation corresponds to the touch position of the second operation, and the first operation corresponds to a first icon control on the first screen; and in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to the second screen.
It should be noted that the above icon controls are icon controls that can be moved, and the icon controls on a display interface may be arranged in a 3×3 grid, 4×4 grid, or similar layout. Specifically, the icon control may be an icon control of an application on the main interface of the electronic device, or an icon control within an application, which is not limited in this application.
In the above scheme, when an icon control on the display interface of the flexible screen needs to be moved across screens, if the current physical form of the flexible screen is the folded state and the display interfaces of both screens can accommodate icon controls, the electronic device can, according to the user's double-sided screen gesture operation on the flexible screen, move the icon control corresponding to the operation across screens. In this way, the user can move the icon control to the other screen without unfolding the electronic device. The double-sided screen gesture operation is more convenient and faster, and provides a better user experience.
In a possible implementation, when the flexible screen is in the folded state, an icon control is displayed on the display interface of the first screen, and the display interface of the second screen is empty (that is, has no icon control), the electronic device detects the double-sided screen gesture operation used to trigger an icon move, and in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to the second screen.
In a possible implementation, the electronic device moving the first icon control from the first screen to the second screen in response to the double-sided screen gesture operation includes: in response to the double-sided screen gesture operation, if the first screen is the user-facing screen, the electronic device moves the first icon control from the first screen to the second screen.
In this implementation, the electronic device also needs to detect its current specific orientation, and further determine, according to that orientation, whether to move the first icon control. In other words, if the electronic device determines that the screen holding the first icon control to be moved faces away from the user, the electronic device does not trigger the icon move, thereby avoiding an erroneous operation by the user.
In a possible implementation, when the display interfaces of the first screen and the second screen are both interfaces that can accommodate icon controls, at least one icon control is displayed on the display interface of the first screen, and at least one icon control is displayed on the display interface of the second screen, if the first operation corresponds to the first icon control on the display interface of the first screen and the second operation corresponds to a second icon control on the display interface of the second screen, then in response to the double-sided screen gesture operation: if the first screen faces the user, the electronic device moves the first icon control from the first screen to the second screen; if the second screen faces the user, the electronic device moves the second icon control from the second screen to the first screen.
In this implementation, the display interfaces of both screens have icon controls and the touch operations on both screens correspond to icon controls, so the electronic device can further determine, according to its current specific orientation, which screen's icon control should be moved. If the user-facing screen is the first screen, the electronic device moves the icon control on the first screen to the second screen; if the user-facing screen is the second screen, the electronic device moves the icon control on the second screen to the first screen.
In a possible implementation, when the display interfaces of the first screen and the second screen are both interfaces that can accommodate icon controls, at least one icon control is displayed on the display interface of the first screen, and at least one icon control is displayed on the display interface of the second screen, in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to the front of the at least one icon control on the second screen and moves the at least one icon control on the second screen backward in sequence; or, in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to behind the at least one icon control on the second screen.
This implementation limits the position to which the first icon control is moved on the second screen. If the second screen already holds multiple icon controls, the first icon control can be moved to the very front or the very back of those controls. For example, suppose the second screen originally holds two icon controls A and B arranged in order, with A in front of B. The electronic device can move icon control C from the first screen to A's position, with A and B shifted backward in sequence; or C can be moved to behind B, with the positions of A and B on the second screen unchanged.
In a possible implementation, the electronic device moving the first icon control from the first screen to the second screen includes: the electronic device moves the first icon control from the first screen to the touch position of the second operation on the second screen.
It should be noted that there may or may not be an icon control at the touch position of the second operation on the second screen. If there is none, the electronic device can move the first icon control directly to that touch position. If there is one, refer to the following scheme.
In a possible implementation, if there is a third icon control at the touch position on the second screen, the electronic device moving the first icon control from the first screen to the touch position of the second operation on the second screen includes: the electronic device moves the first icon control from the first screen to the touch position of the second operation on the second screen and moves the third icon control on the second screen backward; or, the electronic device moves the first icon control from the first screen to the touch position of the second operation on the second screen and merges the third icon control on the second screen with the first icon control. In other words, if an icon control already occupies the touch position of the second operation on the second screen, the moved first icon control is still placed at that touch position, and the electronic device needs to shift all icon controls behind that position backward in sequence, or directly merge the moved first icon control with the icon control at that position into the same folder.
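The two alternatives — shift the occupant (and everything after it) backward, or merge the two icons into a folder — can be modeled on plain lists. This is a minimal sketch of the data manipulation only; names and the tuple-as-folder representation are illustrative assumptions:

```python
def move_icon_across(src: list, dst: list, icon, target_index: int,
                     merge: bool = False) -> None:
    """Move `icon` from the source screen's icon list to the destination
    list at target_index. If a control already occupies that slot, either
    shift it (and all controls after it) back by one, or merge the two
    into a folder, modeled here as a tuple."""
    src.remove(icon)
    if merge and target_index < len(dst):
        dst[target_index] = (dst[target_index], icon)   # folder of both icons
    else:
        dst.insert(target_index, icon)                  # later icons shift back

# Example matching the A/B/C scenario described earlier:
main, secondary = ["C"], ["A", "B"]
move_icon_across(main, secondary, "C", 0)
# secondary is now ["C", "A", "B"]: A and B shifted backward in sequence
```

Passing `merge=True` instead would turn slot 0 into the folder `("A", "C")` while leaving B in place.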
In a possible implementation, the operation types of the first operation and the second operation are the same, and the operation type includes any one of the following: a press operation, a tap operation, a double-tap operation, or a long-press operation.
In a fourth aspect, this application provides a control method for an electronic device having a flexible screen. The physical form of the flexible screen includes an unfolded state and a folded state; when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen. Specifically, the control method includes: when a selectable control is displayed on the display interface of the first screen, the electronic device detects a double-sided screen gesture operation used to trigger deletion of a control, the double-sided screen gesture operation including a first operation on the first screen and a second operation on the second screen, where the operation time difference between the first operation and the second operation is less than a preset time difference, the touch position of the first operation corresponds to the touch position of the second operation, the first operation corresponds to a first selectable control on the first screen, and the operation types of the first operation and the second operation are both sliding operations with the same sliding direction; and in response to the double-sided screen gesture operation, the electronic device deletes the first selectable control on the first screen.
It should be noted that the above selectable controls refer to controls on the display interface of a screen that can be deleted; for example, a selectable control may be a conversation box control in a chat list interface, a folder control in a file list interface, an application icon control on the main interface of the mobile phone, and so on. The sliding direction may be the horizontal direction, the vertical direction, or another direction at a certain angle to the horizontal or vertical direction, which is not limited in this application.
In the above scheme, when a selectable control on the display interface of the flexible screen needs to be deleted, if the current physical form of the flexible screen is the folded state, the electronic device can, according to the user's double-sided screen gesture operation on the flexible screen, delete the selectable control corresponding to the operation. In this way, the user can use the double-sided screen gesture to slide a selectable control to the edge of the screen or to a designated area on the screen to quickly delete it, and erroneous deletion of controls by a single-screen slide can also be avoided.
In a possible implementation, when the flexible screen is in the folded state, a selectable control is displayed on the display interface of the first screen, and the second screen is in the black-screen state, or the second screen is in the bright-screen state but there is no selectable control on its display interface, the electronic device detects the double-sided screen gesture operation used to trigger deletion of a control, and in response to the double-sided screen gesture operation, the electronic device deletes the first selectable control on the first screen.
In a possible implementation, the electronic device deleting the first selectable control on the first screen in response to the double-sided screen gesture operation includes: in response to the double-sided screen gesture operation, if the first screen is the user-facing screen, the electronic device deletes the first selectable control on the first screen. In this implementation, the electronic device also needs to detect its current specific orientation, and further determine, according to that orientation, whether to delete the first selectable control on the first screen. In other words, if the electronic device determines that the screen holding the first selectable control to be deleted faces away from the user, the electronic device does not trigger the deletion, thereby avoiding an erroneous operation by the user.
In a possible implementation, when a selectable control is displayed on the display interface of the first screen and a selectable control is displayed on the display interface of the second screen, if the first operation corresponds to the first selectable control on the first screen and the second operation corresponds to a second selectable control on the second screen, then in response to the double-sided screen gesture operation: if the first screen faces the user, the electronic device deletes the first selectable control on the first screen; if the second screen faces the user, the electronic device deletes the second selectable control on the second screen.
In this implementation, the display interfaces of both screens have selectable controls and the touch operations on both screens correspond to selectable controls, so the electronic device can further determine, according to its current specific orientation, which screen's selectable control should be deleted. If the user-facing screen is the first screen, the electronic device deletes the selectable control on the first screen; if the user-facing screen is the second screen, the electronic device deletes the selectable control on the second screen.
In a possible implementation, after the electronic device deletes the first selectable control on the first screen in response to the double-sided screen gesture operation, the method further includes: if there are one or more third selectable controls below the first selectable control, the electronic device moves the one or more third selectable controls up in sequence.
In a fifth aspect, this application provides an electronic device, including: a flexible screen, one or more processors, one or more memories, and one or more sensors. The flexible screen includes a display and a touch device; the physical form of the flexible screen includes an unfolded state and a folded state; when the flexible screen is in the folded state, the flexible screen is divided into a main screen and a secondary screen. The memory stores one or more application programs and one or more programs, where the one or more programs include instructions that, when executed by the electronic device, cause the electronic device to perform the control method according to any one of the above.
In a sixth aspect, this application provides a computer-readable storage medium storing instructions that, when run on an electronic device, cause the electronic device to perform the control method according to any one of the above.
In a seventh aspect, this application provides a computer program product containing instructions that, when run on an electronic device, causes the electronic device to perform the control method according to any one of the above.
It can be understood that the electronic device according to the fifth aspect, the computer-readable storage medium according to the sixth aspect, and the computer program product according to the seventh aspect are all configured to perform the corresponding control methods provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding control methods provided above, and details are not repeated here.
Brief Description of Drawings
FIG. 1 is a first schematic structural diagram of an electronic device according to an embodiment of this application;
FIG. 2 is a second schematic structural diagram of an electronic device according to an embodiment of this application;
FIG. 3a is a first schematic architectural diagram of an operating system in an electronic device according to an embodiment of this application;
FIG. 3b is a second schematic architectural diagram of an operating system in an electronic device according to an embodiment of this application;
FIG. 4 is a schematic diagram of a holding posture when a user performs a double-sided screen gesture operation according to an embodiment of this application;
FIG. 5 is a third schematic structural diagram of an electronic device according to an embodiment of this application;
FIG. 6 is a first schematic flowchart of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 7 is a first schematic scenario diagram of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 8 is a second schematic scenario diagram of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 9 is a second schematic flowchart of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 10 is a third schematic scenario diagram of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 11 is a fourth schematic scenario diagram of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 12 is a third schematic flowchart of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 13a is a fifth schematic scenario diagram of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 13b is a sixth schematic scenario diagram of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 13c is a seventh schematic scenario diagram of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 14 is a fourth schematic flowchart of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 15a is an eighth schematic scenario diagram of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 15b is a ninth schematic scenario diagram of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 16 is a tenth schematic scenario diagram of a control method for an electronic device having a flexible screen according to an embodiment of this application;
FIG. 17 is a fourth schematic structural diagram of an electronic device according to an embodiment of this application.
Detailed Description
The implementations of the embodiments are described in detail below with reference to the accompanying drawings.
The control method for an electronic device having a flexible screen provided in the embodiments of this application can be applied to electronic devices having a flexible screen, such as mobile phones, tablet computers, notebook computers, ultra-mobile personal computers (UMPC), handheld computers, netbooks, personal digital assistants (PDA), wearable devices, and virtual reality devices, which is not limited in the embodiments of this application.
Taking the mobile phone 100 as an example of the above electronic device, FIG. 1 shows a schematic structural diagram of the mobile phone.
The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a radio frequency module 150, a communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a flexible screen 301, a subscriber identification module (SIM) card interface 195, and the like.
可以理解的是,本申请实施例示意的结构并不构成对手机100的具体限定。在本申请另一些实施例中,手机100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是手机100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线，包括一根串行数据线(serial data line,SDA)和一根串行时钟线(serial clock line,SCL)。在一些实施例中，处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K，充电器，闪光灯，摄像头193等。例如：处理器110可以通过I2C接口耦合触摸传感器180K，使处理器110与触摸传感器180K通过I2C总线接口通信，实现手机100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与通信模块160。例如:处理器110通过UART接口与通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与柔性屏幕301,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现手机100的拍摄功能。处理器110和柔性屏幕301通过DSI接口通信,实现手机100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,柔性屏幕301,通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为手机100充电,也可以用于手机100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对手机100的结构限定。在本申请另一些实施例中,手机100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过手机100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142，充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入，为处理器110，内部存储器121，外部存储器，柔性屏幕301，摄像头193，和通信模块160等供电。电源管理模块141还可以用于监测电池容量，电池循环次数，电池健康状态(漏电,阻抗)等参数。在其他一些实施例中，电源管理模块141也可以设置于处理器110中。在另一些实施例中，电源管理模块141和充电管理模块140也可以设置于同一个器件中。
手机100的无线通信功能可以通过天线1,天线2,射频模块150,通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。手机100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
射频模块150可以提供应用在手机100上的包括2G/3G/4G/5G等无线通信的解决方案。射频模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。射频模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。射频模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,射频模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,射频模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过柔性屏幕301显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与射频模块150或其他功能模块设置在同一个器件中。
通信模块160可以提供应用在手机100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(Bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。通信模块160可以是集成至少一个通信处理模块的一个或多个器件。通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,手机100的天线1和射频模块150耦合,天线2和通信模块160耦合,使得手机100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
手机100通过GPU,柔性屏幕301,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接柔性屏幕301和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。在本申请实施例中,柔性屏幕301中可包括显示器和触控器件。显示器用于向用户输出显示内容,触控器件用于接收用户在柔性屏幕301上输入的触控事件。应理解,本文所涉及的触控器件可以与显示器一起集成在柔性屏幕中,该触控器件与显示器保持连接;该触控器件也可以独立设置,并与柔性屏幕的显示器保持连接。
在本申请实施例中,柔性屏幕301的物理形态可以划分为折叠状态、支架状态和展开状态。在另一些实施例中,柔性屏幕301的物理形态可以仅划分为折叠状态和展开状态。
如图2中的(a)所示,柔性屏幕301在展开状态下可作为一块完整的显示区域进行显示,用户可以沿柔性屏幕301中的一条或多条折叠线折叠屏幕。其中,折叠线的位置可以是预先设置的,也可以是用户在柔性屏幕301中任意选择的。
如图2中的(b)所示,用户沿柔性屏幕301中的折叠线AB折叠柔性屏幕301后,柔性屏幕301可沿AB折叠线被划分为两个显示区域。在本申请实施例中,折叠后的两个显示区域可以作为两个独立的显示区域进行显示。例如,可以将折叠线AB右侧显示区域称为手机100的主屏11,将折叠线AB左侧显示区域称为手机100的副屏12。主屏11和副屏12的显示面积可以相同或不同。需要说明的是,此处的主屏和副屏仅为区分两侧的显示区域,并不代表屏幕的重要性或主次;也可以将主屏和副屏分别称为第一屏和第二屏,本发明实施例对此不做限定。
当用户折叠柔性屏幕301之后,被划分出的主屏11和副屏12之间呈一定夹角。在本申请实施例中,手机100可通过一个或多个传感器(例如陀螺仪和加速度传感器)检测到的数据计算主屏和副屏之间的夹角。可以理解的是,主屏11和副屏12之间的夹角β在0至180°构成的闭区间内。当主屏11和副屏12之间的夹角β大于第一阈值(例如β为170°)时,手机100可确定柔性屏幕301处于展开状态,如图2中的(a)。当主屏11和副屏12之间的夹角β在一个预设区间内(例如,β在40°至60°之间)时,手机100可确定柔性屏幕301处于支架状态,如图2中的(b)。又或者,当主屏11和副屏12之间的夹角β小于第二阈值(例如β为20°)时,手机100可确定柔性屏幕301处于折叠状态,如图2中的(c)。在另一些屏幕状态只划分为两个状态的实施例中,主屏11和副屏12之间的夹角β在大于第三阈值(例如,45°或60°)时,手机100可确定柔性屏幕301处于展开状态;当主屏11和副屏12之间的夹角β小于该第三阈值时,手机100可确定柔性屏幕301处于折叠状态。
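示例性的，上述根据主屏与副屏夹角β划分物理形态的判断逻辑，可用如下Python示意代码表达（其中170°、40°~60°、20°等阈值取自上文示例，均为可配置的假设值；夹角落在各阈值之间的未定义区间时的处理方式文中未作规定，此处的"中间状态"仅为占位）：

```python
def classify_screen_state(angle_deg: float,
                          unfold_threshold: float = 170.0,
                          stand_range: tuple = (40.0, 60.0),
                          fold_threshold: float = 20.0) -> str:
    """根据主屏与副屏之间的夹角β(0~180°闭区间)判断柔性屏幕的物理形态。

    阈值取自上文示例(第一阈值170°、支架区间40°~60°、第二阈值20°)，
    实际产品中这些预设值均可配置。
    """
    if angle_deg > unfold_threshold:
        return "展开状态"
    if stand_range[0] <= angle_deg <= stand_range[1]:
        return "支架状态"
    if angle_deg < fold_threshold:
        return "折叠状态"
    return "中间状态"  # 文中未明确规定的过渡区间，此处仅作占位
```

在屏幕状态只划分为两个状态的实施例中，则只需用单一的第三阈值比较一次即可。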
需要说明的是,用户沿折叠线AB折叠柔性屏幕301后,主屏与副屏可以相对设置,或者,主屏与副屏也可以互相背离。如图2中的(c)所示,用户折叠柔性屏幕301后,主屏与副屏互相背离,此时主屏与副屏均暴露在外部环境中,用户可以使用主屏进行显示,也可以使用副屏进行显示。
在一些实施例中,如图2中的(c)所示,用户折叠柔性屏幕301后,弯折部分的屏幕(也可称为侧屏)也可作为独立的显示区域,此时,柔性屏幕301被划分为主屏、副屏以及侧屏三个独立的显示区域。
在本申请实施例中，当手机100的柔性屏幕301处于亮屏状态时，例如主屏11处于亮屏状态，副屏12处于黑屏(或息屏)状态，或者，主屏11处于黑屏状态，副屏12处于亮屏状态，又或者，主屏11和副屏12均处于亮屏状态，手机100可根据柔性屏幕301的物理形态、显示状态以及用户在主屏11和副屏12上的触控操作，确定是否触发屏幕截图、控件多选、图标移动、控件删除的响应等。
上述传感器模块180可以包括陀螺仪,加速度传感器,压力传感器,气压传感器,磁传感器(例如霍尔传感器),距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,热释电红外传感器,环境光传感器或骨传导传感器等一项或多项,本申请实施例对此不做任何限制。
手机100可以通过ISP,摄像头193,视频编解码器,GPU,柔性屏幕301以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,手机100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当手机100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。手机100可以支持一种或多种视频编解码器。这样,手机100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现手机100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行手机100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储手机100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
手机100可以通过音频模块170，扬声器170A，受话器170B，麦克风170C，耳机接口170D，以及应用处理器等实现音频功能。例如音乐播放，录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。手机100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当手机100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。手机100可以设置至少一个麦克风170C。在另一些实施例中,手机100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,手机100还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。手机100可以接收按键输入,产生与手机100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于柔性屏幕301不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和手机100的接触和分离。手机100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。手机100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,手机100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在手机100中,不能和手机100分离。
上述手机100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明手机100的软件结构。
图3a是本申请实施例的手机100的软件结构框图。
分层架构将软件分成若干个层，每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中，将Android系统分为四层，从上至下分别为应用程序层，应用程序框架层，安卓运行时(Android runtime)和系统库，以及内核层。
应用程序层可以包括一系列应用程序包。如图3a所示,应用程序层内可以安装相机,图库,日历,通话,地图,导航,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。如图3a所示,应用程序框架层可以包括输入管理服务(input manager service,IMS)。当然,应用程序框架层中还可以包括显示策略服务、电源管理服务(power manager service,PMS)、显示管理服务(display manager service,DMS)、活动管理器、窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等,本申请实施例对此不作任何限制。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:状态监测服务,表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。状态监测服务用于根据内核层上报的监测数据确定手机的具体朝向、柔性屏幕的物理状态等。表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,传感器驱动,TP驱动,摄像头驱动,音频驱动等,本申请实施例对此不做任何限制。
仍如图3a所示,应用程序框架层以下的系统库和内核层等可称为底层系统,底层系统中包括用于提供显示服务的底层显示系统,例如,底层显示系统包括内核层中的显示驱动以及系统库中的表面管理器等。并且,本申请中的底层系统还包括用于识别柔性屏幕物理形态变化的状态监测服务,该状态监测服务可独立设置在底层显示系统内,也可设置在系统库和/或内核层内。
示例性的,状态监测服务可调用传感器服务(sensor service)启动陀螺仪、加速度传感器等传感器进行检测。状态监测服务可根据各个传感器上报的检测数据计算当前主屏和副屏之间的夹角。这样,通过主屏和副屏之间的夹角,状态监测服务可确定出柔性屏幕处于展开状态、折叠状态或支架状态等物理形态。并且,状态监测服务可将确定出的物理形态上报给上述输入管理服务。
在一些实施例中，当状态监测服务确定出当前手机处于折叠状态或支架状态时，状态监测服务还可以启动摄像头、红外传感器、接近光传感器或触控器件(touch panel,TP)等传感器识别手机的具体朝向。例如，手机的具体朝向可以包括主屏朝向用户或副屏朝向用户等。应理解，本文所涉及的主屏或副屏或整个柔性屏幕朝向用户包括：主屏或副屏或整个柔性屏幕与用户面部之间以基本平行的角度朝向用户，也包括主屏或副屏或整个柔性屏幕以一定的倾斜角度朝向用户。
示例性的,输入管理服务可用于在接收到在屏幕上的触控事件时,从底层系统中获取当前手机柔性屏幕的物理形态、显示状态、屏幕触控数据、手机的具体朝向等。其中屏幕触控数据包括触控点在主屏或者副屏上的坐标位置、触控时间以及压力值等。进而,输入管理服务可根据柔性屏幕的物理形态、显示状态以及屏幕触控数据确定是否触发自定义的双面屏手势事件,例如双面屏屏幕截图、控件多选、图标移动、控件删除等事件。当输入管理服务确定触发上述任一双面屏手势事件时,输入管理服务可调用系统接口向上层应用上报双面屏手势事件,以使上层应用完成相应的操作。
需要指出的是,上述的上层应用是指用户预设的可响应于双面屏手势事件的注册应用。具体的,上层应用可以是手机系统自带的应用,例如相册应用、通讯录应用、浏览器应用等,还可以是任意第三方应用,对此本申请不作限定。用户可以根据自身需求对上述任一应用设置双面屏手势触发屏幕截图、控件多选、图标移动或者控件删除等功能。
与图3a类似,如图3b所示,为安卓操作系统内部的数据流向示意图。示例性的,硬件层的陀螺仪和加速度传感器可将检测到的数据上报给传感器驱动,传感器驱动通过传感器服务将陀螺仪和加速度传感器检测到的数据上报状态监测服务,状态监测服务可根据陀螺仪和加速度传感器检测到的数据确定主屏和副屏之间的夹角,进而确定出柔性屏幕的物理形态。硬件层的触控器件可通过TP驱动将检测到的屏幕触控数据上报给状态监测服务,硬件层的相机可通过相机驱动将检测到的数据上报给状态监测服务,硬件层的红外传感器可通过红外驱动将检测到的数据上报给状态监测服务。状态监测服务可根据触控器件、相机或红外传感器上报的数据确定出手机的具体朝向。另外,状态监测服务还可通过显示驱动获取柔性屏幕的显示参数,从而确定柔性屏幕的显示状态。
状态监测服务可将确定出的手机柔性屏幕的物理形态、显示状态、手机的具体朝向,以及屏幕触控数据上报给输入管理服务,最终由输入管理服务判断是否触发自定义的双面屏手势事件。
基于上述手机100的硬件和软件结构,下面以具体地实施例对本申请提供的具有柔性屏幕的电子设备的控制方法进行详细说明。
首先,对下面几个实施例中均涉及的双面屏手势操作进行如下说明。
下面几个实施例中,手机的柔性屏幕均处于折叠状态。示例性的,用户使用双面屏手势操作时的握持状态可参见图4。如图4所示,用户单手(左手或者右手)握持处于折叠状态的手机,用户可通过单手的拇指与食指相配合实现双面屏手势操作。示例性的,用户在主屏11上使用右手拇指触发第一操作,在副屏12上使用右手食指触发第二操作,第一操作和第二操作的操作时间差小于预设时间差,且第一操作的触控位置与第二操作的触摸位置对应时,手机可确定该手势操作为双面屏手势操作。
需要说明的是,用户在屏幕(主屏或者副屏)上的触发操作的操作时间是指用户的手指或者手掌接触到屏幕的起始时刻。相应的,上述第一操作和第二操作的操作时间差是指用户的手指或手掌接触到主屏的起始时刻,与用户的手指或手掌接触到副屏的起始时刻的时间差。示例性的,预设时间差可以设置为500ms及以下。
需要说明的是，第一操作的触控位置与第二操作的触摸位置对应可以理解为第一操作和第二操作的触摸位置的位置差小于预设位置差。具体可以通过如下任一方式进行判断：
在一种实现方式中,手机可根据主屏和副屏的相对位置关系,确定主屏坐标系和副屏坐标系的坐标原点的位置关系,根据坐标原点的位置关系,将第二操作在副屏上的触控点的坐标位置转换至主屏坐标系中,计算第一操作在主屏上的触控点与第二操作转换至主屏上的触控点之间的位置差,判断该位置差是否小于预设位置差。示例性的,预设位置差可以设置为10dp。图5示出了处于折叠状态下的柔性屏幕,手机检测到用户在主屏11上的第一操作以及在副屏12上的第二操作,第一操作在主屏11上的触控点为触控点1,第二操作在副屏12上的触控点为触控点2,手机可根据主屏11和副屏12的相对位置关系,将副屏12上的触控点2转换至主屏11上,触控点2对应触控点2'。手机可根据主屏11上的触控点1与触控点2'的位置确定位置差r,进而判断该位置差r是否小于预设位置差,如果位置差r小于预设位置差,手机可确定第一操作和第二操作的触摸位置对应。
在一种实现方式中,手机可根据主屏和副屏的相对位置关系,确定主屏坐标系和副屏坐标系的坐标原点的位置关系,根据坐标原点的位置关系,将第一操作在主屏上的触控点的坐标位置转换至副屏坐标系中,计算第一操作转换至副屏上的触控点与第二操作在副屏上的触控点之间的位置差,判断该位置差是否小于预设位置差。
在一种实现方式中,手机还可以将第一操作在主屏上的触控点的坐标位置以及第二操作在副屏上的触控点的坐标位置转换至指定坐标系,该指定坐标系是区别于主屏坐标系和副屏坐标系的坐标系,手机通过计算第一操作转换至指定坐标系的坐标位置以及第二操作转换至指定坐标系的坐标位置的位置差,判断该位置差是否小于预设位置差。
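示例性的，上述双面屏手势的判定（操作时间差小于预设时间差，且将副屏触控点换算到主屏坐标系后位置差小于预设位置差）可用如下Python代码示意。其中假设折叠后主、副屏宽度相同且x轴镜像对称，该坐标换算关系仅为举例，实际应取决于两屏坐标原点的位置关系；500ms与10dp取自上文示例值：

```python
import math


def to_main_coords(sub_point, screen_width):
    """将副屏坐标系下的触控点换算到主屏坐标系。

    此处假设折叠后主、副屏背对背且宽度相同，副屏x轴与主屏x轴镜像对称；
    实际换算关系取决于主屏坐标系与副屏坐标系坐标原点的位置关系。
    """
    x, y = sub_point
    return (screen_width - x, y)


def is_double_sided_gesture(t1_ms, p1, t2_ms, p2, screen_width,
                            max_dt_ms=500, max_dist_dp=10):
    """判断主屏上的第一操作(t1_ms, p1)与副屏上的第二操作(t2_ms, p2)
    是否构成双面屏手势：操作起始时刻之差小于预设时间差，
    且触摸位置对应（换算后的位置差小于预设位置差）。"""
    if abs(t1_ms - t2_ms) >= max_dt_ms:
        return False
    x2, y2 = to_main_coords(p2, screen_width)
    dist = math.hypot(p1[0] - x2, p1[1] - y2)  # 对应图5中的位置差r
    return dist < max_dist_dp
```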
实施例一
以下将以手机为电子设备举例,结合附图详细阐述本申请实施例提供的一种双面屏截图操作的控制方法。
在本实施例中,当手机的柔性屏幕处于折叠状态,且柔性屏幕处于亮屏状态时,手机检测到用于触发屏幕截图的双面屏手势操作,该双面屏手势操作包括对主屏的第一操作和对副屏的第二操作,其中第一操作和第二操作的操作时间差小于预设时间差,第一操作的触摸位置与第二操作的触摸位置对应。响应于该双面屏手势操作,手机对处于亮屏状态的柔性屏幕进行屏幕截图。
上述柔性屏幕处于亮屏状态包括以下三种情况:主屏处于亮屏状态,副屏处于黑屏状态;主屏处于黑屏状态,副屏处于亮屏状态;主屏和副屏均处于亮屏状态。
具体的,如图6所示,当手机的柔性屏幕处于折叠状态,且柔性屏幕处于亮屏状态时,手机在检测到对主屏的第一操作和对副屏的第二操作之后,判断主屏和副屏上的触控操作是否至少满足以下两个条件:
第一操作和第二操作的操作时间差小于预设时间差;
第一操作和第二操作的触摸位置对应。
如果上述两个条件都满足,则手机触发对相应屏幕的屏幕截图。
可选的，在一些实施例中，主屏和副屏上的触控操作还应满足：第一操作和第二操作的操作类型相同，如图6所示。也就是说，手机需要判断上述三个条件都满足时，才触发屏幕截图，否则不触发屏幕截图。需要说明的是，判断上述三个条件的执行顺序可以是同时执行，也可以是依次执行，依次执行的执行顺序并不限于上述的执行顺序，本实施例对上述三个条件的执行顺序不作任何限定。
示例性的,操作类型包括以下任意一种:按压操作,点击操作,双击操作,长按操作。
上述的按压操作也称为压感操作,应满足以下条件:用户在屏幕上的触控操作的操作时长大于或者等于预设时长,触控操作的触控点的坐标位置未发生变化,触控操作的按压值大于或者等于预设压力值。上述的点击操作应满足:用户在屏幕上的触控操作的操作时长小于预设时长。上述的双击操作应满足:用户在屏幕上触发两次点击操作,两次点击操作的操作时间间隔小于预设时间间隔。上述的长按操作应满足以下条件:用户在屏幕上的触控操作的操作时长大于或者等于预设时长,触控操作的按压值小于预设压力值。
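示例性的，上述各操作类型的判定条件可用如下Python示意代码表达（其中500ms的预设时长、0.5的预设压力值、300ms的预设时间间隔均为假设取值）：

```python
def classify_touch(duration_ms, pressure, moved,
                   long_press_ms=500, pressure_threshold=0.5):
    """按上文定义对单次触控操作分类（阈值均为假设的预设值）：
    - 按压操作：操作时长≥预设时长，触控点坐标未变化，按压值≥预设压力值
    - 长按操作：操作时长≥预设时长，按压值<预设压力值
    - 点击操作：操作时长<预设时长
    """
    if duration_ms >= long_press_ms and not moved and pressure >= pressure_threshold:
        return "按压操作"
    if duration_ms >= long_press_ms and pressure < pressure_threshold:
        return "长按操作"
    if duration_ms < long_press_ms:
        return "点击操作"
    return "未识别"  # 如时长达标但触控点移动且压力达标等文中未定义的情形


def is_double_click(t1_ms, t2_ms, max_interval_ms=300):
    """双击操作：两次点击操作的操作时间间隔小于预设时间间隔。"""
    return abs(t2_ms - t1_ms) < max_interval_ms
```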
在一些实施例中,手机在确定触发屏幕截图时,可进一步判断用户在触发双面屏手势操作之前的柔性屏幕的亮屏状态,对亮屏状态的一个屏幕或者整个屏幕进行屏幕截图。如图6所示,当主屏处于亮屏状态,副屏处于黑屏状态时,手机可响应于双面屏手势操作,对主屏进行屏幕截图;当主屏处于黑屏状态,副屏处于亮屏状态时,手机可响应于双面屏手势操作,对副屏进行屏幕截图;当主屏和副屏均处于亮屏状态时,手机可响应于双面屏手势操作,对整个柔性屏幕进行屏幕截图。
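示例性的，上述根据触发双面屏手势操作之前柔性屏幕的亮屏状态确定截图对象的逻辑可概括为：

```python
def screenshot_target(main_on: bool, sub_on: bool):
    """双面屏手势触发截图时，根据手势之前主屏/副屏的显示状态决定截图对象。"""
    if main_on and sub_on:
        return "整个柔性屏幕"
    if main_on:
        return "主屏"
    if sub_on:
        return "副屏"
    return None  # 两屏均黑屏不满足"柔性屏幕处于亮屏状态"的前提，不触发截图
```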
基于上述方案,用户无需在手机屏幕上调取通知栏或者操作控件(如虚拟按钮),再点击相应截图控件进行屏幕内容截取,用户可以直接使用上述双面屏手势操作,对柔性屏幕进行屏幕截图。手机可智能识别柔性屏幕当前的物理形态(折叠状态或者展开状态)、显示状态(亮屏状态或者黑屏状态)以及双面屏手势操作,进行相应的屏幕截图,为用户提供更好的使用体验。
下面结合附图7和图8对双面屏屏幕截图的使用场景进行详细说明。
在第一种场景中,手机的柔性屏幕处于折叠状态,且主屏处于亮屏状态、副屏处于黑屏状态,用户可以通过双面屏手势操作对处于亮屏状态的主屏进行屏幕截图。
示例性的,如图7所示,手机包括柔性屏幕,该柔性屏幕可被划分为主屏11和副屏12,柔性屏幕当前处于折叠状态。其中主屏11面向用户且处于亮屏状态,主屏11当前显示手机的主界面,主界面上显示有6个应用程序图标,副屏12背向用户且处于黑屏状态。基于此,如果柔性屏幕上的触控器件检测到用户在主屏上的第一操作以及副屏上的第二操作,则触控器件向应用程序框架层中的状态监测服务上报触控事件。状态监测服务接收到触控事件后,可调用传感器服务,启动与主屏对应的传感器以及与副屏对应的传感器进行检测,状态监测服务可根据检测数据确定当前柔性屏幕的物理形态为折叠状态。与此同时,状态监测服务接收到触控事件后,还可通过显示驱动获取主屏和副屏的显示参数,确定在发生触控事件之前柔性屏幕的显示状态,例如图7所示的主屏11处于亮屏状态,副屏12处于黑屏状态。另外,状态监测服务可通过触控器件获取触控事件对应的触控数据,触控数据包括主屏和副屏上触控点的具体位置、操作时间(操作的起始时刻)、操作时长、按压力度等。综上,状态监测服务可向应用程序框架层中的输入管理服务发送当前柔性屏幕的物理形态为折叠状态,触控事件之前的柔性屏幕的显示状态为主屏处于亮屏状态、副屏处于黑屏状态,以及主屏和副屏上的触控数据。
进而，输入管理服务可根据柔性屏幕的物理形态、显示状态以及触控数据确定是否进行相应的屏幕截图。例如，当柔性屏幕处于折叠状态，主屏处于亮屏状态，副屏处于黑屏状态，主屏上的第一操作和副屏上的第二操作的操作时间差小于预设时间差、第一操作和第二操作的触摸位置对应、第一操作和第二操作的操作类型为按压操作时，输入管理服务可确定当前需要对主屏进行屏幕截图。随后，输入管理服务向上层应用发送对主屏进行截图的截图事件。如图7所示，上层应用对主屏当前显示的主界面进行屏幕截图，随后，主屏11显示截屏预览界面，截屏预览界面上包括可以对屏幕截图进行操作的处理控件，例如图7中所示的"保存"，"编辑"，"取消"等处理控件，用户可在主屏11上对主屏的屏幕截图进行保存、编辑或者删除操作。上述处理控件仅是一种举例，还可以包括其他处理控件，例如分享控件等，本申请实施例对此不做任何限定。
这样一来,当柔性屏幕处于折叠状态,主屏处于亮屏状态,副屏处于黑屏状态,如果手机检测到用户在主屏上的第一操作和在副屏上的第二操作,确定操作为双面屏手势操作时,手机可自动对主屏进行屏幕截图,方便用户在折叠的柔性屏幕上进行快速截图操作,提升用户使用折叠手机时的使用体验。
在第二种场景中,手机的柔性屏幕处于折叠状态,且主屏处于黑屏状态、副屏处于亮屏状态,用户可以通过双面屏手势操作对处于亮屏状态的副屏进行屏幕截图。其实现原理和技术效果与本实施例的第一种场景类似,此处不再赘述。
在第三种场景中,手机的柔性屏幕处于折叠状态,且主屏和副屏均处于亮屏状态,用户可以通过双面屏手势操作对处于亮屏状态的整个柔性屏幕进行屏幕截图。
示例性的,如图8所示,柔性屏幕当前处于折叠状态,柔性屏幕的主屏11面向用户且处于亮屏状态,主屏11当前显示手机短信列表界面,副屏12背向用户且处于亮屏状态,副屏12当前显示手机短信列表中与联系人Bob的聊天界面。与第一种场景不同的是,柔性屏幕的显示状态为两个屏同时亮,其实现过程与第一种场景类似,当柔性屏幕处于折叠状态,主屏和副屏处于亮屏状态,主屏上的第一操作和副屏上的第二操作的操作时间差小于预设时间差、第一操作和第二操作的触摸位置对应、第一操作和第二操作的操作类型为按压操作时,输入管理服务可确定当前需要对整个柔性屏幕进行屏幕截图。随后,输入管理服务向上层应用发送对整个屏幕进行屏幕截图的截图事件。如图8所示,上层应用对整个柔性屏幕的显示界面(主屏11和副屏12的显示界面)进行屏幕截图,与第一种场景类似,主屏11显示截屏预览界面,截屏预览界面上包括可以对屏幕截图进行操作的处理控件,用户可在主屏11上对整个柔性屏幕的屏幕截图进行保存、编辑或者删除操作。
综上所述,在手机对处于亮屏状态的柔性屏幕进行屏幕截图之后,手机可在处于亮屏状态的屏幕(主屏或者副屏)上显示截屏预览界面。示例性的,截屏预览界面可以包括以下至少一种处理控件:保存控件、编辑控件、分享控件或取消控件。
相应的,在显示截屏预览界面之后,手机在检测到用于触发对截屏预览界面上的一个处理控件选择的第三操作时,响应于第三操作可对屏幕截图执行选择的处理控件对应的执行处理,例如保存、编辑、分享或取消等操作。
在另一些实施例中,手机可根据当前柔性屏幕的物理形态、显示状态、触控数据以及手机的具体朝向,确定是否触发屏幕截图。具体的,在手机确定柔性屏幕处于折叠状态、主屏和副屏均处于亮屏状态,主屏上的第一操作和副屏上的第二操作为双面屏手势操作时,手机可进一步根据当前手机的具体朝向对相应的屏幕进行屏幕截图,例如当前面向用户的屏幕为主屏,即使整个柔性屏幕处于亮屏状态,手机仅对面向用户的主屏进行屏幕截图。
具体的，手机还可调用摄像头、红外传感器、接近光传感器或触控器件等传感器识别手机的具体朝向。例如，可以在手机的主屏和副屏上分别安装摄像头，如果主屏的摄像头捕捉到人脸信息，而副屏的摄像头没有捕捉到人脸信息，则状态监测服务可确定当前折叠状态的柔性屏幕处于主屏朝向用户的状态。又例如，可以在手机的主屏和副屏上分别安装红外传感器，如果副屏的红外传感器捕捉到人体辐射的红外信号，而主屏的红外传感器没有捕捉到人体辐射的红外信号，则状态监测服务可确定当前折叠状态的柔性屏幕处于副屏朝向用户的状态。又例如，手机还可以根据触控器件上报的用户在主屏和/或副屏的触摸位置，使用预设的抓握算法确定用户当前抓握手机的抓握姿势。那么，结合手机的抓握姿势，状态监测服务也可确定出手机的具体朝向。示例性的，触控器件检测到触控事件后，可将触摸点的坐标上报给状态监测服务。状态监测服务通过统计触摸点在触控器件中的位置和个数确定手机的抓握姿势。例如，如果检测到落在主屏中的触摸点的数目大于预设值，说明用户的手指和手掌抓握在主屏上，则此时朝向用户的屏幕为副屏。相应的，如果检测到落在副屏中的触摸点的数目大于预设值，说明用户的手指和手掌抓握在副屏上，则此时朝向用户的屏幕为主屏。
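示例性的，上述基于触摸点个数的抓握判断可用如下Python代码示意（其中预设值threshold为假设取值，且实际实现中还会结合摄像头、红外传感器等传感器综合判断）：

```python
def facing_screen_by_grip(main_touch_points, sub_touch_points, threshold=5):
    """通过统计落在主屏/副屏上的触摸点个数推断抓握姿势，进而推断朝向：
    手指和手掌抓握在哪个屏上，面向用户的即为另一个屏。"""
    if len(main_touch_points) > threshold:
        return "副屏"  # 抓握在主屏上，朝向用户的屏幕为副屏
    if len(sub_touch_points) > threshold:
        return "主屏"  # 抓握在副屏上，朝向用户的屏幕为主屏
    return None       # 触摸点不足，需结合摄像头、红外等其他传感器判断
```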
需要说明的是,手机可同时使用上述一个或多个传感器识别手机的具体朝向。
在上述实施例中,由于增加了手机的具体朝向的判断,即使折叠状态下手机的整个屏幕处于亮屏状态,手机在检测到双面屏手势操作后,只对朝向用户的单个屏幕(主屏或者副屏)进行屏幕截图。
在另一些实施例中，手机还可以响应于如下任一操作：用户双击电源键的操作、用户双击屏幕上的虚拟按键的操作、用户同时按压电源键和音量键的操作等，触发对相应屏幕的屏幕截图。手机在确定上述操作之前的柔性屏幕的显示状态后，对处于亮屏状态的屏幕(主屏、副屏或者整个柔性屏幕)的显示界面进行截图。需要说明的是，该实施例对柔性屏幕的物理形态不作限定，即柔性屏幕的物理形态可以是展开状态、折叠状态或者支架状态。
示例性的,表1示出了手机根据柔性屏幕的物理形态、显示状态、主屏和副屏上的触控数据以及手机的具体朝向确定是否触发屏幕截图的响应。
表1
（表1以图片形式给出，文字内容无法恢复，此处从略）
需要说明的是，表1中的"不触发屏幕截图"可以理解为手机不作响应(即手机不进行任何处理)，或者，手机按照单屏手势操作进行处理。例如，当确定手机柔性屏幕处于折叠状态，主屏亮屏、副屏黑屏，主屏上的第一操作和副屏上的第二操作的触摸位置的位置差小于预设位置差，且均为按压操作时，如果第一操作和第二操作的操作时间差大于或者等于预设时间差，手机可以分别根据用户在主屏和副屏上的操作进行相应的事件处理，例如用户按压主屏主界面上的应用图标，手机可响应于主屏的按压操作，显示该应用图标的快捷功能；用户按压处于黑屏状态的副屏，手机可响应于副屏的按压操作，唤醒副屏屏幕。
需要说明的是,本申请实施例中手机的主屏和/或副屏处于亮屏状态时,屏幕的显示界面可以是锁屏界面、某一应用(例如微信、相册等)中的具体界面、主界面等,本申请实施例对此不作任何限制。
实施例二
以下将以手机为电子设备举例,结合附图详细阐述本申请实施例提供的一种双面屏多选操作的控制方法。
在本实施例中,当手机的柔性屏幕处于折叠状态,且柔性屏幕的主屏的显示界面显示至少两个可选控件时,手机检测到用于触发多选的双面屏手势操作,双面屏手势操作包括对主屏的第一操作和对副屏的第二操作,第一操作和第二操作的操作时间差小于预设时间差,第一操作的触摸位置与第二操作的触摸位置对应;响应于双面屏手势操作,手机控制主屏的显示界面进入多选模式。其中,在多选模式中,主屏的显示界面上的至少两个可选控件能够都被选中。
需要说明的是,上述可选控件是指屏幕的显示界面上可以被选中和处理的控件。例如可选控件可以是相册中的图片浏览界面上的图片控件,用户可以同时选中多个图片控件,对多个图片进行拼接处理(即图片编辑),或者,将多个图片批量发送给朋友(即图片共享),或者,将多个图片添加至新的相册文件夹中(即图片整理)。上述多选模式是指用户可以在屏幕的显示界面上选择至少两个可选控件,并进行批量处理的操作模式。
具体的,如图9所示,当手机的柔性屏幕处于折叠状态,且柔性屏幕的主屏的显示界面显示至少两个可选控件时,手机在检测到对主屏的第一操作和对副屏的第二操作之后,可以根据第一操作和第二操作在屏幕上的触摸位置和操作时间,判断主屏和副屏上的触控操作是否至少满足以下两个条件:
第一操作和第二操作的操作时间差小于预设时间差;
第一操作和第二操作的触摸位置对应。
如果上述两个条件都满足,则手机控制主屏的显示界面进入多选模式。
可选的,在一些实施例中,主屏和副屏上的触控操作还应满足:第一操作和第二操作的操作类型相同,如图9所示。也就是说,手机需要判断上述三个条件都满足时,才触发主屏的显示界面进入多选模式,否则主屏的显示界面不进入多选模式。
需要说明的是,判断上述三个条件的执行顺序可以是同时执行,也可以是依次执行,依次执行的执行顺序并不限于上述的执行顺序,本实施例对上述三个条件的执行顺序不作任何限定。
示例性的,操作类型包括以下任意一种:按压操作,点击操作,双击操作,长按操作。关于各个操作类型的定义和判断,可参见上文,此处不再赘述。
在一些实施例中,手机在确定触发主屏的显示界面进入多选模式时,可进一步判断用户在主屏上的第一操作的触摸位置是否对应第一可选控件,如图9所示,若第一操作的触摸位置对应第一可选控件,第一可选控件为主屏上显示的其中一个可选控件,响应于双面屏手势操作,手机可控制主屏的显示界面进入多选模式,并将该第一可选控件显示为选中。用户可以通过上述方式,在触发主屏的显示界面进入多选模式的同时,直接选中主屏上的一个可选控件。
在本实施例中,上述柔性屏幕的主屏的显示界面显示至少两个可选控件,包括以下几种情况:第一种情况下,柔性屏幕的主屏的显示界面显示至少两个可选控件(即主屏处于亮屏状态),且柔性屏幕的副屏处于黑屏状态。第二种情况下,柔性屏幕的主屏的显示界面显示至少两个可选控件,且柔性屏幕的副屏处于亮屏状态,但副屏的显示界面没有可选控件。第三种情况下,柔性屏幕的主屏和副屏的显示界面上均显示至少两个可选控件。
上述前两种情况下,折叠态手机检测到用于触发多选的双面屏手势操作,响应于双面屏手势操作,手机可控制主屏的显示界面进入多选模式。
上述第三种情况下,折叠态手机检测到用于触发多选的双面屏手势操作,响应于双面屏手势操作,手机可控制面向用户的屏幕的显示界面进入多选模式,其中,面向用户的屏幕为主屏或者副屏。也就是说,折叠态手机的两个屏的显示界面均有可选控件时,手机可根据当前手机的具体朝向确定哪个屏幕的显示界面进入多选模式,若面向用户的屏幕为主屏,则手机控制主屏的显示界面进入多选模式,若面向用户的屏幕为副屏,则手机控制副屏的显示界面进入多选模式。
在一些实施例中,响应于双面屏手势操作,手机控制主屏的显示界面进入多选模式,包括:响应于双面屏手势操作,若主屏为面向用户的屏幕,手机控制主屏的显示界面进入多选模式。该实现方式,手机还需要检测当前手机的具体朝向,进一步根据手机的具体朝向确定是否控制主屏的显示界面进入多选模式。也就是说,如果手机待控制的屏幕背向用户,则手机不触发相应的控制,从而避免用户的误操作。
示例性的,若手机在折叠状态下,主屏的显示界面上显示至少两个可选控件,副屏处于黑屏状态,或者副屏处于亮屏状态但显示界面上没有可选控件,此时手机检测到双面屏手势操作,还需确认当前手机的具体朝向,如果面向用户的屏幕不是显示可选控件的主屏,手机将不会触发主屏的显示界面进入多选模式。
在一些实施例中,响应于双面屏手势操作,手机控制主屏的显示界面进入多选模式之后,还包括:手机检测到用于触发对主屏的显示界面上的可选控件选择的第三操作,第三操作的操作类型为点击操作或滑动操作;响应于第三操作,手机将第三操作对应的一个或多个控件显示为选中。用户可以通过该实现方式,增加一个或多个可选控件,完成多选操作。
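示例性的，多选模式的进入与可选控件的逐个选中可用如下极简的Python状态模型示意（类名与方法名均为本文举例所设，并非实际系统接口）：

```python
class MultiSelectScreen:
    """多选模式的极简状态模型：双面屏手势使显示界面进入多选模式，
    后续的第三操作（点击）将对应的可选控件加入选中集合。"""

    def __init__(self, controls):
        self.controls = list(controls)  # 显示界面上的可选控件
        self.multi_select = False
        self.selected = set()

    def on_double_sided_gesture(self, touched_control=None):
        """双面屏手势触发进入多选模式；若第一操作的触摸位置
        正好对应某个可选控件，则同时将其显示为选中。"""
        self.multi_select = True
        if touched_control in self.controls:
            self.selected.add(touched_control)

    def on_tap(self, control):
        """多选模式下的第三操作：增选一个可选控件。"""
        if self.multi_select and control in self.controls:
            self.selected.add(control)
```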
基于上述方案,在用户需要对折叠的柔性屏幕的显示界面上的至少两个可选控件进行批量处理时,手机可根据检测到的双面屏手势操作,控制手机的柔性屏幕的显示界面快速进入多选模式。这样,用户无需通过点击指定位置上的“选择”控件(例如屏幕左上角)进入多选模式。上述双面屏手势操作更加方便、快捷,为用户提供更好的使用体验。
下面结合附图10和图11对双面屏多选操作的使用场景进行详细说明。
在第一种场景中,手机的柔性屏幕处于折叠状态,主屏的显示界面上显示至少两个可选控件(主屏处于亮屏状态)、且副屏处于黑屏状态下,用户可以通过双面屏手势操作触发主屏的显示界面进入多选模式,从而完成多选操作。
示例性的,如图10所示,手机的柔性屏幕被划分为主屏11和副屏12,柔性屏幕当前处于折叠状态。其中主屏11面向用户且处于亮屏状态,主屏11当前的显示界面为图片浏览界面,该界面上显示有9个图片控件,副屏12背向用户且处于黑屏状态。基于此,如果柔性屏幕上的触控器件检测到用户在主屏上的第一操作以及副屏上的第二操作,与上述实施例类似,状态监测服务在获取到当前柔性屏幕的物理形态、显示状态以及主屏和副屏上的触控数据后,可向应用程序框架层中的输入管理服务发送当前柔性屏幕的物理形态、显示状态以及主屏和副屏上的触控数据。
进而，输入管理服务可根据柔性屏幕的物理形态、显示状态以及触控数据确定是否触发主屏的显示界面进入多选模式。例如，当柔性屏幕处于折叠状态，主屏的显示界面上显示有至少两个可选控件，副屏处于黑屏状态，主屏上的第一操作和副屏上的第二操作的操作时间差小于预设时间差、第一操作和第二操作的触摸位置对应、第一操作和第二操作的操作类型为点击操作时，输入管理服务可确定当前主屏的显示界面需要进入多选模式。随后，输入管理服务向上层应用发送用于指示主屏的显示界面进入多选模式的事件。在一些实施例中，输入管理服务还可以进一步判断主屏上的第一操作是否对应主屏的显示界面上的可选控件，如果第一操作对应一个可选控件，如图10所示，第一操作对应图片浏览界面上第二行第一列的图片控件，上层应用根据输入管理服务的事件指示，触发主屏的显示界面进入多选模式，且勾选用户在主屏上选中的图片。
这样一来,当柔性屏幕处于折叠状态,主屏的显示界面上显示至少两个可选控件(主屏处于亮屏状态)、且副屏处于黑屏状态下,如果手机检测到用户在主屏上的第一操作和在副屏上的第二操作,确定操作为双面屏手势操作时,手机主屏的显示界面可快速进入多选模式,如果用户在主屏上的第一操作还对应一可选控件,手机将同时勾选用户在主屏上选中的可选控件,方便用户在折叠的柔性屏幕上快速进行多选操作,提升用户使用折叠手机时的使用体验。
在第二种场景中,手机的柔性屏幕处于折叠状态,副屏的显示界面上显示至少两个可选控件(副屏处于亮屏状态)、且主屏处于黑屏状态下,用户可以通过双面屏手势操作对处于亮屏状态的副屏上的可选控件进行多选操作。其实现原理和技术效果与本实施例的第一种场景类似,此处不再赘述。
在第三种场景中,手机的柔性屏幕处于折叠状态,主屏和副屏的显示界面上均显示有至少两个可选控件,手机响应于双面屏手势操作,控制面向用户的屏幕的显示界面进入多选模式,用户可在多选模式下,进行多选操作。
示例性的，如图11所示，柔性屏幕当前处于折叠状态，柔性屏幕的主屏11面向用户且主屏11当前显示主界面，主界面上有6个应用图标控件，副屏12背向用户且副屏12当前显示相册浏览界面，相册浏览界面上有12个图片控件。基于此，状态监测服务在接收到用户在主屏和副屏上的触控操作后，除了向应用程序框架层中的输入管理服务发送当前柔性屏幕的物理形态、显示状态、触控数据之外，还发送手机的具体朝向。关于状态监测服务如何确定手机的具体朝向可参见上述实施例。进而，输入管理服务可根据柔性屏幕的物理形态、显示状态、触控数据以及手机的具体朝向，确定哪个屏幕的显示界面进入多选模式。如图11所示，当前面向用户的屏幕为主屏11，手机控制主屏11的显示界面进入多选模式，由于在主屏11上的第一操作对应应用图标4，则主屏11的显示界面进入多选模式的同时，应用图标4同时被选中，此时背向用户的副屏12不作响应(显示界面无变化)。
示例性的,表2示出了手机根据柔性屏幕的物理形态、显示状态、主屏和副屏上的触控数据以及手机的具体朝向确定是否触发界面进入多选模式的响应。
表2
（表2以图片形式给出，文字内容无法恢复，此处从略）
需要说明的是，不论是表2中的"主屏的显示界面进入多选模式"，还是"副屏的显示界面进入多选模式"，都不要求用户在主屏和副屏上的触控操作对应界面上的可选控件(即触控操作可以在显示界面的空白位置)。如果主屏或副屏上的触控操作对应界面上的某一可选控件，则主屏或副屏的显示界面进入多选模式的同时，该可选控件可直接显示为选中。
需要说明的是，本申请实施例中触发屏幕的显示界面进入多选模式的前提是：屏幕(主屏和/或副屏)的显示界面上显示有至少两个可选控件。其中屏幕的显示界面可以是某一应用的具体界面，例如相册应用的图片浏览界面，图片浏览界面显示有多个图片控件，还可以是手机的主界面，主界面上显示有多个应用图标控件。
实施例三
以下将以手机为电子设备举例,结合附图详细阐述本申请实施例提供的一种双面屏图标移动操作的控制方法。
在本实施例中,当手机的柔性屏幕处于折叠状态,柔性屏幕的主屏和副屏的显示界面均为可容纳图标控件的界面(主屏和副屏均处于亮屏状态),且主屏的显示界面上显示有图标控件时,手机检测到用于触发图标移动的双面屏手势操作,该双面屏手势操作包括对主屏的第一操作和对副屏的第二操作,其中第一操作和第二操作的操作时间差小于预设时间差,第一操作的触摸位置与第二操作的触摸位置对应,第一操作对应主屏上的第一图标控件。响应于该双面屏手势操作,手机将第一图标控件从主屏移动至副屏。
需要说明的是,上述图标控件是可以被移动的图标控件,显示界面上的图标控件可以按照九宫格、十六宫格等形式排布。具体的,上述图标控件可以是手机主界面上的应用程序的图标控件,也可以是某一应用中的图标控件,对此本申请不做限定。
具体的,如图12所示,当手机的柔性屏幕处于折叠状态,柔性屏幕的主屏和副屏的显示界面均为可容纳图标控件的界面,且主屏的显示界面上显示有图标控件时,手机在检测到对主屏的第一操作和对副屏的第二操作之后,可以根据第一操作和第二操作在屏幕上的触摸位置和操作时间,判断主屏和副屏上的触控操作是否至少满足以下三个条件:
第一操作和第二操作的操作时间差小于预设时间差;
第一操作和第二操作的触摸位置对应;
第一操作对应主屏上的第一图标控件。
如果上述三个条件都满足,则手机将第一图标控件从主屏移动至副屏。
可选的,在一些实施例中,主屏和副屏上的触控操作还应满足:第一操作和第二操作的操作类型相同,如图12所示。也就是说,手机需要判断上述四个条件都满足时,才触发将第一图标控件从主屏移动至副屏。需要说明的是,判断上述四个条件的执行顺序可以是同时执行,也可以是依次执行,依次执行的执行顺序并不限于上述的执行顺序,本实施例对上述四个条件的执行顺序不作任何限定。
示例性的,操作类型包括以下任意一种:按压操作,点击操作,双击操作,长按操作。关于各个操作类型的定义和判断,可参见上文,此处不再赘述。
在一些实施例中，响应于双面屏手势操作，手机将第一图标控件从主屏移动至副屏，包括：响应于双面屏手势操作，若主屏为面向用户的屏幕，手机将第一图标控件从主屏移动至副屏。该实现方式，手机还需要检测当前手机的具体朝向，进一步根据手机的具体朝向确定是否移动第一图标控件。也就是说，如果手机确定待移动的第一图标控件所在的屏幕背向用户，则手机不触发图标移动，从而避免用户的误操作。
需要说明的是,柔性屏幕的主屏和副屏的显示界面均为可容纳图标控件的界面,上述柔性屏幕的主屏的显示界面上显示有图标控件,包括以下两种情况:第一种情况下,柔性屏幕的主屏的显示界面上显示有图标控件(即主屏处于亮屏状态),且柔性屏幕的副屏的显示界面为空屏(即没有图标控件)。此时,手机检测到用于触发图标移动的双面屏手势操作,响应于该双面屏手势操作,手机可将第一图标控件从主屏移动至副屏。第二种情况下,柔性屏幕的主屏的显示界面上显示有图标控件,且柔性屏幕的副屏的显示界面上显示有图标控件。此时,手机检测到用于触发图标移动的双面屏手势操作,如果只有主屏上的第一操作对应第一图标控件,则手机将第一图标控件从主屏移动至副屏;如果主屏上的第一操作对应第一图标控件,且副屏上的第二操作对应第二图标控件,手机可控制面向用户的屏幕上的图标控件移动至背向用户的屏幕,也就是说,手机需要根据当前手机的具体朝向确定哪个屏幕上的图标控件进行跨屏幕移动,若面向用户的屏幕为主屏,则手机将第一图标控件从主屏移动至副屏,若面向用户的屏幕为副屏,则手机将第二图标控件从副屏移动至主屏。
在一些实施例中,当主屏和副屏的显示界面均为可容纳图标控件的界面,主屏的显示界面上显示至少一个图标控件,且副屏的显示界面上显示至少一个图标控件时,响应于双面屏手势操作,手机将第一图标控件从主屏移动至副屏上的至少一个图标控件的前面,并将副屏上的至少一个图标控件顺序后移;或者,响应于双面屏手势操作,手机将第一图标控件从主屏移动至副屏上的至少一个图标控件的后面。
该实现方式限定了第一图标控件移动至副屏的位置,如果副屏上原本就有多个图标控件,可以将第一图标控件移动至多个图标控件的最前面或者最后面。例如,副屏上原本有顺序排列的两个图标控件A和B,A位于B的前面,手机可以将主屏上的图标控件C移动至A的位置,A和B顺序后移,或者,C移动至B的后面,A和B在副屏的位置不变。
在一些实施例中,手机将第一图标控件从主屏移动至副屏,包括:手机将第一图标控件从主屏移动至副屏的第二操作的触摸位置。需要说明的是,副屏的第二操作的触摸位置处可能没有图标控件,也可能有图标控件。如果没有图标控件,手机可以直接将第一图标控件移动至副屏的第二操作的触摸位置。如果有图标控件,可以参见下述方案。
在一些实施例中,若副屏的触摸位置上有第三图标控件,手机将第一图标控件从主屏移动至副屏的第二操作的触摸位置,包括:手机将第一图标控件从主屏移动至副屏的第二操作的触摸位置,并将副屏上的第三图标控件后移;或者,手机将第一图标控件从主屏移动至副屏的第二操作的触摸位置,并将副屏上的第三图标控件和第一图标控件合并。
也就是说,如果副屏的第二操作的触摸位置有图标控件,被移动的第一图标控件仍放置在该触摸位置,手机需要将该触摸位置之后的所有图标控件顺序后移,或者,被移动的第一图标控件直接与该触摸位置的图标控件合并至同一文件夹。
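示例性的，上述"移动至触摸位置并将后续图标顺序后移"与"同触摸位置上的图标合并至同一文件夹"两种处理可用如下Python代码示意（以列表顺序表示图标排布，以元组表示合并后的文件夹，仅为举例）：

```python
def move_icon_across(src_icons, dst_icons, icon, dst_index, merge=False):
    """将图标icon从源屏移动到目标屏第二操作触摸位置(dst_index)处。

    目标位置已有图标时：merge=False则该位置及其后的图标顺序后移，
    merge=True则被移动的图标与该位置的图标合并为一个文件夹（以元组示意）。
    """
    src_icons = [i for i in src_icons if i != icon]  # 从源屏移除
    dst_icons = list(dst_icons)
    if dst_index < len(dst_icons) and merge:
        dst_icons[dst_index] = (dst_icons[dst_index], icon)  # 合并为文件夹
    else:
        dst_icons.insert(dst_index, icon)  # 插入并使后续图标顺序后移
    return src_icons, dst_icons
```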
基于上述方案,在需要对柔性屏幕的显示界面上的图标控件进行跨屏幕移动时,如果柔性屏幕当前的物理形态为折叠状态,双屏显示界面均可容纳图标控件,手机可根据用户在柔性屏幕上的双面屏手势操作,控制操作对应的图标控件进行跨屏幕移动。这样,用户可以在不展开手机的情况下,将操作对应的图标控件移动至另一个屏幕上。上述双面屏手势操作更加方便、快捷,为用户提供更好的使用体验。
下面结合附图13a至图13c对双面屏图标移动操作的使用场景进行详细说明。
在第一种场景中,手机的柔性屏幕处于折叠状态,主屏和副屏的显示界面均为可容纳图标控件的界面,且主屏的显示界面上显示有图标控件时,用户可以通过双面屏手势操作将主屏上的图标控件移动至副屏。
示例性的,如图13a所示,手机的柔性屏幕被划分为主屏11和副屏12,柔性屏幕当前处于折叠状态。其中主屏11面向用户且处于亮屏状态,主屏11当前显示手机的第一主界面,该第一主界面包括4个应用程序图标。副屏12背向用户且处于亮屏状态,副屏12显示手机的第二主界面,该第二主界面包括3个应用程序图标。基于此,如果柔性屏幕上的触控器件检测到用户在主屏11上的第一操作以及副屏12上的第二触控操作,与上述实施例类似,状态监测服务在获取到当前柔性屏幕的物理形态、显示状态以及主屏11和副屏12上的触控数据后,可向应用程序框架层中的输入管理服务发送当前柔性屏幕的物理形态、显示状态以及主屏11和副屏12上的触控数据。
进而,输入管理服务可根据柔性屏幕的物理形态、显示状态以及触控数据确定是否触发图标跨屏幕移动。例如,当柔性屏幕处于折叠状态,主屏和副屏的显示界面上均显示有图标控件,主屏上的第一操作和副屏上的第二操作的操作时间差小于预设时间差,第一操作和第二操作的触摸位置对应,主屏上的第一操作对应图标控件(如图13a中主屏上的应用图标4)、第一操作和第二操作的操作类型相同(如图13a中长按操作)时,输入管理服务可确定需要将当前主屏上的第一操作对应的图标控件移动至副屏,输入管理服务可向上层应用发送对主屏图标移动的事件。如图13a所示,手机可将主屏11上的应用图标4移动至副屏12上的对应位置处。如果副屏12上的对应位置处已有应用图标8,如图13b所示,手机可将主屏11上的应用图标4与应用图标8合并至一个文件夹中,或者,如图13c所示,手机可将主屏11上的应用图标4移动至应用图标8的位置,应用图标8顺序后移。
这样一来,当柔性屏幕处于折叠状态,主屏和副屏的显示界面上均显示有图标控件时,如果手机检测到用户在主屏和副屏上的第一操作和第二操作,第一操作对应主屏上的某图标控件,手机在确定上述操作为双面屏手势操作时,可将主屏上的图标控件快速移动至副屏的对应位置,方便用户在手机处于折叠状态下对屏幕上的图标控件进行跨屏幕移动,提升用户使用折叠手机时的使用体验。
在第二种场景中,手机的柔性屏幕处于折叠状态,主屏和副屏的显示界面均为可容纳图标控件的界面,且副屏的显示界面上显示有图标控件时,用户可以通过双面屏手势操作将副屏上的图标控件移动至主屏。其实现原理和技术效果与本实施例的第一种场景类似,此处不再赘述。
在第三种场景中,手机的柔性屏幕处于折叠状态,主屏和副屏的显示界面均为可容纳图标控件的界面,且主屏和副屏的显示界面上均显示有图标控件时,用户可以通过双面屏手势操作将主屏(或副屏)上的图标控件移动至副屏(或主屏)。在该场景中,状态监测服务在接收到用户在主屏和副屏上的触控操作后,除了向应用程序框架层中的输入管理服务发送当前柔性屏幕的物理形态、显示状态、触控数据之外,还发送手机的具体朝向。关于状态监测服务如何确定手机的具体朝向可参见上述实施例。
进而，输入管理服务可根据柔性屏幕的物理形态、显示状态、触控数据以及手机的具体朝向，确定是否触发图标跨屏幕移动。例如，当柔性屏幕处于折叠状态，主屏和副屏的显示界面均为可容纳图标控件的界面，且主屏和副屏上的触控操作均对应图标控件，主屏上的第一操作和副屏上的第二操作的操作时间差小于预设时间差，第一操作的触摸位置与第二操作的触摸位置对应，第一操作和第二操作的操作类型均为长按操作时，输入管理服务可进一步根据手机的具体朝向，确定是将主屏的图标控件移动至副屏，还是将副屏的图标控件移动至主屏。如果当前主屏面向用户，则手机将主屏的图标控件进行跨屏幕移动；如果当前副屏面向用户，则手机将副屏的图标控件进行跨屏幕移动。
示例性的,表3示出了手机根据柔性屏幕的物理形态、显示状态、主屏和副屏上的触控数据以及手机的具体朝向确定是否触发图标移动的响应。
表3
（表3以图片形式给出，文字内容无法恢复，此处从略）
实施例四
以下将以手机为电子设备举例,结合附图详细阐述本申请实施例提供的一种双面屏删除操作的控制方法。
在本实施例中,当手机的柔性屏幕处于折叠状态,且柔性屏幕的主屏上显示有可选控件时,手机检测到触发删除控件的双面屏手势操作,双面屏手势操作包括对主屏的第一操作和对副屏的第二操作,第一操作和第二操作的操作时间差小于预设时间差,第一操作的触摸位置与第二操作的触摸位置对应,第一操作对应主屏上的第一可选控件,第一操作和第二操作的操作类型均为滑动操作且滑动方向一致;响应于上述双面屏手势操作,手机将主屏上的第一可选控件删除。
需要说明的是,上述可选控件是指屏幕的显示界面上可以被删除的控件,例如可选控件可以是聊天列表界面中的会话框控件,或者,文件列表界面中的文件夹控件,或者手机主界面的应用图标控件等等。上述滑动方向可以是水平方向、垂直方向、与水平方向或垂直方向具有一定夹角的其他方向,对此本申请不做限定。
具体的,如图14所示,当手机的柔性屏幕处于折叠状态,且柔性屏幕的主屏的显示界面上显示有可选控件时,手机在检测到对主屏的第一操作和对副屏的第二操作之后,可以根据第一操作和第二操作在屏幕上的触摸位置和操作时间,判断主屏和副屏上的触控操作是否满足以下四个条件:
第一操作和第二操作的操作时间差小于预设时间差;
第一操作和第二操作的触摸位置对应;
第一操作对应主屏上的第一可选控件;
第一操作和第二操作的操作类型均为滑动操作且滑动方向一致。
如果上述四个条件都满足,则手机将主屏上的第一可选控件删除,否则手机不触发对第一可选控件的删除。
需要说明的是，判断上述四个条件的执行顺序可以是同时执行，也可以是依次执行，依次执行的执行顺序并不限于上述的执行顺序，本实施例对上述四个条件的执行顺序不作任何限定。
在一些实施例中,响应于双面屏手势操作,手机将主屏上的第一可选控件删除,包括:响应于双面屏手势操作,若主屏为面向用户的屏幕,手机将主屏上的第一可选控件删除。该实现方式,手机还需要检测当前手机的具体朝向,进一步根据手机的具体朝向确定是否将主屏上的第一可选控件删除。也就是说,如果手机确定待删除的第一可选控件所在的屏幕背向用户,则手机不触发删除控件,从而避免用户的误操作。
需要说明的是，上述柔性屏幕的主屏的显示界面上显示有可选控件，包括以下几种情况：第一种情况下，柔性屏幕的主屏的显示界面上显示有可选控件(即主屏处于亮屏状态)，且柔性屏幕的副屏处于黑屏状态。此时，手机检测到用于触发删除控件的双面屏手势操作，响应于双面屏手势操作，手机可将主屏上的可选控件删除。第二种情况下，柔性屏幕的主屏的显示界面上显示有可选控件，且柔性屏幕的副屏处于亮屏状态，但副屏的显示界面没有可选控件。第三种情况下，柔性屏幕的主屏和副屏的显示界面上均显示有可选控件。此时，手机检测到用于触发删除控件的双面屏手势操作，如果主屏上的第一操作对应第一可选控件，且副屏上的第二操作对应第二可选控件，手机将面向用户的屏幕上的操作对应的可选控件删除。也就是说，手机需要根据当前手机的具体朝向确定哪个屏幕上的可选控件应被删除。若面向用户的屏幕为主屏，手机将主屏上的第一可选控件删除；若面向用户的屏幕为副屏，手机将副屏上的第二可选控件删除。
在一些实施例中,响应于双面屏手势操作,手机将主屏上的第一可选控件删除之后,还包括:若第一可选控件的下方有一个或多个第三可选控件,手机将一个或多个第三可选控件顺序上移。
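示例性的，上述删除手势的处理（两个滑动操作方向一致时删除对应控件，其下方的可选控件顺序上移）可用如下Python代码示意（以列表顺序表示控件自上而下的排列，方向取值仅为举例）：

```python
def handle_delete_gesture(controls, target, dir1, dir2):
    """双面屏删除手势：第一操作与第二操作均为滑动操作且滑动方向一致时，
    删除目标可选控件；其下方的控件随列表顺序自然上移。"""
    if dir1 != dir2 or target not in controls:
        return controls  # 滑动方向不一致或未命中可选控件，不触发删除
    return [c for c in controls if c != target]
```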
基于上述方案,在需要删除柔性屏幕的显示界面上的可选控件时,如果柔性屏幕当前的物理形态为折叠状态,手机可根据用户在柔性屏幕上的双面屏手势操作,将操作对应的可选控件删除。这样,用户可以使用双面屏手势将可选控件滑动至屏幕边缘或者屏幕上指定区域,实现对可选控件的快速删除,还可以避免用户单屏滑动对控件的误删除操作。
下面结合附图15a、图15b、图16对双面屏删除操作的使用场景进行详细说明。
在第一种场景中,手机的柔性屏幕处于折叠状态,主屏的显示界面上显示有可选控件(主屏处于亮屏状态)、且副屏处于黑屏状态下,用户可以通过双面屏手势操作将主屏上的可选控件删除。
示例性的,如图15a或图15b所示,手机柔性屏幕被划分为主屏11和副屏12,柔性屏幕当前处于折叠状态。其中主屏11面向用户且处于亮屏状态,主屏11当前显示微信的聊天列表界面,该界面包括3个会话框控件,副屏12背向用户且处于黑屏状态。基于此,如果柔性屏幕上的触控器件检测到用户在主屏11上的第一操作以及副屏12上的第二操作,与上述实施例类似,状态监测服务在获取到当前柔性屏幕的物理形态、显示状态以及主屏11和副屏12上的触控数据后,可向应用程序框架层中的输入管理服务发送当前柔性屏幕的物理形态、显示状态以及主屏11和副屏12上的触控数据。
进而,输入管理服务可根据柔性屏幕的物理形态、显示状态以及触控数据确定是否触发删除操作。例如,当柔性屏幕处于折叠状态,主屏的显示界面上显示有可选控件,副屏处于黑屏状态,主屏上的第一操作和副屏上的第二操作的操作时间差小于预设时间差、第一操作和第二操作的触摸位置对应、第一操作对应主屏上的第一可选控件、第一操作和第二操作的操作类型为滑动操作且滑动方向一致时,输入管理服务可确定需要将主屏上的第一可选控件删除。随后,输入管理服务向上层应用发送用于指示删除主屏上的第一可选控件的事件。如图15a所示,用户使用双面屏手势操作将主屏上的会话框2从左向右滑动第一距离,或滑动至屏幕的右边缘,或者,如图15b所示,用户将主屏上的会话框2从上向下滑动第二距离,或滑动至屏幕的下边缘,手机可响应于双面屏手势操作,将用户选中的会话框2从主屏的显示界面上删除,会话框2下方的其他会话框(如会话框3)顺序上移。
这样一来,当柔性屏幕处于折叠状态,主屏的显示界面上显示有可选控件(主屏处于亮屏状态),且副屏处于黑屏状态下,如果手机检测到用户在主屏上的第一操作和在副屏上的第二操作,第一操作对应主屏上的某可选控件,手机在确定上述操作为双面屏手势操作时,可将用户在主屏上选中的可选控件删除,方便用户在折叠的柔性屏幕上快速进行删除操作,提升用户使用折叠手机时的使用体验。
在第二种场景中，手机的柔性屏幕处于折叠状态，副屏的显示界面上显示有可选控件(副屏处于亮屏状态)、且主屏处于黑屏状态下，用户可以通过双面屏手势操作对处于亮屏状态的副屏上的可选控件进行删除操作。其实现原理和技术效果与本实施例的第一种场景类似，此处不再赘述。
在第三种场景中,手机的柔性屏幕处于折叠状态,主屏和副屏的显示界面上均显示有可选控件,手机响应于双面屏手势操作将面向用户的屏幕上的可选控件删除。
示例性的,如图16所示,柔性屏幕当前处于折叠状态,柔性屏幕的主屏11面向用户且主屏11当前显示文件列表界面,该界面包括2个文件夹控件,柔性屏幕的副屏12背向用户且副屏12当前显示微信的聊天列表界面,该界面包括3个会话框控件。基于此,状态监测服务在接收到用户在主屏和副屏上的触控操作后,除了向应用程序框架层中的输入管理服务发送当前柔性屏幕的物理形态、显示状态、触控数据之外,还发送手机的具体朝向。关于状态监测服务如何确定手机的具体朝向可参见上述实施例。进而,输入管理服务可根据柔性屏幕的物理形态、显示状态、触控数据以及手机的具体朝向,确定删除哪个屏幕上的可选控件。如图16所示,当前面向用户的屏幕为主屏11,手机将删除主屏11上的文件夹2。
示例性的,表4示出了手机根据柔性屏幕的物理形态、显示状态、主屏和副屏上的触控数据以及手机的具体朝向确定是否触发控件删除的响应。
（表4以图片形式给出，文字内容无法恢复，此处从略）
本申请实施例公开了一种电子设备,包括处理器,以及与处理器相连的存储器、输入设备和输出设备。其中,输入设备和输出设备可集成为一个设备,例如,可将柔性屏幕的触控器件作为输入设备,将柔性屏幕的显示器作为输出设备。
此时,如图17所示,上述电子设备可以包括:柔性屏幕1701,所述柔性屏幕1701包括触控器件1706和显示器1707;一个或多个处理器1702;一个或多个存储器1703;一个或多个传感器1708;存储器1703存储有一个或多个应用程序(未示出)以及一个或多个程序1704,上述各器件可以通过一个或多个通信总线1705通信。其中该一个或多个程序1704被存储在上述存储器1703中并被配置为被该一个或多个处理器1702执行,以使得电子设备执行上述实施例中的各个步骤。其中,上述方法实施例涉及的各步骤的所有相关内容均可以援引到对应实体器件的功能描述,在此不再赘述。
示例性的,上述处理器1702具体可以为图1所示的处理器110,上述存储器1703具体可以为图1所示的内部存储器121和/或外部存储器120,上述柔性屏幕1701具体可以为图1所示的显示屏幕301,上述传感器1708具体可以为图1所示的传感器模块180中的陀螺仪传感器180B、加速度传感器180E、接近光传感器180G,还可以是红外传感器等,本申请实施例对此不做任何限制。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。上述描述的系统,装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:快闪存储器、移动硬盘、只读存储器、随机存取存储器、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请实施例的具体实施方式,但本申请实施例的保护范围并不局限于此,任何在本申请实施例揭露的技术范围内的变化或替换,都应涵盖在本申请实施例的保护范围之内。因此,本申请实施例的保护范围应以所述权利要求的保护范围为准。

Claims (27)

  1. 一种具有柔性屏幕的电子设备的控制方法,其特征在于,所述柔性屏幕的物理形态包括展开状态和折叠状态,当所述柔性屏幕处于折叠状态时,所述柔性屏幕被划分为第一屏和第二屏,所述方法包括:
    当所述柔性屏幕处于亮屏状态时,所述电子设备检测到用于触发屏幕截图的双面屏手势操作,所述双面屏手势操作包括对所述第一屏的第一操作和对所述第二屏的第二操作,所述第一操作和所述第二操作的操作时间差小于预设时间差,所述第一操作的触摸位置与所述第二操作的触摸位置对应;
    响应于所述双面屏手势操作,所述电子设备对处于亮屏状态的所述柔性屏幕进行屏幕截图。
  2. 根据权利要求1所述的方法,其特征在于,所述响应于所述双面屏手势操作,所述电子设备对处于亮屏状态的所述柔性屏幕进行屏幕截图,包括:
    若所述第一屏处于亮屏状态,所述第二屏处于黑屏状态,响应于所述双面屏手势操作,所述电子设备对所述第一屏进行屏幕截图;
    若所述第一屏处于黑屏状态,所述第二屏处于亮屏状态,响应于所述双面屏手势操作,所述电子设备对所述第二屏进行屏幕截图;
    若所述第一屏和所述第二屏均处于亮屏状态,响应于所述双面屏手势操作,所述电子设备对整个所述柔性屏幕进行屏幕截图。
  3. 根据权利要求1或2所述的方法,其特征在于,在所述电子设备对处于亮屏状态的所述柔性屏幕进行屏幕截图之后,还包括:显示截屏预览界面;所述截屏预览界面包括以下至少一种处理控件:保存控件、编辑控件、分享控件或取消控件。
  4. 根据权利要求3所述的方法,其特征在于,所述显示截屏预览界面,包括:
    若所述第一屏处于亮屏状态,所述第二屏处于黑屏状态,在所述第一屏上显示所述截屏预览界面;
    若所述第一屏处于黑屏状态,所述第二屏处于亮屏状态,在所述第二屏上显示所述截屏预览界面;
    若所述第一屏和所述第二屏均处于亮屏状态,在面向用户的屏幕上显示所述截屏预览界面,所述面向用户的屏幕为所述第一屏或所述第二屏。
  5. 根据权利要求3或4所述的方法,其特征在于,在显示所述截屏预览界面之后,还包括:
    所述电子设备检测到用于触发对所述截屏预览界面上的一个处理控件选择的第三操作;
    响应于所述第三操作,所述电子设备对屏幕截图执行所述选择的处理控件对应的执行处理。
  6. 根据权利要求1-5中任一项所述的方法,其特征在于,所述第一操作和所述第二操作的操作类型相同,所述操作类型包括以下任意一种:按压操作,点击操作,双击操作,长按操作。
  7. 一种具有柔性屏幕的电子设备的控制方法，其特征在于，所述柔性屏幕的物理形态包括展开状态和折叠状态，当所述柔性屏幕处于折叠状态时，所述柔性屏幕被划分为第一屏和第二屏，所述方法包括：
    当所述第一屏的显示界面上显示至少两个可选控件时,所述电子设备检测到用于触发多选的双面屏手势操作,所述双面屏手势操作包括对所述第一屏的第一操作和对所述第二屏的第二操作,所述第一操作和所述第二操作的操作时间差小于预设时间差,所述第一操作的触摸位置与所述第二操作的触摸位置对应;
    响应于所述双面屏手势操作,所述电子设备控制所述第一屏的显示界面进入多选模式,其中,在所述多选模式中,所述第一屏的显示界面上的所述至少两个可选控件能够都被选中。
  8. 根据权利要求7所述的方法,其特征在于,所述响应于所述双面屏手势操作,所述电子设备控制所述第一屏的显示界面进入多选模式,包括:
    响应于所述双面屏手势操作,若所述第一屏为面向用户的屏幕,所述电子设备控制所述第一屏的显示界面进入多选模式。
  9. 根据权利要求7所述的方法,其特征在于,当所述第一屏的显示界面上显示至少两个可选控件,且所述第二屏的显示界面上显示至少两个可选控件时,响应于所述双面屏手势操作,所述电子设备控制面向用户的屏幕的显示界面进入多选模式,所述面向用户的屏幕为所述第一屏或所述第二屏。
  10. 根据权利要求7-9中任一项所述的方法,其特征在于,所述响应于所述双面屏手势操作,所述电子设备控制所述第一屏的显示界面进入多选模式,包括:
    若对所述第一屏的第一操作的触摸位置对应第一可选控件,所述第一可选控件为所述第一屏上显示的其中一个可选控件,响应于所述双面屏手势操作,所述电子设备控制所述第一屏的显示界面进入多选模式,并将所述第一可选控件显示为选中。
  11. 根据权利要求7-10中任一项所述的方法,其特征在于,所述响应于所述双面屏手势操作,所述电子设备控制所述第一屏的显示界面进入多选模式之后,还包括:
    所述电子设备检测到用于触发对所述第一屏的显示界面上的可选控件选择的第三操作,所述第三操作的操作类型为点击操作或滑动操作;
    响应于所述第三操作,所述电子设备将所述第三操作对应的一个或多个控件显示为选中。
  12. 根据权利要求7-11中任一项所述的方法,其特征在于,所述第一操作和所述第二操作的操作类型相同,所述操作类型包括以下任意一种:按压操作,点击操作,双击操作,长按操作。
  13. A control method for an electronic device having a flexible screen, wherein a physical form of the flexible screen comprises an unfolded state and a folded state, and when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen, the method comprising:
    when display interfaces of the first screen and the second screen are both interfaces capable of accommodating icon controls and an icon control is displayed on the display interface of the first screen, detecting, by the electronic device, a double-sided screen gesture operation for triggering icon movement, wherein the double-sided screen gesture operation comprises a first operation on the first screen and a second operation on the second screen, an operation time difference between the first operation and the second operation is less than a preset time difference, a touch position of the first operation corresponds to a touch position of the second operation, and the first operation corresponds to a first icon control on the first screen;
    in response to the double-sided screen gesture operation, moving, by the electronic device, the first icon control from the first screen to the second screen.
  14. The method according to claim 13, wherein the moving, by the electronic device in response to the double-sided screen gesture operation, the first icon control from the first screen to the second screen comprises:
    in response to the double-sided screen gesture operation, if the first screen is the screen facing the user, moving, by the electronic device, the first icon control from the first screen to the second screen.
  15. The method according to claim 13, wherein when the display interfaces of the first screen and the second screen are both interfaces capable of accommodating icon controls, at least one icon control is displayed on the display interface of the first screen, and at least one icon control is displayed on the display interface of the second screen:
    if the first operation corresponds to a first icon control on the display interface of the first screen and the second operation corresponds to a second icon control on the display interface of the second screen, then in response to the double-sided screen gesture operation, if the first screen faces the user, the electronic device moves the first icon control from the first screen to the second screen; and if the second screen faces the user, the electronic device moves the second icon control from the second screen to the first screen.
  16. The method according to any one of claims 13 to 15, wherein when the display interfaces of the first screen and the second screen are both interfaces capable of accommodating icon controls, at least one icon control is displayed on the display interface of the first screen, and at least one icon control is displayed on the display interface of the second screen:
    in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to a position before the at least one icon control on the second screen, and shifts the at least one icon control on the second screen backward in sequence;
    or,
    in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to a position after the at least one icon control on the second screen.
  17. The method according to any one of claims 13 to 15, wherein the moving, by the electronic device, the first icon control from the first screen to the second screen comprises:
    moving, by the electronic device, the first icon control from the first screen to the touch position of the second operation on the second screen.
  18. The method according to claim 17, wherein if a third icon control is located at the touch position on the second screen, the moving, by the electronic device, the first icon control from the first screen to the touch position of the second operation on the second screen comprises:
    moving, by the electronic device, the first icon control from the first screen to the touch position of the second operation on the second screen, and shifting the third icon control on the second screen backward; or
    moving, by the electronic device, the first icon control from the first screen to the touch position of the second operation on the second screen, and merging the third icon control on the second screen with the first icon control.
  19. The method according to any one of claims 13 to 18, wherein the first operation and the second operation are of the same operation type, and the operation type comprises any one of the following: a press operation, a tap operation, a double-tap operation, or a touch-and-hold operation.
  20. A control method for an electronic device having a flexible screen, wherein a physical form of the flexible screen comprises an unfolded state and a folded state, and when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen, the method comprising:
    when a selectable control is displayed on a display interface of the first screen, detecting, by the electronic device, a double-sided screen gesture operation for triggering control deletion, wherein the double-sided screen gesture operation comprises a first operation on the first screen and a second operation on the second screen, an operation time difference between the first operation and the second operation is less than a preset time difference, a touch position of the first operation corresponds to a touch position of the second operation, the first operation corresponds to a first selectable control on the first screen, and the operation types of the first operation and the second operation are both slide operations with the same slide direction;
    in response to the double-sided screen gesture operation, deleting, by the electronic device, the first selectable control on the first screen.
  21. The method according to claim 20, wherein the deleting, by the electronic device in response to the double-sided screen gesture operation, the first selectable control on the first screen comprises:
    in response to the double-sided screen gesture operation, if the first screen is the screen facing the user, deleting, by the electronic device, the first selectable control on the first screen.
  22. The method according to claim 20, wherein when a selectable control is displayed on the display interface of the first screen and a selectable control is displayed on a display interface of the second screen, if the first operation corresponds to a first selectable control on the first screen and the second operation corresponds to a second selectable control on the second screen:
    in response to the double-sided screen gesture operation, if the first screen faces the user, the electronic device deletes the first selectable control on the first screen; and if the second screen faces the user, the electronic device deletes the second selectable control on the second screen.
  23. The method according to any one of claims 20 to 22, further comprising, after the electronic device deletes, in response to the double-sided screen gesture operation, the first selectable control on the first screen:
    if one or more third selectable controls are located below the first selectable control, shifting, by the electronic device, the one or more third selectable controls upward in sequence.
  24. An electronic device, comprising:
    a flexible screen, wherein the flexible screen comprises a display and a touch device, a physical form of the flexible screen comprises an unfolded state and a folded state, and when the flexible screen is in the folded state, the flexible screen is divided into a main screen and a secondary screen;
    one or more processors;
    one or more memories;
    one or more sensors;
    wherein the memories store one or more application programs and one or more programs, the one or more programs comprising instructions that, when executed by the electronic device, cause the electronic device to perform the control method according to any one of claims 1 to 23.
  25. A computer-readable storage medium storing instructions, wherein when the instructions are run on an electronic device, the electronic device is caused to perform the control method according to any one of claims 1 to 23.
  26. A computer program product comprising instructions, wherein when the computer program product is run on an electronic device, the electronic device is caused to perform the control method according to any one of claims 1 to 23.
  27. A program product, wherein the program product comprises a computer program stored in a readable storage medium, at least one processor of a communication apparatus is capable of reading the computer program from the readable storage medium, and the at least one processor executes the computer program to cause the communication apparatus to implement the method according to any one of claims 1 to 23.
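The gesture criteria recited in claims 1, 2, and 6 (one operation on each screen of the folded device, an operation time difference below a preset threshold, corresponding touch positions, and a screen-state rule for choosing the screenshot target) can be sketched as follows. This is a minimal illustrative sketch only: the concrete threshold values, the horizontal mirror mapping between the two screens, and all function and field names are assumptions, not part of the claimed method.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical constants: the claims only require "a preset time difference"
# and "corresponding" touch positions, without fixing concrete values.
PRESET_TIME_DIFF_MS = 300
POSITION_TOLERANCE_PX = 60

@dataclass
class TouchEvent:
    screen: str        # "first" or "second"
    x: float           # touch position in that screen's own coordinates
    y: float
    timestamp_ms: int

def positions_correspond(a: TouchEvent, b: TouchEvent, screen_width_px: float) -> bool:
    """Assumes the folded second screen mirrors the first horizontally, so a
    front touch at x corresponds to a back touch near (screen_width_px - x)."""
    return (abs((screen_width_px - a.x) - b.x) <= POSITION_TOLERANCE_PX
            and abs(a.y - b.y) <= POSITION_TOLERANCE_PX)

def is_double_sided_gesture(a: TouchEvent, b: TouchEvent, screen_width_px: float) -> bool:
    """Claim 1: one operation per screen, time difference below the preset
    threshold, corresponding touch positions (claim 6 would additionally
    require both operations to be of the same type)."""
    return (a.screen != b.screen
            and abs(a.timestamp_ms - b.timestamp_ms) < PRESET_TIME_DIFF_MS
            and positions_correspond(a, b, screen_width_px))

def screenshot_target(first_screen_on: bool, second_screen_on: bool) -> Optional[str]:
    """Claim 2: capture whichever screen is lit; if both are lit, capture
    the entire flexible screen; if neither is lit, nothing is captured."""
    if first_screen_on and second_screen_on:
        return "entire"
    if first_screen_on:
        return "first"
    if second_screen_on:
        return "second"
    return None
```

For example, under these assumed values a tap at x = 100 on the first screen of a 1080 px wide device, paired within 120 ms with a tap near x = 980 on the second screen, satisfies the gesture criteria, and the capture target then depends only on which screens are lit.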
PCT/CN2020/116714 2019-09-29 2020-09-22 Control method for electronic device having flexible screen, and electronic device WO2021057699A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910935961.4A CN112578981A (zh) 2019-09-29 2019-09-29 Control method for electronic device having flexible screen, and electronic device
CN201910935961.4 2019-09-29

Publications (1)

Publication Number Publication Date
WO2021057699A1 true WO2021057699A1 (zh) 2021-04-01

Family

ID=75111130

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/116714 WO2021057699A1 (zh) 2019-09-29 2020-09-22 Control method for electronic device having flexible screen, and electronic device

Country Status (2)

Country Link
CN (1) CN112578981A (zh)
WO (1) WO2021057699A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114047870A (zh) * 2021-11-15 2022-02-15 珠海读书郎软件科技有限公司 Screenshot method for dual-screen watch, storage medium, and device
CN114281210A (zh) * 2021-12-21 2022-04-05 维沃移动通信有限公司 Control method, apparatus, and device for electronic device
CN114756165A (zh) * 2022-04-24 2022-07-15 维沃移动通信有限公司 Device control method and apparatus

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2752751A2 (en) * 2013-01-04 2014-07-09 LG Electronics, Inc. Method for controlling a terminal using a double touch gesture and terminal thereof
CN109358793A (zh) * 2018-09-27 2019-02-19 维沃移动通信有限公司 Screenshot method and mobile terminal
CN109542325A (zh) * 2018-11-29 2019-03-29 努比亚技术有限公司 Double-sided screen touch control method, double-sided screen terminal, and readable storage medium
CN109710168A (zh) * 2018-12-27 2019-05-03 努比亚技术有限公司 Screen touch control method, device, and computer-readable storage medium
CN110262690A (zh) * 2019-06-18 2019-09-20 Oppo广东移动通信有限公司 Dual-screen display method and apparatus, mobile terminal, and computer-readable storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104076986B (zh) * 2014-07-25 2015-12-09 上海逗屋网络科技有限公司 Touch control method and device for multi-touch terminal
CN109981839B9 (zh) * 2019-02-02 2021-08-31 华为技术有限公司 Display method for electronic device having flexible screen, and electronic device


Also Published As

Publication number Publication date
CN112578981A (zh) 2021-03-30

Similar Documents

Publication Publication Date Title
WO2020156269A1 (zh) Display method for electronic device having flexible screen, and electronic device
WO2021018067A1 (zh) Floating window management method and related apparatus
WO2021013158A1 (zh) Display method and related apparatus
WO2021129326A1 (zh) Screen display method and electronic device
WO2021103981A1 (zh) Split-screen display processing method and apparatus, and electronic device
WO2020224449A1 (zh) Operation method for split-screen display and electronic device
WO2021036571A1 (zh) Desktop editing method and electronic device
WO2021063237A1 (zh) Control method for electronic device, and electronic device
CN113645351B (zh) Application interface interaction method, electronic device, and computer-readable storage medium
US11921987B2 (en) System navigation bar display method, system navigation bar control method, graphical user interface, and electronic device
WO2021036770A1 (zh) Split-screen processing method and terminal device
WO2021063098A1 (zh) Touchscreen response method and electronic device
WO2021057343A1 (zh) Operation method for electronic device, and electronic device
WO2021057699A1 (zh) Control method for electronic device having flexible screen, and electronic device
WO2022068483A1 (zh) Application launching method and apparatus, and electronic device
WO2021082564A1 (zh) Operation prompting method and electronic device
WO2021078032A1 (zh) User interface display method and electronic device
WO2021238370A1 (zh) Display control method, electronic device, and computer-readable storage medium
WO2023241209A9 (zh) Desktop wallpaper configuration method and apparatus, electronic device, and readable storage medium
WO2021042878A1 (zh) Photographing method and electronic device
WO2020238759A1 (zh) Interface display method and electronic device
WO2020221062A1 (zh) Navigation operation method and electronic device
CN115016697A (zh) Screen projection method, computer device, readable storage medium, and program product
WO2021082911A1 (zh) Content transmission method and terminal device
WO2022002213A1 (zh) Translation result display method and apparatus, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20869507

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20869507

Country of ref document: EP

Kind code of ref document: A1