CN112578981A - Control method of electronic equipment with flexible screen and electronic equipment

Control method of electronic equipment with flexible screen and electronic equipment

Info

Publication number
CN112578981A
CN112578981A
Authority
CN
China
Prior art keywords
screen
control
double
state
flexible
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910935961.4A
Other languages
Chinese (zh)
Inventor
梁树为
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN201910935961.4A
Priority to PCT/CN2020/116714 (published as WO2021057699A1)
Publication of CN112578981A
Legal status: Pending (current)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - GUI interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 - GUI interaction techniques using icons
    • G06F3/0484 - GUI interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 - GUI interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0487 - GUI interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - GUI interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a control method for an electronic device having a flexible screen, and the electronic device itself. The method can determine whether to trigger screen capture, multi-selection of controls, icon movement, or control deletion according to the physical form and display state of the flexible screen and the user's touch operations on the double-sided screen, thereby improving the human-computer interaction efficiency of the electronic device and providing a better user experience. The physical form of the flexible screen includes an unfolded state and a folded state; when the flexible screen is in the folded state, it is divided into a main screen and a sub-screen. The method includes: when the flexible screen is in a bright-screen state, the electronic device detects a first operation by the user on the main screen and a second operation by the user on the sub-screen, where the operation time difference between the first operation and the second operation is smaller than a preset time difference and the touch positions of the first operation and the second operation correspond to each other; in response to the operations on the main screen and the sub-screen, the electronic device captures a screenshot of the flexible screen in the bright-screen state.

Description

Control method of electronic equipment with flexible screen and electronic equipment
Technical Field
The embodiments of the present application relate to the field of terminal technologies, and in particular, to a control method for an electronic device having a flexible screen, and to the electronic device.
Background
With the continuous development of electronic devices, electronic devices with display screens (such as mobile phones and tablet computers) have become an indispensable part of people's daily life and work. To provide users with richer display information, the display screens of electronic devices are becoming larger and larger, yet an oversized screen compromises the portability of the device. Therefore, electronic devices having flexible screens are becoming the development direction of future electronic devices.
A flexible screen, also called a flexible OLED (organic light-emitting diode) screen, is thinner and lighter than a conventional screen and, because it can be bent and has good flexibility, is also more durable. At present, some device manufacturers have applied flexible screens to electronic devices such as mobile phones and tablet computers. When the flexible screen is in the folded state or the unfolded state, a user may trigger a screenshot of the screen in the bright-screen state (a single screen or the entire flexible screen) by operating a function key combination of the mobile phone (for example, a volume key together with the power key) or a system-level button. However, the key combination usually requires two hands and is sensitive to the timing of the presses, while the system-level button requires more steps to trigger the screenshot, making the operation cumbersome. Therefore, the screen-capture approach of existing foldable mobile phones is not convenient enough, and the human-computer interaction efficiency is low. Similar problems exist for operations such as multi-selection of controls, icon movement, and control deletion.
Disclosure of Invention
The application provides a control method for an electronic device having a flexible screen, and the electronic device. The electronic device can perform screen capture, multi-selection of controls, icon movement, and control deletion in response to a double-sided screen gesture operation, which improves human-computer interaction efficiency.
In a first aspect, the present application provides a control method for an electronic device having a flexible screen, where a physical form of the flexible screen may include an unfolded state and a folded state, and when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen, and specifically, the control method includes: when the flexible screen is in a bright screen state, the electronic equipment detects double-sided screen gesture operation for triggering screen capture, the double-sided screen gesture operation comprises first operation on a first screen and second operation on a second screen, the operation time difference between the first operation and the second operation is smaller than a preset time difference, and the touch position of the first operation corresponds to the touch position of the second operation; and responding to the gesture operation of the double-sided screen, and carrying out screen capture on the flexible screen in a bright screen state by the electronic equipment.
For example, the electronic device may calculate the angle between the first screen and the second screen according to data detected by the acceleration sensor and the gyroscope, and then determine whether the flexible screen is in the unfolded state or the folded state according to the angle. Of course, the electronic device may also determine the current physical form of the flexible screen in other manners; for example, the electronic device may use a distance sensor to detect the distance between the first screen and the second screen, and then determine the current physical form of the flexible screen according to the distance, which is not limited in this application.
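The angle-based determination above can be illustrated with the following Kotlin sketch, which estimates the included angle between the two screen halves from one gravity vector per half and maps it to a physical form. The Vec3 type, the use of one fused gravity vector per half, and the 60-degree threshold are assumptions made only for illustration; the application does not prescribe a particular formula.

```kotlin
import kotlin.math.acos
import kotlin.math.sqrt

// Hypothetical 3-axis gravity reading, one per screen half.
data class Vec3(val x: Double, val y: Double, val z: Double) {
    fun dot(o: Vec3) = x * o.x + y * o.y + z * o.z
    fun norm() = sqrt(dot(this))
}

enum class PhysicalForm { UNFOLDED, FOLDED }

// When the device is laid flat (unfolded), the gravity vectors measured on the
// two halves are nearly parallel (angle close to 0 degrees); when the device is
// folded shut, the halves face opposite directions (angle close to 180 degrees).
fun classifyForm(mainHalf: Vec3, subHalf: Vec3, foldThresholdDeg: Double = 60.0): PhysicalForm {
    val cos = (mainHalf.dot(subHalf) / (mainHalf.norm() * subHalf.norm())).coerceIn(-1.0, 1.0)
    val angleDeg = Math.toDegrees(acos(cos))
    return if (angleDeg <= foldThresholdDeg) PhysicalForm.UNFOLDED else PhysicalForm.FOLDED
}

fun main() {
    val flat = classifyForm(Vec3(0.0, 0.0, 9.8), Vec3(0.0, 0.1, 9.8))
    val shut = classifyForm(Vec3(0.0, 0.0, 9.8), Vec3(0.0, 0.1, -9.8))
    println("$flat, $shut") // UNFOLDED, FOLDED
}
```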
It should be understood that the operation time at which the user triggers an operation on a screen (the first screen or the second screen) refers to the starting time at which the user's finger or palm touches that screen. Accordingly, the operation time difference between the first operation and the second operation is the time difference between the starting time at which the finger or palm touches the first screen and the starting time at which it touches the second screen. For example, the preset time difference may be set to 500 ms or less. That the touch position of the first operation corresponds to the touch position of the second operation can be understood as meaning that the position difference between the touch positions of the first operation and the second operation is smaller than a preset position difference. Because the touch positions of the first operation and the second operation are on different screens, the mobile phone needs to convert the touch points of the first operation and the second operation into the same screen coordinate system, for example, the coordinate system of the first screen, the coordinate system of the second screen, or another coordinate system different from those of the first screen and the second screen. For example, the preset position difference may be set to 10 dp.
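The timing and position checks can be sketched as follows. This is a minimal Kotlin illustration rather than the device's actual implementation: the TouchDown record and the mirror-the-x-axis coordinate conversion are assumptions, while the 500 ms and 10 dp thresholds echo the examples given above.

```kotlin
import kotlin.math.abs
import kotlin.math.hypot

// Hypothetical touch-down record: which half was touched, the down time, and
// the touch point in that half's own coordinate system (pixels).
data class TouchDown(val onFirstScreen: Boolean, val downTimeMs: Long, val x: Float, val y: Float)

// Map a second-screen point into the first screen's coordinate system. The
// mirroring rule (flip the x axis across the screen width) is only an
// assumption about how the two panels are laid out back to back.
fun toFirstScreenCoords(t: TouchDown, screenWidthPx: Float): Pair<Float, Float> =
    if (t.onFirstScreen) t.x to t.y else (screenWidthPx - t.x) to t.y

// Decide whether two touches form the double-sided gesture described above:
// opposite screens, down-time difference below 500 ms, and positions that
// coincide within 10 dp once expressed in the same coordinate system.
fun isDoubleSidedGesture(a: TouchDown, b: TouchDown, screenWidthPx: Float, density: Float,
                         maxDeltaMs: Long = 500, maxDeltaDp: Float = 10f): Boolean {
    if (a.onFirstScreen == b.onFirstScreen) return false
    if (abs(a.downTimeMs - b.downTimeMs) >= maxDeltaMs) return false
    val (ax, ay) = toFirstScreenCoords(a, screenWidthPx)
    val (bx, by) = toFirstScreenCoords(b, screenWidthPx)
    return hypot(ax - bx, ay - by) <= maxDeltaDp * density
}

fun main() {
    val front = TouchDown(onFirstScreen = true, downTimeMs = 1_000, x = 300f, y = 500f)
    val back = TouchDown(onFirstScreen = false, downTimeMs = 1_200, x = 780f, y = 505f)
    // Assume a 1080 px wide half and a density of 3 px per dp.
    println(isDoubleSidedGesture(front, back, screenWidthPx = 1080f, density = 3f)) // true
}
```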
It should be noted that the first screen and the second screen are only used for distinguishing the display area of the flexible screen of the electronic device, and do not represent the importance or primary and secondary of the screen.
In the above scheme, when a screenshot of the flexible screen is required and the current physical form of the flexible screen is the folded state, the electronic device can capture the corresponding screen according to the user's double-sided screen gesture operation on the flexible screen and the current display state (bright-screen state or black-screen state) of the flexible screen. In this way, the user does not need to call up a notification bar or an operation control (such as a virtual button) on the flexible screen and then tap the corresponding screenshot control to capture the screen content. The double-sided screen gesture operation is more convenient and faster, providing a better user experience.
In one possible implementation, in response to the double-sided screen gesture operation, the electronic device captures a screenshot of the flexible screen in the bright-screen state, which includes: if the first screen is in a bright-screen state and the second screen is in a black-screen state, the electronic device captures the first screen in response to the double-sided screen gesture operation; if the first screen is in a black-screen state and the second screen is in a bright-screen state, the electronic device captures the second screen in response to the double-sided screen gesture operation; and if both the first screen and the second screen are in a bright-screen state, the electronic device captures the entire flexible screen in response to the double-sided screen gesture operation. For example, the electronic device may obtain display parameters of the first screen and the second screen through the display driver, so as to determine the display state of the flexible screen before the user triggers the double-sided screen gesture operation, for example, that the first screen is in a bright-screen state and the second screen is in a black-screen state. With this implementation, the electronic device can capture a screenshot of the single screen, or the entire screen, that was in the bright-screen state before the user triggered the double-sided screen gesture operation.
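The state-based choice of what to capture reduces to a small dispatch, sketched below with a hypothetical ScreenState enum; reading the display parameters from the display driver is not modelled here.

```kotlin
enum class ScreenState { ON, OFF }
enum class CaptureTarget { FIRST_ONLY, SECOND_ONLY, WHOLE_SCREEN, NONE }

// Map the pre-gesture display states to a capture target: capture whichever
// half is lit, or the whole flexible screen when both halves are lit.
fun captureTarget(first: ScreenState, second: ScreenState): CaptureTarget = when {
    first == ScreenState.ON && second == ScreenState.OFF -> CaptureTarget.FIRST_ONLY
    first == ScreenState.OFF && second == ScreenState.ON -> CaptureTarget.SECOND_ONLY
    first == ScreenState.ON && second == ScreenState.ON -> CaptureTarget.WHOLE_SCREEN
    else -> CaptureTarget.NONE
}

fun main() {
    println(captureTarget(ScreenState.ON, ScreenState.OFF)) // FIRST_ONLY
    println(captureTarget(ScreenState.ON, ScreenState.ON))  // WHOLE_SCREEN
}
```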
In one possible implementation manner, after the electronic device performs screenshot on the flexible screen in the bright screen state, the method further includes: and displaying a screen capture preview interface. For example, the screenshot preview interface may include at least one of the following processing controls: save controls, edit controls, share controls, or cancel controls.
In one possible implementation, displaying a screenshot preview interface includes: if the first screen is in a bright-screen state and the second screen is in a black-screen state, displaying the screenshot preview interface on the first screen; if the first screen is in a black-screen state and the second screen is in a bright-screen state, displaying the screenshot preview interface on the second screen; and if both the first screen and the second screen are in a bright-screen state, displaying the screenshot preview interface on the screen facing the user, where the screen facing the user is the first screen or the second screen. With this implementation, after the electronic device completes the screenshot, the screenshot preview interface can be displayed on the screen in the bright-screen state; if both screens of the flexible screen are in the bright-screen state, the electronic device needs to determine the screen facing the user and display the screenshot preview interface on that screen. For example, the electronic device determines the screen facing the user from detection data reported by at least one of an infrared sensor, a camera, a proximity light sensor, or the touch device of the flexible screen.
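Which half faces the user could, for example, be decided by fusing simple per-sensor votes, as in the hypothetical sketch below; the sensors named above (infrared sensor, camera, proximity light sensor, touch device) are reduced to optional boolean votes purely for illustration.

```kotlin
enum class Half { FIRST, SECOND }

// Hypothetical per-sensor votes: true means "the first screen faces the user",
// false means the second screen does, null means the sensor gave no reading.
data class FacingVotes(val proximity: Boolean?, val camera: Boolean?, val touch: Boolean?)

// Simple majority vote with a default; only an illustrative fusion rule.
fun facingHalf(v: FacingVotes): Half {
    val votes = listOfNotNull(v.proximity, v.camera, v.touch)
    if (votes.isEmpty()) return Half.FIRST
    return if (votes.count { it } * 2 >= votes.size) Half.FIRST else Half.SECOND
}

fun main() {
    // Two sensors report, one abstains; the tie goes to the first screen.
    println(facingHalf(FacingVotes(proximity = true, camera = null, touch = false))) // FIRST
}
```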
In one possible implementation, after the screenshot preview interface is displayed, the method further includes: the electronic device detects a third operation for triggering selection of one processing control on the screenshot preview interface; and in response to the third operation, the electronic device performs, on the screenshot, the processing corresponding to the selected processing control.
It can be seen that after the electronic device completes the screenshot, a screenshot preview interface is displayed on the screen, and the user can tap the corresponding processing control on the screenshot preview interface to save, edit, share, or cancel the screenshot, and so on. For example, the user taps the editing control on the screenshot preview interface and can add text, pictures, or filters to the screenshot; or the user taps the sharing control on the screenshot preview interface and can send the screenshot to friends or family through an application program.
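Dispatching the third operation to the selected processing control could then look like the following sketch; the PreviewAction enum and the println placeholders stand in for real save, edit, share, and cancel handlers and are assumptions rather than part of the described method.

```kotlin
enum class PreviewAction { SAVE, EDIT, SHARE, CANCEL }

// Route the user's choice on the screenshot preview interface to a handler.
// The handlers are placeholders; a real device would write to storage, open an
// editor, open a share sheet, or discard the capture.
fun handlePreviewAction(action: PreviewAction, screenshot: ByteArray) {
    when (action) {
        PreviewAction.SAVE -> println("saving ${screenshot.size} bytes")
        PreviewAction.EDIT -> println("opening editor for ${screenshot.size} bytes")
        PreviewAction.SHARE -> println("sharing ${screenshot.size} bytes")
        PreviewAction.CANCEL -> println("discarding screenshot")
    }
}

fun main() {
    handlePreviewAction(PreviewAction.SAVE, ByteArray(1024))
}
```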
In one possible implementation, the first operation and the second operation are of the same operation type, and the operation type includes any one of the following: a pressing operation, a click operation, a double-click operation, or a long-press operation. The pressing operation, also called a pressure-sensitive operation, should satisfy the following conditions: the operation duration of the user's touch on the screen is greater than or equal to a preset duration, the coordinate position of the touch point does not change, and the pressure value of the touch is greater than or equal to a preset pressure value. The click operation should satisfy: the operation duration of the user's touch on the screen is less than the preset duration. The double-click operation should satisfy: the user triggers two click operations on the screen, and the time interval between the two click operations is smaller than a preset time interval. The long-press operation should satisfy: the operation duration of the user's touch on the screen is greater than or equal to the preset duration, and the pressure value of the touch is smaller than the preset pressure value.
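The four operation types can be distinguished roughly as in the following sketch; the TouchSummary record and the concrete thresholds (500 ms, 300 ms, a 0.6 pressure cut-off) are hypothetical stand-ins for the preset duration, preset time interval, and preset pressure value mentioned above.

```kotlin
// Hypothetical summary of one touch sequence on a single screen half.
data class TouchSummary(
    val durationMs: Long,                 // time from finger-down to finger-up
    val moved: Boolean,                   // whether the touch point left its start coordinate
    val pressure: Float,                  // peak pressure reported by the touch panel
    val gapToPreviousTapMs: Long? = null  // interval since the previous tap, if any
)

enum class OpType { PRESS, CLICK, DOUBLE_CLICK, LONG_PRESS, OTHER }

// Map a touch to one of the four operation types described above.
fun classify(t: TouchSummary, longMs: Long = 500, pressThreshold: Float = 0.6f,
             doubleTapGapMs: Long = 300): OpType = when {
    t.durationMs >= longMs && !t.moved && t.pressure >= pressThreshold -> OpType.PRESS
    t.durationMs >= longMs && t.pressure < pressThreshold -> OpType.LONG_PRESS
    t.durationMs < longMs && (t.gapToPreviousTapMs ?: Long.MAX_VALUE) < doubleTapGapMs -> OpType.DOUBLE_CLICK
    t.durationMs < longMs -> OpType.CLICK
    else -> OpType.OTHER
}

fun main() {
    println(classify(TouchSummary(durationMs = 80, moved = false, pressure = 0.3f)))  // CLICK
    println(classify(TouchSummary(durationMs = 700, moved = false, pressure = 0.2f))) // LONG_PRESS
    println(classify(TouchSummary(durationMs = 700, moved = false, pressure = 0.9f))) // PRESS
    println(classify(TouchSummary(durationMs = 80, moved = false, pressure = 0.3f,
                                  gapToPreviousTapMs = 150)))                         // DOUBLE_CLICK
}
```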
In a second aspect, the present application provides a method for controlling an electronic device having a flexible screen, where the flexible screen has a physical form including an unfolded state and a folded state, and when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen, and specifically, the method includes: when at least two selectable controls are displayed on a display interface of a first screen, the electronic equipment detects a double-sided screen gesture operation for triggering multi-selection, the double-sided screen gesture operation comprises a first operation on the first screen and a second operation on a second screen, the operation time difference between the first operation and the second operation is smaller than a preset time difference, and the touch position of the first operation corresponds to the touch position of the second operation; in response to the double-sided screen gesture operation, the electronic device controls the display interface of the first screen to enter a multi-selection mode, wherein in the multi-selection mode, at least two selectable controls on the display interface of the first screen can be selected.
It should be noted that the selectable control refers to a control that can be selected and processed on the display interface of the screen. For example, the selectable control may be a picture control on a picture browsing interface in an album, and the user may select multiple picture controls at the same time to splice multiple pictures (i.e., edit pictures), or send multiple pictures to friends in batch (i.e., share pictures), or add multiple pictures to a new album folder (i.e., sort pictures). The multi-selection mode is an operation mode in which a user can select at least two selectable controls on a display interface of a screen and perform batch processing.
In the above scheme, when at least two selectable controls on the display interface of the flexible screen need to be processed in batch, if the current physical form of the flexible screen is a folded state, the electronic device can control the display interface of the corresponding screen to enter the multi-selection mode quickly according to the double-sided screen gesture operation of the user on the flexible screen. In this way, the user need not enter the multiple selection mode by clicking on a "select" control at a specified location (e.g., the upper left corner of the screen). The double-sided screen gesture operation is more convenient and faster, and better use experience is provided for users.
In a possible implementation manner, when the flexible screen is in a folded state, at least two selectable controls are displayed on the display interface of the first screen, the second screen is in a black screen state, or the second screen is in a bright screen state but no selectable control is displayed on the display interface of the second screen, the electronic device detects a double-sided screen gesture operation for triggering multiple selection, and in response to the double-sided screen gesture operation, the electronic device may control the display interface of the first screen to enter a multiple selection mode.
In one possible implementation, in response to the double-sided screen gesture operation, the electronic device controls the display interface of the first screen to enter the multi-selection mode, which includes: in response to the double-sided screen gesture operation, if the first screen faces the user, the electronic device controls the display interface of the first screen to enter the multi-selection mode. With this implementation, the electronic device also needs to detect its current orientation, and then determines, according to that orientation, whether to control the display interface of the first screen to enter the multi-selection mode. That is, if the screen to be controlled faces away from the user, the electronic device does not trigger the corresponding control action, thereby avoiding misoperation by the user.
In a possible implementation manner, when at least two selectable controls are displayed on the display interface of the first screen and at least two selectable controls are displayed on the display interface of the second screen, in response to a double-sided screen gesture operation, the electronic device controls the display interface of the user-oriented screen to enter a multi-selection mode, and the user-oriented screen is the first screen or the second screen.
With this implementation, when selectable controls are displayed on the display interfaces of both screens of the electronic device, the electronic device determines which screen's display interface enters the multi-selection mode according to its current orientation: if the screen facing the user is the first screen, the electronic device controls the display interface of the first screen to enter the multi-selection mode, and if the screen facing the user is the second screen, the electronic device controls the display interface of the second screen to enter the multi-selection mode.
In one possible implementation, in response to the double-sided screen gesture operation, the electronic device controls the display interface of the first screen to enter the multi-selection mode, which includes: if the touch position of the first operation on the first screen corresponds to a first selectable control, the first selectable control being one of the selectable controls displayed on the first screen, the electronic device, in response to the double-sided screen gesture operation, controls the display interface of the first screen to enter the multi-selection mode and displays the first selectable control as selected. With this implementation, the user can directly select one selectable control on the first screen while triggering the display interface of the first screen to enter the multi-selection mode.
In one possible implementation manner, after the electronic device controls the display interface of the first screen to enter the multiple selection mode in response to the double-sided screen gesture operation, the method further includes: the electronic equipment detects a third operation for triggering selection of a selectable control on a display interface of the first screen, wherein the operation type of the third operation is click operation or sliding operation; in response to the third operation, the electronic device displays one or more controls corresponding to the third operation as selected. The user can add one or more selectable controls to complete the multi-selection operation through the implementation mode.
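A minimal model of this multi-selection behaviour might look like the sketch below; the Selectable rectangles, string identifiers, and hit-testing are illustrative assumptions rather than the device's actual view system.

```kotlin
// Hypothetical selectable control occupying a rectangle on the first screen.
data class Selectable(val id: String, val left: Float, val top: Float,
                      val right: Float, val bottom: Float)

class MultiSelectState(private val controls: List<Selectable>) {
    var active = false
        private set
    private val selected = mutableSetOf<String>()

    private fun hit(x: Float, y: Float): Selectable? =
        controls.firstOrNull { x in it.left..it.right && y in it.top..it.bottom }

    // Double-sided gesture detected: enter multi-selection mode and, if the
    // first operation landed on a control, mark that control as selected.
    fun enter(touchX: Float, touchY: Float) {
        active = true
        hit(touchX, touchY)?.let { selected.add(it.id) }
    }

    // Subsequent taps (the "third operation") toggle further controls.
    fun toggle(touchX: Float, touchY: Float) {
        if (!active) return
        hit(touchX, touchY)?.let { if (!selected.add(it.id)) selected.remove(it.id) }
    }

    fun selectedIds(): Set<String> = selected.toSet()
}

fun main() {
    val state = MultiSelectState(listOf(Selectable("pic1", 0f, 0f, 100f, 100f),
                                        Selectable("pic2", 0f, 110f, 100f, 210f)))
    state.enter(50f, 50f)    // double-sided gesture lands on pic1
    state.toggle(50f, 160f)  // third operation adds pic2
    println(state.selectedIds()) // [pic1, pic2]
}
```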
In one possible implementation, the first operation and the second operation are the same operation type, and the operation type includes any one of the following: pressing operation, clicking operation, double-clicking operation and long-pressing operation.
In a third aspect, the present application provides a method for controlling an electronic device having a flexible screen, where a physical form of the flexible screen includes an unfolded state and a folded state, and when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen, and specifically, the method includes: when the display interfaces of the first screen and the second screen are interfaces capable of containing icon controls and the icon controls are displayed on the display interface of the first screen, the electronic equipment detects double-sided screen gesture operation for triggering icon movement, the double-sided screen gesture operation comprises first operation on the first screen and second operation on the second screen, the operation time difference between the first operation and the second operation is smaller than a preset time difference, the touch position of the first operation corresponds to the touch position of the second operation, and the first operation corresponds to the first icon control on the first screen; in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to the second screen.
It should be noted that an icon control here refers to an icon control that can be moved, and the icon controls on the display interface may be arranged in forms such as a nine-square grid or a sixteen-square grid. Specifically, an icon control may be the icon of an application program on the home interface of the electronic device, or an icon control within a certain application, which is not limited in this application.
In the above scheme, when an icon control on the display interface of the flexible screen needs to be moved across screens, if the current physical form of the flexible screen is the folded state and the display interfaces of both screens can accommodate icon controls, the electronic device can move the icon control corresponding to the operation across screens according to the user's double-sided screen gesture operation on the flexible screen. In this way, the user can move the icon control corresponding to the operation to the other screen without unfolding the electronic device. The double-sided screen gesture operation is more convenient and faster, providing a better user experience.
In a possible implementation manner, when the flexible screen is in a folded state, an icon control is displayed on a display interface of the first screen, and a display interface of the second screen is an empty screen (i.e., no icon control exists), the electronic device detects a double-sided screen gesture operation for triggering the icon to move, and in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to the second screen.
In one possible implementation, in response to the double-sided screen gesture operation, the electronic device moving the first icon control from the first screen to the second screen includes: in response to the double-sided screen gesture operation, if the first screen faces the user, the electronic device moves the first icon control from the first screen to the second screen.
In this implementation, the electronic device further needs to detect a specific orientation of the current electronic device, and further determines whether to move the first icon control according to the specific orientation of the electronic device. That is to say, if the electronic device determines that the screen where the first icon control to be moved is located faces away from the user, the electronic device does not trigger the icon to move, so that misoperation of the user is avoided.
In a possible implementation, when the display interfaces of both the first screen and the second screen are interfaces capable of accommodating icon controls, at least one icon control is displayed on the display interface of the first screen, and at least one icon control is displayed on the display interface of the second screen, if the first operation corresponds to a first icon control on the display interface of the first screen and the second operation corresponds to a second icon control on the display interface of the second screen, then in response to the double-sided screen gesture operation: if the first screen faces the user, the electronic device moves the first icon control from the first screen to the second screen; if the second screen faces the user, the electronic device moves the second icon control from the second screen to the first screen.
With this implementation, when icon controls are displayed on the display interfaces of both screens of the electronic device and the touch operations on both screens correspond to icon controls, the electronic device can further determine, according to its current orientation, which screen's icon control should be moved: if the screen facing the user is the first screen, the electronic device moves the icon control on the first screen to the second screen, and if the screen facing the user is the second screen, the electronic device moves the icon control on the second screen to the first screen.
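The orientation-dependent choice reduces to the sketch below, where the firstFacesUser flag stands in for whatever orientation detection the device uses; this is an assumption for illustration only.

```kotlin
// When both halves show icon controls and both operations land on one, move
// the icon on the half that faces the user to the other half's page.
fun <T> moveFacingIcon(firstFacesUser: Boolean,
                       firstPage: MutableList<T>, secondPage: MutableList<T>,
                       firstIcon: T, secondIcon: T) {
    if (firstFacesUser) {
        if (firstPage.remove(firstIcon)) secondPage.add(firstIcon)
    } else {
        if (secondPage.remove(secondIcon)) firstPage.add(secondIcon)
    }
}

fun main() {
    val first = mutableListOf("Gallery", "Notes")
    val second = mutableListOf("Music")
    moveFacingIcon(true, first, second, "Notes", "Music")
    println(first)  // [Gallery]
    println(second) // [Music, Notes]
}
```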
In one possible implementation manner, when the display interfaces of the first screen and the second screen are both interfaces capable of accommodating icon controls, at least one icon control is displayed on the display interface of the first screen, and at least one icon control is displayed on the display interface of the second screen, in response to a double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to the front of the at least one icon control on the second screen, and sequentially moves the at least one icon control on the second screen backwards; alternatively, in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to behind the at least one icon control on the second screen.
This implementation defines the position to which the first icon control moves on the second screen: if the second screen already contains multiple icon controls, the first icon control can be moved to the front or to the back of those controls. For example, suppose the second screen contains two icon controls A and B arranged in sequence, with A in front of B. The electronic device can move icon control C from the first screen to the position of A, with A and B shifting backwards in sequence; alternatively, C can be moved to the position behind B, leaving the positions of A and B on the second screen unchanged.
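The two placement alternatives, moving the incoming icon in front of or behind the icons already on the second screen, amount to a simple list insertion, as sketched below with plain lists standing in for the screen pages.

```kotlin
// Move an icon from the first screen's page to the second screen's page,
// placing it either before all existing icons (they shift back by one slot)
// or after them (their positions stay unchanged).
fun <T> moveAcross(firstPage: MutableList<T>, secondPage: MutableList<T>, icon: T, toFront: Boolean) {
    if (!firstPage.remove(icon)) return          // nothing to do if the icon is not there
    if (toFront) secondPage.add(0, icon) else secondPage.add(icon)
}

fun main() {
    val firstPage = mutableListOf("C")
    val secondPage = mutableListOf("A", "B")
    moveAcross(firstPage, secondPage, "C", toFront = true)
    println(secondPage) // [C, A, B]: A and B have moved backwards in sequence
}
```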
In one possible implementation, the electronic device moving the first icon control from the first screen to the second screen includes: and the electronic equipment moves the first icon control from the first screen to the touch position of the second operation of the second screen.
It should be noted that the touch position of the second operation on the second screen may or may not already contain an icon control. If no icon control is there, the electronic device can directly move the first icon control to the touch position of the second operation on the second screen. If an icon control is there, see the scheme below.
In a possible implementation manner, if there is a third icon control at the touch position of the second screen, the moving, by the electronic device, the first icon control from the first screen to the touch position of the second operation of the second screen includes: the electronic equipment moves the first icon control from the first screen to a touch position of a second operation of the second screen, and moves a third icon control on the second screen backwards; or the electronic equipment moves the first icon control from the first screen to the touch position of the second operation of the second screen, and merges the third icon control and the first icon control on the second screen. That is to say, if there is an icon control at the touch position of the second operation of the second screen, the moved first icon control is still placed at the touch position, and the electronic device needs to move all the icon controls behind the touch position backward in sequence, or the moved first icon control and the icon control at the touch position are directly merged into the same folder.
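The two behaviours for an occupied target slot, shifting the occupants backwards or merging into one folder, can be sketched over a simple slot model; the Icon and Folder types are illustrative and not the device's launcher data structures.

```kotlin
// A slot on a page either holds a single icon or a folder of icons.
sealed class Slot
data class Icon(val name: String) : Slot()
data class Folder(val items: MutableList<Icon>) : Slot()

// Variant 1: drop the moved icon at the target index; the occupant and every
// slot behind it shift backwards by one position.
fun insertAndShift(page: MutableList<Slot>, index: Int, moved: Icon) {
    page.add(index.coerceIn(0, page.size), moved)
}

// Variant 2: merge the moved icon with the icon already at the target index
// into a single folder (or add it to an existing folder there).
fun mergeAtSlot(page: MutableList<Slot>, index: Int, moved: Icon) {
    when (val occupant = page[index]) {
        is Icon -> page[index] = Folder(mutableListOf(occupant, moved))
        is Folder -> occupant.items.add(moved)
    }
}

fun main() {
    val page = mutableListOf<Slot>(Icon("A"), Icon("B"))
    insertAndShift(page, 0, Icon("C"))
    println(page) // [Icon(name=C), Icon(name=A), Icon(name=B)]
    mergeAtSlot(page, 1, Icon("D"))
    println(page) // [Icon(name=C), Folder(items=[Icon(name=A), Icon(name=D)]), Icon(name=B)]
}
```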
In one possible implementation, the first operation and the second operation are the same operation type, and the operation type includes any one of the following: pressing operation, clicking operation, double-clicking operation and long-pressing operation.
In a fourth aspect, the present application provides a control method for an electronic device having a flexible screen, where a physical form of the flexible screen includes an unfolded state and a folded state, and when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen, and specifically, the control method includes: when a selectable control is displayed on a display interface of a first screen, the electronic equipment detects a double-sided screen gesture operation for triggering deletion of the control, the double-sided screen gesture operation comprises a first operation on the first screen and a second operation on a second screen, the operation time difference between the first operation and the second operation is smaller than a preset time difference, the touch position of the first operation corresponds to the touch position of the second operation, the first operation corresponds to the first selectable control on the first screen, and the operation types of the first operation and the second operation are sliding operations and the sliding directions are consistent; in response to the double-sided screen gesture operation, the electronic device deletes the first selectable control on the first screen.
It should be noted that the selectable control refers to a control that can be deleted on a display interface of the screen, for example, the selectable control may be a session box control in a chat list interface, or a folder control in a file list interface, or an application icon control in a main interface of the mobile phone, and the like. The sliding direction may be a horizontal direction, a vertical direction, or other directions having a certain included angle with the horizontal direction or the vertical direction, which is not limited in this application.
In the above scheme, when a selectable control on the display interface of the flexible screen needs to be deleted, if the current physical form of the flexible screen is the folded state, the electronic device can delete the selectable control corresponding to the operation according to the user's double-sided screen gesture operation on the flexible screen. In this way, the user can quickly delete a selectable control by sliding it, with a double-sided screen gesture, toward the edge of the screen or a designated area on the screen, and accidental deletion of the control by a single-screen slide is avoided.
In a possible implementation manner, when the flexible screen is in a folded state, the selectable control is displayed on the display interface of the first screen, the second screen is in a black screen state, or the second screen is in a bright screen state but the display interface of the second screen does not have the selectable control, the electronic device detects a double-sided screen gesture operation for triggering the deletion control, and in response to the double-sided screen gesture operation, the electronic device deletes the first selectable control on the first screen.
In one possible implementation, in response to the double-sided screen gesture operation, the electronic device deleting the first selectable control on the first screen includes: in response to the double-sided screen gesture operation, if the first screen faces the user, the electronic device deletes the first selectable control on the first screen. With this implementation, the electronic device also needs to detect its current orientation, and then determines, according to that orientation, whether to delete the first selectable control on the first screen. That is, if the electronic device determines that the screen on which the first selectable control to be deleted is located faces away from the user, the electronic device does not trigger the deletion, thereby avoiding misoperation by the user.
In a possible implementation, when a selectable control is displayed on the display interface of the first screen and a selectable control is displayed on the display interface of the second screen, if the first operation corresponds to a first selectable control on the first screen and the second operation corresponds to a second selectable control on the second screen, then in response to the double-sided screen gesture operation: if the first screen faces the user, the electronic device deletes the first selectable control on the first screen; if the second screen faces the user, the electronic device deletes the second selectable control on the second screen.
With this implementation, when selectable controls are displayed on the display interfaces of both screens of the electronic device and the touch operations on both screens correspond to selectable controls, the electronic device can further determine, according to its current orientation, which screen's selectable control should be deleted: if the screen facing the user is the first screen, the electronic device deletes the selectable control on the first screen, and if the screen facing the user is the second screen, the electronic device deletes the selectable control on the second screen.
In one possible implementation manner, in response to the double-sided screen gesture operation, after the electronic device deletes the first selectable control on the first screen, the method further includes: if one or more third selectable controls are arranged below the first selectable control, the electronic equipment moves the one or more third selectable controls upwards in sequence.
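The slide-based deletion of this aspect can be sketched as follows: the two slide traces must run in a consistent direction (the 30-degree tolerance is an assumed value), the targeted control is deleted, and the controls below it move up simply because the list element is removed. The conversion of the two screens' coordinates into one coordinate system is assumed to have been applied already.

```kotlin
import kotlin.math.abs
import kotlin.math.atan2

// Hypothetical slide trace: start and end point of one finger on one screen half.
data class Slide(val x0: Float, val y0: Float, val x1: Float, val y1: Float) {
    val angleDeg: Double
        get() = Math.toDegrees(atan2((y1 - y0).toDouble(), (x1 - x0).toDouble()))
}

// Two slides count as consistent when their directions differ by no more than
// the tolerance (after both are expressed in the same coordinate system).
fun directionsConsistent(a: Slide, b: Slide, toleranceDeg: Double = 30.0): Boolean {
    var diff = abs(a.angleDeg - b.angleDeg) % 360.0
    if (diff > 180.0) diff = 360.0 - diff
    return diff <= toleranceDeg
}

// Delete the control the first operation points at; removing the element makes
// the controls arranged below it move up in sequence.
fun deleteBySlide(controls: MutableList<String>, targetIndex: Int, a: Slide, b: Slide): Boolean {
    if (targetIndex !in controls.indices || !directionsConsistent(a, b)) return false
    controls.removeAt(targetIndex)
    return true
}

fun main() {
    val chats = mutableListOf("Alice", "Bob", "Carol")
    val onFirst = Slide(10f, 100f, 200f, 105f)   // roughly horizontal slide
    val onSecond = Slide(15f, 300f, 210f, 310f)  // roughly the same direction
    println(deleteBySlide(chats, 1, onFirst, onSecond)) // true
    println(chats) // [Alice, Carol]: Carol has moved up into Bob's position
}
```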
In a fifth aspect, the present application provides an electronic device, comprising: a flexible screen, one or more processors, one or more memories, one or more sensors; the flexible screen comprises a display and a touch device, the physical form of the flexible screen comprises an unfolding state and a folding state, and when the flexible screen is in the folding state, the flexible screen is divided into a main screen and an auxiliary screen; the memory stores one or more application programs and one or more programs, wherein the one or more programs include instructions that, when executed by the electronic device, cause the electronic device to perform any of the control methods described above.
In a sixth aspect, the present application provides a computer-readable storage medium having instructions stored therein, which when executed on an electronic device, cause the electronic device to perform any one of the control methods described above.
In a seventh aspect, the present application provides a computer program product comprising instructions for causing an electronic device to perform any of the control methods described above when the computer program product is run on the electronic device.
It is to be understood that the electronic device according to the fifth aspect, the computer-readable storage medium according to the sixth aspect, and the computer program product according to the seventh aspect are all configured to execute the corresponding control method provided above, and therefore, the beneficial effects achieved by the electronic device can refer to the beneficial effects in the corresponding control method provided above, and are not described herein again.
Drawings
Fig. 1 is a first schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 3a is a first schematic diagram illustrating an architecture of an operating system in an electronic device according to an embodiment of the present disclosure;
fig. 3b is a schematic diagram illustrating an architecture of an operating system in an electronic device according to an embodiment of the present application;
fig. 4 is a schematic view illustrating a holding state of a user when the user uses a double-sided screen gesture according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 6 is a first flowchart illustrating a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 7 is a first scene schematic diagram of a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 8 is a second scene schematic diagram of a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 9 is a second flowchart illustrating a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 10 is a third scene schematic diagram of a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 11 is a fourth scene schematic diagram of a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 12 is a third flowchart illustrating a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 13a is a scene schematic diagram five of a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 13b is a scene schematic diagram six of a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 13c is a scene schematic diagram seven of a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 14 is a fourth flowchart illustrating a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 15a is a scene schematic diagram eight of a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 15b is a scene schematic diagram nine of a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 16 is a scene schematic diagram ten of a control method of an electronic device with a flexible screen according to an embodiment of the present application;
fig. 17 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
The control method for an electronic device with a flexible screen provided in the embodiments of the present application may be applied to any electronic device having a flexible screen, such as a mobile phone, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a wearable device, or a virtual reality device; the embodiments of the present application impose no limitation on the specific form of the electronic device.
Taking the mobile phone 100 as an example of the above electronic device, fig. 1 shows a schematic structural diagram of the mobile phone.
The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a radio frequency module 150, a communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a flexible screen 301, a Subscriber Identity Module (SIM) card interface 195, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be a neural center and a command center of the cell phone 100, among others. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the mobile phone 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the communication module 160. For example: the processor 110 communicates with a bluetooth module in the communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the communication module 160 through the UART interface, so as to realize the function of playing music through the bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as flexible screen 301, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the camera function of the handset 100. The processor 110 and the flexible screen 301 communicate through the DSI interface to implement the display function of the mobile phone 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the flexible screen 301, the communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the mobile phone 100, and may also be used to transmit data between the mobile phone 100 and peripheral devices. It may also be used to connect an earphone and play audio through the earphone. The interface may further be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not constitute a limitation on the structure of the mobile phone 100. In other embodiments of the present application, the mobile phone 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the cell phone 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the flexible screen 301, the camera 193, the communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the mobile phone 100 can be implemented by the antenna 1, the antenna 2, the radio frequency (RF) module 150, the communication module 160, the modem processor, and the baseband processor.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The RF module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the mobile phone 100. The RF module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The RF module 150 may receive an electromagnetic wave from the antenna 1, perform processing such as filtering and amplification on the received electromagnetic wave, and transmit the processed electromagnetic wave to the modem processor for demodulation. The RF module 150 may also amplify the signal modulated by the modem processor, and convert the signal into an electromagnetic wave through the antenna 1 for radiation. In some embodiments, at least some of the functional modules of the RF module 150 may be disposed in the processor 110. In some embodiments, at least some functional modules of the RF module 150 may be disposed in the same device as at least some modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the flexible screen 301. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 110 and may be disposed in the same device as the RF module 150 or other functional modules.
The communication module 160 may provide solutions for wireless communication applied to the mobile phone 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The communication module 160 may be one or more devices integrating at least one communication processing module. The communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate it, amplify it, and convert it into electromagnetic waves via the antenna 2 to radiate it.
In some embodiments, the antenna 1 of the mobile phone 100 is coupled to the RF module 150 and the antenna 2 is coupled to the communication module 160, so that the mobile phone 100 can communicate with networks and other devices via wireless communication technologies. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The mobile phone 100 realizes the display function through the GPU, the flexible screen 301, and the application processor. The GPU is a microprocessor for image processing, connecting the flexible screen 301 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information. In this embodiment, the flexible screen 301 may include a display and a touch device. The display is used to output display content to a user, and the touch device is used to receive touch events input by the user on the flexible screen 301. It should be understood that the touch device referred to herein may be integrated with the display in the flexible screen, or may be provided separately, in either case remaining connected to the display of the flexible screen.
In the present embodiment, the physical form of the flexible screen 301 may be divided into a folded state, a stand state, and an unfolded state. In other embodiments, the physical form of the flexible screen 301 may be divided into only a folded state and an unfolded state.
As shown in fig. 2 (a), the flexible screen 301 may be displayed as a complete display area in the unfolded state, and a user may fold the screen along one or more folding lines in the flexible screen 301. The position of the folding line may be preset, or may be arbitrarily selected by the user in the flexible screen 301.
As shown in fig. 2 (b), after the user folds the flexible screen 301 along a folding line AB in the flexible screen 301, the flexible screen 301 may be divided into two display areas along the folding line AB. In this embodiment of the application, the two folded display areas may be displayed as two independent display areas. For example, the display area on the right side of the folding line AB may be referred to as the main screen 11 of the mobile phone 100, and the display area on the left side of the folding line AB may be referred to as the sub-screen 12 of the mobile phone 100. The display areas of the main screen 11 and the sub-screen 12 may be the same or different. It should be noted that the terms main screen and sub-screen are only used to distinguish the display areas on the two sides, and do not indicate the relative importance of the screens; the main screen and the sub-screen may also be referred to as a first screen and a second screen, respectively, which is not limited in this embodiment of the present application.
When the user folds the flexible screen 301, an angle is formed between the divided main screen 11 and sub-screen 12. In the embodiment of the present application, the mobile phone 100 may calculate the angle between the main screen and the sub-screen from data detected by one or more sensors (e.g., a gyroscope and an acceleration sensor). It will be appreciated that the included angle β between the main screen 11 and the sub-screen 12 is within the closed range of 0° to 180°. When the included angle β between the main screen 11 and the sub-screen 12 is greater than a first threshold (for example, 170°), the mobile phone 100 may determine that the flexible screen 301 is in the unfolded state, as shown in fig. 2 (a). When the included angle β between the main screen 11 and the sub-screen 12 is within a preset interval (for example, β is between 40° and 60°), the mobile phone 100 may determine that the flexible screen 301 is in the stand state, as shown in fig. 2 (b). Alternatively, when the included angle β between the main screen 11 and the sub-screen 12 is smaller than a second threshold (for example, 20°), the mobile phone 100 may determine that the flexible screen 301 is in the folded state, as shown in fig. 2 (c). In other embodiments where the screen states are divided into only two states, the mobile phone 100 may determine that the flexible screen 301 is in the unfolded state when the included angle β between the main screen 11 and the sub-screen 12 is greater than a third threshold (e.g., 45° or 60°); when the included angle β between the main screen 11 and the sub-screen 12 is smaller than the third threshold, the mobile phone 100 may determine that the flexible screen 301 is in the folded state.
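For illustration only, the following plain-Java sketch shows how the physical form could be derived from the included angle β. The thresholds are the example values mentioned in the paragraph above (170°, 40°–60°, and 20°) and are assumptions, not values prescribed by this application; the class and method names are likewise invented for this sketch.

```java
/** Minimal sketch: classify the flexible screen's physical form from the included angle β. */
public class ScreenFormClassifier {

    enum PhysicalForm { UNFOLDED, STAND, FOLDED, UNKNOWN }

    // Example thresholds taken from the text above; real values are implementation-specific.
    static final float UNFOLD_THRESHOLD = 170f; // "first threshold"
    static final float STAND_MIN = 40f;         // lower bound of the example stand interval
    static final float STAND_MAX = 60f;         // upper bound of the example stand interval
    static final float FOLD_THRESHOLD = 20f;    // "second threshold"

    /** beta is the included angle between the main screen and the sub-screen, in [0, 180]. */
    static PhysicalForm classify(float beta) {
        if (beta > UNFOLD_THRESHOLD) {
            return PhysicalForm.UNFOLDED;
        } else if (beta >= STAND_MIN && beta <= STAND_MAX) {
            return PhysicalForm.STAND;
        } else if (beta < FOLD_THRESHOLD) {
            return PhysicalForm.FOLDED;
        }
        // Angles between the example intervals are not assigned a form in this sketch;
        // a real implementation could keep the previous form or apply hysteresis.
        return PhysicalForm.UNKNOWN;
    }

    public static void main(String[] args) {
        System.out.println(classify(175f)); // UNFOLDED
        System.out.println(classify(50f));  // STAND
        System.out.println(classify(10f));  // FOLDED
    }
}
```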
It should be noted that, after the user folds the flexible screen 301 along the folding line AB, the main screen and the sub-screen may face each other, or the main screen and the sub-screen may face away from each other. As shown in fig. 2 (c), after the user folds the flexible screen 301, the main screen and the sub-screen face away from each other, and both are exposed to the external environment, so that the user can display content on either the main screen or the sub-screen.
In some embodiments, after the user folds the flexible screen 301, the screen of the bent portion (which may also be referred to as a side screen) may also serve as an independent display area, and in this case, the flexible screen 301 is divided into three independent display areas, namely a main screen, a sub screen and a side screen, as shown in fig. 2 (c).
In this embodiment, when the flexible screen 301 of the mobile phone 100 is in a bright screen state, for example, when the main screen 11 is in a bright screen state and the sub-screen 12 is in a black screen (or screen-off) state, or the main screen 11 is in a black screen state and the sub-screen 12 is in a bright screen state, or both the main screen 11 and the sub-screen 12 are in a bright screen state, the mobile phone 100 may determine whether to trigger a response such as a screenshot, control multi-selection, icon movement, or control deletion according to the physical form and display state of the flexible screen 301 and the touch operations of the user on the main screen 11 and the sub-screen 12.
The sensor module 180 may include one or more of a gyroscope, an acceleration sensor, a pressure sensor, an air pressure sensor, a magnetic sensor (e.g., a hall sensor), a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, a pyroelectric infrared sensor, an ambient light sensor, or a bone conduction sensor, which is not limited in this embodiment.
The cell phone 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the flexible screen 301, and the application processor, etc.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithm-based optimization on the noise, brightness, and skin tone of the image. The ISP can also optimize parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, and the optical image is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the ISP, where it is converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the mobile phone 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process digital image signals as well as other digital signals. For example, when the mobile phone 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. Handset 100 may support one or more video codecs. Thus, the handset 100 can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent recognition of the mobile phone 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the mobile phone 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile phone 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
The mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The cellular phone 100 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the cellular phone 100 receives a call or voice information, it is possible to receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal to the microphone 170C by speaking with the mouth close to the microphone 170C. The mobile phone 100 may be provided with at least one microphone 170C. In other embodiments, the mobile phone 100 may be provided with two microphones 170C to implement a noise reduction function in addition to collecting sound signals. In other embodiments, the mobile phone 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and the like.
The headphone interface 170D is used to connect a wired headphone. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The mobile phone 100 may receive a key input and generate a key signal input related to user settings and function control of the mobile phone 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the flexible screen 301. Different application scenarios (such as time reminders, received messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to be attached to or detached from the mobile phone 100. The mobile phone 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with an external memory card. The mobile phone 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the mobile phone 100 employs an eSIM, namely, an embedded SIM card. The eSIM card can be embedded in the mobile phone 100 and cannot be separated from the mobile phone 100.
The software system of the mobile phone 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application exemplifies a software structure of the mobile phone 100 by taking an Android system with a layered architecture as an example.
Fig. 3a is a block diagram of the software structure of the mobile phone 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages. As shown in fig. 3a, applications such as camera, gallery, calendar, call, map, navigation, bluetooth, music, video, short message, etc. may be installed in the application layer.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions. As shown in fig. 3a, the application framework layer may include an Input Manager Service (IMS). Certainly, the application framework layer may further include a display policy service, a Power Manager Service (PMS), a Display Manager Service (DMS), an activity manager, a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like, which are not limited in this embodiment.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the function libraries that need to be called by the Java language, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a state monitoring service, a surface manager (surface manager), a media library (Media Libraries), a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), and the like. The state monitoring service is used to determine the specific orientation of the mobile phone, the physical form of the flexible screen, and the like according to the monitoring data reported by the kernel layer. The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications. The media library supports playback and recording in a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like. The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, synthesis, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer comprises at least a display driver, a sensor driver, a TP driver, a camera driver, an audio driver, and the like, which is not limited in this embodiment of the present application.
As also shown in fig. 3a, the system library and the kernel layer, etc. below the application framework layer may be referred to as an underlying system, which includes an underlying display system for providing display services, e.g., the underlying display system includes a display driver in the kernel layer and a surface manager in the system library, etc. In addition, the bottom layer system in the application further comprises a state monitoring service for identifying the physical form change of the flexible screen, and the state monitoring service can be independently arranged in the bottom layer display system and can also be arranged in a system library and/or a kernel layer.
For example, the state monitoring service may invoke a sensor service (sensor service) to start a sensor such as a gyroscope or an acceleration sensor for detection. The state monitoring service can calculate the included angle between the current main screen and the auxiliary screen according to the detection data reported by each sensor. Therefore, the state monitoring service can determine, from the included angle between the main screen and the auxiliary screen, whether the flexible screen is in a physical form such as the unfolded state, the folded state, or the stand state. In addition, the state monitoring service may report the determined physical form to the input management service.
In some embodiments, when the state monitoring service determines that the current mobile phone is in the folded state or the stand state, the state monitoring service may further activate a sensor such as a camera, an infrared sensor, a proximity light sensor, or a touch panel (TP) to identify the specific orientation of the mobile phone. For example, the specific orientation of the mobile phone may include the main screen facing the user, the auxiliary screen facing the user, or the like. It should be understood that references herein to the main screen or the auxiliary screen or the entire flexible screen facing the user include: the main screen or the auxiliary screen or the entire flexible screen facing the user at an angle substantially parallel to the user's face, as well as facing the user at a certain inclined angle.
For example, when a touch event on the screen is received, the input management service can obtain from the underlying system the physical form and display state of the current flexible screen, the screen touch data, the specific orientation of the mobile phone, and the like. The screen touch data includes the coordinate positions, touch times, pressure values, and the like of the touch points on the main screen or the auxiliary screen. Furthermore, the input management service may determine whether to trigger a custom double-sided screen gesture event, such as a double-sided screen screenshot, control multi-selection, icon movement, or control deletion, according to the physical form, display state, and screen touch data of the flexible screen. When the input management service determines to trigger any double-sided screen gesture event, the input management service can call a system interface to report the double-sided screen gesture event to the upper-layer application, so that the upper-layer application completes the corresponding operation.
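As an illustration of the dispatch path described in the previous paragraph, the following is a hypothetical Java sketch. The class, interface, and method names (InputManagerServiceSketch, GestureListener, onTouchReport) are invented for this sketch and are not actual Android framework interfaces or the implementation of this application.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Hypothetical sketch of the reporting path described above: the input management
 * service receives data reported by the state monitoring service, decides whether a
 * double-sided screen gesture event has occurred, and reports it to the registered
 * upper-layer applications. All names here are invented for illustration.
 */
public class InputManagerServiceSketch {

    enum DoubleSidedEvent { SCREENSHOT, CONTROL_MULTI_SELECT, ICON_MOVE, CONTROL_DELETE }

    /** Callback implemented by upper-layer applications registered to respond to gesture events. */
    interface GestureListener {
        void onDoubleSidedGesture(DoubleSidedEvent event);
    }

    private final List<GestureListener> registeredApps = new ArrayList<>();

    public void registerListener(GestureListener app) {
        registeredApps.add(app);
    }

    /**
     * Called with data reported by the state monitoring service. "folded" stands in for the
     * physical form, and "gestureMatched" stands in for the time-difference and
     * position-correspondence checks described later in this document.
     */
    public void onTouchReport(boolean folded, boolean gestureMatched, DoubleSidedEvent configuredEvent) {
        if (folded && gestureMatched) {
            // Report the configured double-sided screen gesture event to every registered upper-layer app.
            for (GestureListener app : registeredApps) {
                app.onDoubleSidedGesture(configuredEvent);
            }
        }
    }
}
```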
It should be noted that the above-mentioned upper layer application refers to a registered application preset by a user and capable of responding to a double-sided screen gesture event. Specifically, the upper layer application may be an application of the mobile phone system, such as an album application, an address book application, a browser application, and the like, and may also be any third-party application, which is not limited in this application. The user can set functions of double-sided screen gesture trigger screen capture, control multi-selection, icon movement or control deletion and the like for any application according to the requirements of the user.
Fig. 3b is a schematic diagram, similar to fig. 3a, of the data flow inside the Android operating system. For example, the gyroscope and the acceleration sensor in the hardware layer can report detected data to the sensor driver, the sensor driver reports the data detected by the gyroscope and the acceleration sensor to the state monitoring service through the sensor service, and the state monitoring service can determine the included angle between the main screen and the auxiliary screen according to the data detected by the gyroscope and the acceleration sensor, so as to determine the physical form of the flexible screen. The touch device in the hardware layer can report the detected screen touch data to the state monitoring service through the TP driver, the camera in the hardware layer can report the detected data to the state monitoring service through the camera driver, and the infrared sensor in the hardware layer can report the detected data to the state monitoring service through the infrared driver. The state monitoring service can determine the specific orientation of the mobile phone according to the data reported by the touch device, the camera, or the infrared sensor. In addition, the state monitoring service can also obtain the display parameters of the flexible screen through the display driver so as to determine the display state of the flexible screen.
The state monitoring service can report the determined physical form, display state, specific orientation of the mobile phone flexible screen and screen touch data to the input management service, and finally the input management service judges whether to trigger a self-defined double-sided screen gesture event.
Based on the hardware and software structure of the mobile phone 100, the following describes a control method of an electronic device with a flexible screen provided in the present application in detail by specific embodiments.
First, the double-sided screen gesture operation used in the following embodiments is described.
In the following embodiments, the flexible screen of the mobile phone is in the folded state. For example, the holding posture of the user when performing a double-sided screen gesture operation can be seen in fig. 4. As shown in fig. 4, the user holds the mobile phone in the folded state with one hand (left hand or right hand), and can perform the double-sided screen gesture operation through the cooperation of the thumb and the index finger of that hand. For example, when the user triggers a first operation with the thumb of the right hand on the main screen 11 and triggers a second operation with the index finger of the right hand on the auxiliary screen 12, the operation time difference between the first operation and the second operation is smaller than a preset time difference, and the touch position of the first operation corresponds to the touch position of the second operation, the mobile phone may determine that the gesture operation is a double-sided screen gesture operation.
The operation time of the trigger operation of the user on the screen (the main screen or the sub-screen) refers to a start time when the finger or the palm of the user touches the screen. Accordingly, the operation time difference between the first operation and the second operation is a time difference between a start time when the finger or palm of the user touches the main screen and a start time when the finger or palm of the user touches the sub-screen. For example, the preset time difference may be set to 500ms or less.
It should be noted that the statement that the touch position of the first operation corresponds to the touch position of the second operation can be understood as meaning that the position difference between the touch positions of the first operation and the second operation is smaller than a preset position difference. Specifically, the determination may be made by any one of the following methods:
In one implementation, the mobile phone may determine the positional relationship between the coordinate origins of the main screen coordinate system and the auxiliary screen coordinate system according to the relative position relationship between the main screen and the auxiliary screen, convert the coordinate position of the touch point of the second operation on the auxiliary screen into the main screen coordinate system according to the positional relationship between the coordinate origins, calculate the position difference between the touch point of the first operation on the main screen and the converted touch point of the second operation, and determine whether the position difference is smaller than the preset position difference. For example, the preset position difference may be set to 10 dp. Fig. 5 shows the flexible screen in the folded state. The mobile phone detects a first operation of the user on the main screen 11 and a second operation on the auxiliary screen 12; the touch point of the first operation on the main screen 11 is touch point 1, and the touch point of the second operation on the auxiliary screen 12 is touch point 2. The mobile phone can convert touch point 2 on the auxiliary screen 12 onto the main screen 11 according to the relative position relationship between the main screen 11 and the auxiliary screen 12, where touch point 2 corresponds to touch point 2'. The mobile phone can then determine the position difference r according to the positions of touch point 1 and touch point 2' on the main screen 11, and further determine whether the position difference r is smaller than the preset position difference. If the position difference r is smaller than the preset position difference, the mobile phone can determine that the touch positions of the first operation and the second operation correspond to each other.
In one implementation, the mobile phone may determine a position relationship between the coordinate origins of the main screen coordinate system and the auxiliary screen coordinate system according to the relative position relationship between the main screen and the auxiliary screen, convert the coordinate position of the touch point of the first operation on the main screen into the auxiliary screen coordinate system according to the position relationship between the coordinate origins, calculate a position difference between the touch point of the first operation converted onto the auxiliary screen and the touch point of the second operation on the auxiliary screen, and determine whether the position difference is smaller than a preset position difference.
In one implementation, the mobile phone may further convert a coordinate position of a touch point of the first operation on the main screen and a coordinate position of a touch point of the second operation on the auxiliary screen to a designated coordinate system, where the designated coordinate system is a coordinate system different from the main screen coordinate system and the auxiliary screen coordinate system, and the mobile phone determines whether the position difference is smaller than a preset position difference by calculating a position difference between the coordinate position of the first operation converted to the designated coordinate system and the coordinate position of the second operation converted to the designated coordinate system.
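For illustration of the checks described above, the following plain-Java sketch combines the operation-time-difference check with the first coordinate-conversion method. It assumes, purely for the example, that the main screen and the auxiliary screen have the same size and that a folded-back touch at (x, y) on the auxiliary screen lies behind the main-screen point (width − x, y); a real device would derive this mapping from the actual relative position of the two screens, and the 500 ms and 10 dp thresholds are only the example values given in the text.

```java
/**
 * Minimal sketch of the double-sided screen gesture check described above.
 * Assumption (illustration only): the two screens are congruent, and when folded back
 * to back an auxiliary-screen touch at (x, y) corresponds to the main-screen point
 * (width - x, y).
 */
public class DoubleSidedGestureMatcher {

    static final long MAX_TIME_DIFF_MS = 500; // example preset time difference from the text
    static final float MAX_POS_DIFF_DP = 10f; // example preset position difference from the text

    static class Touch {
        final float x, y;     // touch-point coordinates in the screen's own coordinate system (dp)
        final long downTime;  // start time when the finger touched the screen (ms)
        Touch(float x, float y, long downTime) { this.x = x; this.y = y; this.downTime = downTime; }
    }

    /** Convert an auxiliary-screen touch point into the main-screen coordinate system (assumed mapping). */
    static float[] subToMain(Touch onSub, float screenWidthDp) {
        return new float[] { screenWidthDp - onSub.x, onSub.y };
    }

    /** Returns true if the two touches form a double-sided screen gesture operation. */
    static boolean isDoubleSidedGesture(Touch onMain, Touch onSub, float screenWidthDp) {
        long timeDiff = Math.abs(onMain.downTime - onSub.downTime);
        if (timeDiff >= MAX_TIME_DIFF_MS) {
            return false; // the operation time difference must be smaller than the preset time difference
        }
        float[] mapped = subToMain(onSub, screenWidthDp);
        float dx = onMain.x - mapped[0];
        float dy = onMain.y - mapped[1];
        float posDiff = (float) Math.hypot(dx, dy);
        return posDiff < MAX_POS_DIFF_DP; // the touch positions must correspond
    }

    public static void main(String[] args) {
        Touch thumbOnMain = new Touch(300f, 500f, 1_000L);
        Touch indexOnSub  = new Touch(62f, 503f, 1_120L); // maps to (298, 503) on a 360 dp wide screen
        System.out.println(isDoubleSidedGesture(thumbOnMain, indexOnSub, 360f)); // true
    }
}
```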
Example one
The following describes in detail a control method for a double-sided screen capture operation provided in an embodiment of the present application, with reference to the accompanying drawings, by taking a mobile phone as an example of an electronic device.
In this embodiment, when the flexible screen of the mobile phone is in the folded state and the flexible screen is in a bright screen state, the mobile phone detects a double-sided screen gesture operation for triggering a screenshot, where the double-sided screen gesture operation includes a first operation on the main screen and a second operation on the auxiliary screen, the operation time difference between the first operation and the second operation is smaller than the preset time difference, and the touch position of the first operation corresponds to the touch position of the second operation. In response to the double-sided screen gesture operation, the mobile phone captures a screenshot of the flexible screen in the bright screen state.
The bright screen state of the flexible screen includes the following three cases: the main screen is in a bright screen state and the auxiliary screen is in a black screen state; the main screen is in a black screen state and the auxiliary screen is in a bright screen state; or both the main screen and the auxiliary screen are in a bright screen state.
Specifically, as shown in fig. 6, when the flexible screen of the mobile phone is in a folded state and the flexible screen is in a bright screen state, after the mobile phone detects a first operation on the main screen and a second operation on the auxiliary screen, it is determined whether the touch operations on the main screen and the auxiliary screen at least satisfy the following two conditions:
the operation time difference between the first operation and the second operation is smaller than the preset time difference;
the touch positions of the first operation and the second operation correspond to each other.
If both conditions are met, the mobile phone triggers a screenshot of the corresponding screen.
Optionally, in some embodiments, the touch operations on the main screen and the auxiliary screen should further satisfy: the operation types of the first operation and the second operation are the same, as shown in fig. 6. That is, the mobile phone triggers the screenshot only when all three conditions are satisfied, and otherwise does not trigger the screenshot. It should be noted that the three conditions may be evaluated simultaneously or sequentially, and when they are evaluated sequentially, the order is not limited to the order described above; the evaluation order of the three conditions is not limited in this embodiment.
Illustratively, the type of operation includes any of: pressing operation, clicking operation, double-clicking operation and long-pressing operation.
The above-described pressing operation, also referred to as a pressure-sensitive operation, should satisfy the following conditions: the operation duration of the user's touch operation on the screen is greater than or equal to a preset duration, the coordinate position of the touch point of the touch operation does not change, and the pressing value of the touch operation is greater than or equal to a preset pressure value. The click operation should satisfy: the operation duration of the user's touch operation on the screen is smaller than the preset duration. The double-click operation should satisfy: the user triggers two click operations on the screen, and the operation time interval between the two click operations is smaller than a preset time interval. The long-press operation should satisfy the following conditions: the operation duration of the user's touch operation on the screen is greater than or equal to the preset duration, and the pressing value of the touch operation is smaller than the preset pressure value.
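For illustration only, the following Java sketch distinguishes the four operation types described above from duration, pressure, movement, and the gap since the previous click. The numeric thresholds and parameter names are assumptions; the text above only refers to a "preset duration," "preset pressure value," and "preset time interval."

```java
/**
 * Illustrative sketch of how the four operation types described above could be
 * distinguished. Thresholds are assumed values, not values taken from this application.
 */
public class OperationClassifier {

    enum OperationType { PRESS, CLICK, DOUBLE_CLICK, LONG_PRESS, OTHER }

    static final long PRESET_DURATION_MS = 400;   // assumed "preset duration"
    static final float PRESET_PRESSURE = 0.8f;    // assumed "preset pressure value"
    static final long PRESET_CLICK_GAP_MS = 300;  // assumed "preset time interval" between clicks

    /**
     * durationMs: how long the finger stayed on the screen; pressure: peak pressing value;
     * moved: whether the touch-point coordinates changed; gapSincePrevClickMs: time since the
     * previous click operation, or -1 if there was none.
     */
    static OperationType classify(long durationMs, float pressure, boolean moved, long gapSincePrevClickMs) {
        if (durationMs < PRESET_DURATION_MS) {
            // A second click arriving within the preset interval after the first forms a double click.
            return (gapSincePrevClickMs >= 0 && gapSincePrevClickMs < PRESET_CLICK_GAP_MS)
                    ? OperationType.DOUBLE_CLICK : OperationType.CLICK;
        }
        if (!moved && pressure >= PRESET_PRESSURE) {
            return OperationType.PRESS;      // pressing (pressure-sensitive) operation
        }
        if (pressure < PRESET_PRESSURE) {
            return OperationType.LONG_PRESS; // long touch that stays below the pressure threshold
        }
        return OperationType.OTHER;
    }
}
```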
In some embodiments, when the mobile phone determines to trigger the screenshot, the mobile phone may further determine the bright screen state of the flexible screen before the user triggered the double-sided screen gesture operation, and capture the screen that was in the bright screen state (a single screen or the entire flexible screen). As shown in fig. 6, when the main screen is in a bright screen state and the auxiliary screen is in a black screen state, the mobile phone can capture a screenshot of the main screen in response to the double-sided screen gesture operation; when the main screen is in a black screen state and the auxiliary screen is in a bright screen state, the mobile phone can capture a screenshot of the auxiliary screen in response to the double-sided screen gesture operation; and when both the main screen and the auxiliary screen are in a bright screen state, the mobile phone can capture a screenshot of the entire flexible screen in response to the double-sided screen gesture operation.
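The selection of the screenshot target from the display state can be summarized, purely as an illustrative sketch, by the following Java snippet; the class and enum names are invented for the example.

```java
/**
 * Illustrative sketch of the screenshot-target selection described above (fig. 6):
 * which part of the flexible screen is captured depends on which screens were lit
 * before the double-sided screen gesture operation.
 */
public class ScreenshotTargetSelector {

    enum Target { MAIN_SCREEN, AUXILIARY_SCREEN, WHOLE_FLEXIBLE_SCREEN, NONE }

    static Target select(boolean mainScreenLit, boolean auxiliaryScreenLit) {
        if (mainScreenLit && auxiliaryScreenLit) return Target.WHOLE_FLEXIBLE_SCREEN;
        if (mainScreenLit)                       return Target.MAIN_SCREEN;
        if (auxiliaryScreenLit)                  return Target.AUXILIARY_SCREEN;
        return Target.NONE; // both screens off: no bright screen to capture
    }
}
```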
Based on this scheme, the user does not need to pull down a notification bar or operate a control (such as a virtual button) on the mobile phone screen and tap a corresponding screenshot control to capture the screen content; instead, the user can directly capture a screenshot of the flexible screen using the double-sided screen gesture operation. The mobile phone can intelligently identify the current physical form (folded state or unfolded state), display state (bright screen state or black screen state), and double-sided screen gesture operation of the flexible screen, and perform the corresponding screenshot, thereby providing a better use experience for the user.
The usage scenario of the double-sided screen shot is described in detail below with reference to fig. 7 and 8.
In the first scenario, the flexible screen of the mobile phone is in a folded state, the main screen is in a bright screen state, the auxiliary screen is in a black screen state, and a user can perform screen capture on the main screen in the bright screen state through a double-sided screen gesture operation.
Illustratively, as shown in fig. 7, the mobile phone includes a flexible screen that can be divided into a main screen 11 and a sub-screen 12, the flexible screen being currently in a folded state. The main screen 11 faces the user and is in a bright screen state, the main screen 11 displays a main interface of the mobile phone at present, 6 application icons are displayed on the main interface, and the auxiliary screen 12 faces away from the user and is in a black screen state. Based on this, if the touch device on the flexible screen detects a first operation of the user on the primary screen and a second operation on the secondary screen, the touch device reports a touch event to a state monitoring service in the application framework layer. After receiving the touch event, the state monitoring service can call the sensor service, start the sensor corresponding to the main screen and the sensor corresponding to the auxiliary screen to detect, and can determine that the physical form of the current flexible screen is the folding state according to the detection data. Meanwhile, after receiving the touch event, the state monitoring service may further obtain display parameters of the primary screen and the secondary screen through the display driver, and determine the display state of the flexible screen before the touch event occurs, for example, the primary screen 11 shown in fig. 7 is in a bright screen state, and the secondary screen 12 is in a dark screen state. In addition, the state monitoring service may obtain touch data corresponding to the touch event through the touch device, where the touch data includes specific positions of touch points on the main screen and the auxiliary screen, operation time (start time of operation), operation duration, pressing pressure, and the like. In summary, the state monitoring service may send, to the input management service in the application framework layer, that the physical form of the current flexible screen is the folded state, and the display state of the flexible screen before the touch event is that the main screen is in the bright screen state, the sub-screen is in the black screen state, and the touch data on the main screen and the sub-screen.
Furthermore, the input management service can determine whether to perform a corresponding screenshot according to the physical form, the display state and the touch data of the flexible screen. For example, when the flexible screen is in a folded state, the main screen is in a bright screen state, the auxiliary screen is in a black screen state, an operation time difference between a first operation on the main screen and a second operation on the auxiliary screen is smaller than a preset time difference, the touch positions of the first operation and the second operation correspond, and the operation types of the first operation and the second operation are press operations, the input management service may determine that a screenshot of the main screen is currently required. Subsequently, the input management service sends a screenshot event, which is a screenshot of the home screen, to the upper application. As shown in fig. 7, the upper application performs screenshot on the main interface currently displayed on the main screen, and then the main screen 11 displays a screenshot preview interface, where the screenshot preview interface includes processing controls that can operate on the screenshot, such as "save", "edit", "cancel" and other processing controls shown in fig. 7, and the user can save, edit or delete the screenshot on the main screen 11. The processing control is only an example, and may further include other processing controls, such as a sharing control, and the like, which is not limited in this embodiment of the present application.
In this way, when the flexible screen is in the folded state, the main screen is in the bright screen state, and the auxiliary screen is in the black screen state, if the mobile phone detects a first operation of the user on the main screen and a second operation on the auxiliary screen and determines that these operations constitute a double-sided screen gesture operation, the mobile phone can automatically capture a screenshot of the main screen. This allows the user to perform a quick screenshot operation on the folded flexible screen and improves the user experience when using a foldable mobile phone.
In the second scenario, the flexible screen of the mobile phone is in a folded state, the main screen is in a black screen state, the auxiliary screen is in a bright screen state, and a user can perform screen capture on the auxiliary screen in the bright screen state through a double-sided screen gesture operation. The implementation principle and technical effect are similar to those of the first scenario of this embodiment, and are not described herein again.
In the third scenario, the flexible screen of the mobile phone is in a folded state, the main screen and the auxiliary screen are both in a bright screen state, and a user can perform screen capture on the whole flexible screen in the bright screen state through a double-sided screen gesture operation.
Illustratively, as shown in fig. 8, the flexible screen is currently in the folded state; the main screen 11 of the flexible screen faces the user and is in a bright screen state, currently displaying a short message list interface, while the auxiliary screen 12 faces away from the user and is also in a bright screen state, currently displaying a chat interface with the contact Bob from the short message list. Unlike the first scenario, the display state of the flexible screen here is that both screens are bright at the same time. The implementation process is similar to that of the first scenario: when the flexible screen is in the folded state, the main screen and the auxiliary screen are both in a bright screen state, the operation time difference between the first operation on the main screen and the second operation on the auxiliary screen is smaller than the preset time difference, the touch positions of the first operation and the second operation correspond to each other, and the operation types of the first operation and the second operation are both pressing operations, the input management service can determine that the entire flexible screen currently needs to be captured. Subsequently, the input management service sends a screenshot event for capturing the entire screen to the upper-layer application. As shown in fig. 8, the upper-layer application captures the display interface of the entire flexible screen (the display interfaces of the main screen 11 and the auxiliary screen 12). Similar to the first scenario, the main screen 11 then displays a screenshot preview interface, which includes processing controls that can operate on the screenshot, and the user can save, edit, or delete the screenshot of the entire flexible screen on the main screen 11.
To sum up, after the mobile phone performs screen capture on the flexible screen in the bright screen state, the mobile phone may display a screen capture preview interface on the screen (the main screen or the auxiliary screen) in the bright screen state. For example, the screenshot preview interface may include at least one of the following processing controls: save controls, edit controls, share controls, or cancel controls.
Correspondingly, after the screenshot preview interface is displayed, when the mobile phone detects a third operation for selecting one of the processing controls on the screenshot preview interface, the mobile phone can, in response to the third operation, perform the processing corresponding to the selected processing control on the screenshot, such as saving, editing, sharing, or canceling.
In other embodiments, the mobile phone may determine whether to trigger the screenshot according to the physical form, display state, touch data, and specific orientation of the current flexible screen. Specifically, when the mobile phone determines that the flexible screen is in a folded state, the main screen and the auxiliary screen are both in a bright screen state, and the first operation on the main screen and the second operation on the auxiliary screen are double-sided screen gesture operations, the mobile phone can further perform screen capture on the corresponding screen according to the specific orientation of the current mobile phone, for example, the current screen facing the user is the main screen, and even if the whole flexible screen is in the bright screen state, the mobile phone only performs screen capture on the main screen facing the user.
Specifically, the mobile phone may further call a camera, an infrared sensor, a proximity light sensor, or a touch device to identify the specific orientation of the mobile phone. For example, cameras may be respectively installed on the main screen and the auxiliary screen of the mobile phone, and if the camera of the main screen captures face information while the camera of the auxiliary screen does not, the state monitoring service may determine that the flexible screen in the current folded state has the main screen facing the user. For another example, infrared sensors may be respectively installed on the main screen and the auxiliary screen of the mobile phone, and if the infrared sensor of the auxiliary screen captures an infrared signal radiated by a human body while the infrared sensor of the main screen does not, the state monitoring service may determine that the flexible screen in the current folded state has the auxiliary screen facing the user. For another example, the mobile phone may further determine, by using a preset grip algorithm, the grip posture with which the user is currently holding the mobile phone according to the touch positions of the user on the main screen and/or the auxiliary screen reported by the touch device. Then, in combination with the grip posture, the state monitoring service can also determine the specific orientation of the mobile phone. For example, after the touch device detects a touch event, the coordinates of the touch points may be reported to the state monitoring service. The state monitoring service determines the grip posture of the mobile phone by counting the positions and the number of the touch points on the touch device. For example, if it is detected that the number of touch points falling on the main screen is greater than a preset value, indicating that the user's fingers and palm are gripping the main screen, the screen facing the user at this time is the auxiliary screen. Correspondingly, if it is detected that the number of touch points falling on the auxiliary screen is greater than the preset value, indicating that the user's fingers and palm are gripping the auxiliary screen, the screen facing the user at this time is the main screen.
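As an illustration of the grip-based check in the previous paragraph, the following Java sketch counts the touch points reported on each screen. The threshold value is an assumption (the text above only refers to a "preset value"), and a real implementation could combine this result with camera or infrared sensor data.

```java
/**
 * Illustrative sketch of the grip-based orientation estimate described above:
 * the screen being gripped by the fingers and palm faces away from the user.
 */
public class OrientationEstimator {

    enum Facing { MAIN_SCREEN_FACES_USER, AUXILIARY_SCREEN_FACES_USER, UNKNOWN }

    static final int GRIP_POINT_THRESHOLD = 3; // assumed "preset value" of gripping touch points

    /** mainScreenPoints / auxScreenPoints: number of touch points currently reported on each screen. */
    static Facing estimate(int mainScreenPoints, int auxScreenPoints) {
        if (mainScreenPoints > GRIP_POINT_THRESHOLD) {
            // Fingers and palm grip the main screen, so the auxiliary screen faces the user.
            return Facing.AUXILIARY_SCREEN_FACES_USER;
        }
        if (auxScreenPoints > GRIP_POINT_THRESHOLD) {
            // Fingers and palm grip the auxiliary screen, so the main screen faces the user.
            return Facing.MAIN_SCREEN_FACES_USER;
        }
        return Facing.UNKNOWN;
    }
}
```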
It should be noted that the mobile phone can use one or more of the above sensors, alone or in combination, to identify the specific orientation of the mobile phone.
In the above embodiment, because the determination of the specific orientation of the mobile phone is added, even if the entire screen of the mobile phone is in the bright screen state in the folded state, the mobile phone captures only the single screen (the main screen or the auxiliary screen) facing the user after detecting the double-sided screen gesture operation.
In still other embodiments, the mobile phone may also trigger a screenshot of the corresponding screen in response to any of the following operations: the user double-clicks the power key, double-clicks a virtual key on the screen, presses the power key and a volume key at the same time, and the like. After determining the display state of the flexible screen before the operation, the mobile phone captures the display interface of the screen (the main screen, the auxiliary screen, or the entire flexible screen) that is in the bright screen state. It should be noted that the physical form of the flexible screen is not limited in this example; that is, the flexible screen may be in the unfolded state, the folded state, or the stand state.
Illustratively, Table 1 shows how the mobile phone determines whether to trigger a screenshot according to the physical form of the flexible screen, the display state, the touch data on the main screen and the auxiliary screen, and the specific orientation of the mobile phone.
TABLE 1
(Table 1 is provided as an image in the original publication and is not reproduced as text here.)
It should be noted that "do not trigger the screenshot" in Table 1 may be understood as meaning that the mobile phone does not respond (i.e., the mobile phone performs no processing), or that the mobile phone processes the operations as single-screen gesture operations. For example, assume it is determined that the flexible screen of the mobile phone is in the folded state, the main screen is bright, the auxiliary screen is dark, the touch positions of the first operation on the main screen and the second operation on the auxiliary screen correspond to each other, and both operations are pressing operations. If the operation time difference between the first operation and the second operation is greater than or equal to the preset time difference, the mobile phone may perform corresponding event processing separately according to the user's operations on the main screen and the auxiliary screen. For example, when the user presses an application icon on the main interface of the main screen, the mobile phone may respond to the pressing operation on the main screen and display the shortcut functions of the application icon; and when the user presses the auxiliary screen in the black screen state, the mobile phone may wake up the auxiliary screen in response to the pressing operation on the auxiliary screen.
It should be noted that, in this embodiment of the present application, when the main screen and/or the secondary screen of the mobile phone is in the bright screen state, the display interface of that screen may be a lock-screen interface or a specific interface in a certain application (for example, WeChat or an album), which is not limited in this embodiment of the present application.
Example two
The following describes in detail a control method for a double-sided screen multi-selection operation provided in an embodiment of the present application with reference to the drawings by taking a mobile phone as an example of an electronic device.
In this embodiment, when the flexible screen of the mobile phone is in a folded state and at least two selectable controls are displayed on the display interface of the main screen of the flexible screen, the mobile phone detects a double-sided screen gesture operation for triggering multiple selection. The double-sided screen gesture operation includes a first operation on the main screen and a second operation on the secondary screen, where the operation time difference between the first operation and the second operation is smaller than a preset time difference and the touch position of the first operation corresponds to the touch position of the second operation. In response to the double-sided screen gesture operation, the mobile phone controls the display interface of the main screen to enter a multi-selection mode, in which the at least two selectable controls on the display interface of the main screen can all be selected.
It should be noted that the selectable control refers to a control that can be selected and processed on the display interface of the screen. For example, the selectable control may be a picture control on a picture browsing interface in an album, and the user may select multiple picture controls at the same time to splice multiple pictures (i.e., edit pictures), or send multiple pictures to friends in batch (i.e., share pictures), or add multiple pictures to a new album folder (i.e., sort pictures). The multi-selection mode is an operation mode in which a user can select at least two selectable controls on a display interface of a screen and perform batch processing.
Specifically, as shown in fig. 9, when the flexible screen of the mobile phone is in a folded state and the display interface of the main screen of the flexible screen displays at least two selectable controls, after detecting a first operation on the main screen and a second operation on the sub-screen, the mobile phone may determine whether the touch operations on the main screen and the sub-screen at least satisfy the following two conditions according to the touch positions and the operation times of the first operation and the second operation on the screen:
the operation time difference between the first operation and the second operation is smaller than the preset time difference;
the touch positions of the first operation and the second operation correspond to each other.
And if the two conditions are met, the mobile phone controls the display interface of the main screen to enter a multi-selection mode.
Optionally, in some embodiments, the touch operations on the primary screen and the secondary screen should further satisfy: the operation types of the first operation and the second operation are the same, as shown in fig. 9. That is, the mobile phone triggers the display interface of the main screen to enter the multi-selection mode only when the three conditions are all satisfied, otherwise the display interface of the main screen does not enter the multi-selection mode.
It should be noted that the three conditions may be judged simultaneously or sequentially; when they are judged sequentially, the order is not limited to the order described above, and this embodiment does not limit the order in which the three conditions are judged.
Illustratively, the operation type includes any of the following: a pressing operation, a clicking operation, a double-clicking operation, or a long-pressing operation. For the definition and judgment of each operation type, see the description above; details are not repeated here.
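For illustration, the conditions discussed above can be checked as sketched below. The snippet is a hypothetical Kotlin reconstruction, not code from the application; the field names and both threshold values are assumptions, and the position-correspondence test is simplified to a plain distance check.

```kotlin
// Illustrative sketch only: checks the conditions for recognizing a
// double-sided-screen gesture. Names and threshold values are assumptions.

enum class OpType { PRESS, CLICK, DOUBLE_CLICK, LONG_PRESS }

data class ScreenOp(val x: Float, val y: Float, val timeMillis: Long, val type: OpType)

fun isDoubleSidedGesture(
    first: ScreenOp,                  // operation on the main screen
    second: ScreenOp,                 // operation on the secondary screen
    maxTimeDiffMillis: Long = 200,    // preset time difference (assumed value)
    maxPositionOffsetPx: Float = 80f, // how closely the positions must align (assumed)
    requireSameType: Boolean = true
): Boolean {
    // Condition 1: the two operations are close enough in time.
    val closeInTime =
        kotlin.math.abs(first.timeMillis - second.timeMillis) < maxTimeDiffMillis
    // Condition 2: the touch positions correspond, i.e. roughly back to back on the
    // two halves of the folded screen. Depending on how the touch device reports
    // coordinates, one axis may need to be mirrored first; that is omitted here.
    val dx = first.x - second.x
    val dy = first.y - second.y
    val positionsCorrespond = dx * dx + dy * dy <= maxPositionOffsetPx * maxPositionOffsetPx
    // Optional condition 3: the operation types are the same.
    val typesMatch = !requireSameType || first.type == second.type
    return closeInTime && positionsCorrespond && typesMatch
}
```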
In some embodiments, when determining that the display interface of the main screen should enter the multi-selection mode, the mobile phone may further determine whether the touch position of the first operation on the main screen corresponds to a first selectable control, as shown in fig. 9. The first selectable control is one of the selectable controls displayed on the main screen. If the touch position of the first operation corresponds to the first selectable control, then in response to the double-sided screen gesture operation the mobile phone may control the display interface of the main screen to enter the multi-selection mode and display the first selectable control as selected. In this way, the user can directly select one selectable control on the main screen while triggering the display interface of the main screen to enter the multi-selection mode.
In this embodiment, the display interface of the main screen of the flexible screen displays at least two selectable controls, which includes the following situations: in the first case, the display interface of the main screen of the flexible screen displays at least two selectable controls (i.e., the main screen is in a bright screen state), and the auxiliary screen of the flexible screen is in a black screen state. In the second case, the display interface of the main screen of the flexible screen displays at least two selectable controls, and the secondary screen of the flexible screen is in a bright screen state, but the display interface of the secondary screen does not have selectable controls. In the third case, at least two selectable controls are displayed on the display interfaces of the main screen and the auxiliary screen of the flexible screen.
In the first two cases, when the folded mobile phone detects the double-sided screen gesture operation for triggering multiple selection, the mobile phone can control the display interface of the main screen to enter the multi-selection mode in response to that operation.
In the third case, when the folded mobile phone detects the double-sided screen gesture operation for triggering multiple selection, the mobile phone may, in response to that operation, control the display interface of the screen facing the user to enter the multi-selection mode, where the screen facing the user is the main screen or the secondary screen. That is, when selectable controls are displayed on the display interfaces of both screens of the folded mobile phone, the mobile phone determines which screen's display interface enters the multi-selection mode according to its current specific orientation: if the screen facing the user is the main screen, the mobile phone controls the display interface of the main screen to enter the multi-selection mode; if the screen facing the user is the secondary screen, the mobile phone controls the display interface of the secondary screen to enter the multi-selection mode.
In some embodiments, the mobile phone controlling the display interface of the main screen to enter the multi-selection mode in response to the double-sided screen gesture operation includes: in response to the double-sided screen gesture operation, if the main screen is the screen facing the user, the mobile phone controls the display interface of the main screen to enter the multi-selection mode. In this implementation, the mobile phone also needs to detect its current specific orientation and then determines, according to that orientation, whether to control the display interface of the main screen to enter the multi-selection mode. That is, if the screen to be controlled faces away from the user, the mobile phone does not trigger the corresponding response, thereby avoiding misoperation by the user.
Illustratively, suppose the mobile phone is in the folded state, at least two selectable controls are displayed on the display interface of the main screen, and the secondary screen is in the black screen state, or is in the bright screen state but displays no selectable control on its display interface. When the mobile phone detects the double-sided screen gesture operation, it needs to confirm the current specific orientation of the mobile phone; if the screen facing the user is not the main screen on which the selectable controls are displayed, the mobile phone will not trigger the display interface of the main screen to enter the multi-selection mode.
In some embodiments, after the mobile phone controls the display interface of the main screen to enter the multi-selection mode in response to the double-sided screen gesture operation, the method further includes: the mobile phone detects a third operation for triggering selection of a selectable control on the display interface of the main screen, where the operation type of the third operation is a click operation or a sliding operation; and in response to the third operation, the mobile phone displays the one or more controls corresponding to the third operation as selected. Through this implementation, the user can select one or more additional selectable controls to complete the multi-selection operation.
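For illustration, the behaviour of the multi-selection mode described in this embodiment can be modelled as a small selection set, as in the following hypothetical Kotlin sketch; the class and method names are assumptions made for the example and are not taken from the original disclosure.

```kotlin
// Illustrative sketch only: once the multi-selection mode is active, a click
// toggles one control and a slide can sweep several. Names are assumptions.

class MultiSelectSession {
    private val selected = linkedSetOf<String>()   // ids of selected controls

    fun enter(initialControlId: String? = null) {
        selected.clear()
        // If the first operation landed on a selectable control, show it as
        // selected immediately when the multi-selection mode is entered.
        initialControlId?.let { selected.add(it) }
    }

    // Third operation, click variant: toggles the selection of one control.
    fun onClick(controlId: String) {
        if (!selected.add(controlId)) selected.remove(controlId)
    }

    // Third operation, slide variant: adds every control swept by the slide.
    fun onSlideOver(controlIds: List<String>) {
        selected.addAll(controlIds)
    }

    fun selectedControls(): Set<String> = selected
}
```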
Based on the scheme, when a user needs to perform batch processing on at least two selectable controls on the display interface of the folded flexible screen, the mobile phone can control the display interface of the flexible screen of the mobile phone to rapidly enter a multi-selection mode according to the detected gesture operation of the double-sided screen. In this way, the user need not enter the multiple selection mode by clicking on a "select" control at a specified location (e.g., the upper left corner of the screen). The double-sided screen gesture operation is more convenient and faster, and better use experience is provided for users.
The following describes in detail a usage scenario of the double-sided screen multi-selection operation with reference to fig. 10 and 11.
In the first scenario, the flexible screen of the mobile phone is in the folded state, at least two selectable controls are displayed on the display interface of the main screen (the main screen is in the bright screen state), and the secondary screen is in the black screen state; the user can trigger the display interface of the main screen to enter the multi-selection mode through a double-sided screen gesture operation and thereby complete the multi-selection operation.
Illustratively, as shown in fig. 10, the flexible screen of the mobile phone is divided into a main screen 11 and a secondary screen 12, and the flexible screen is currently in the folded state. The main screen 11 faces the user and is in the bright screen state; its current display interface is a picture browsing interface on which 9 picture controls are displayed. The secondary screen 12 faces away from the user and is in the black screen state. Based on this, if the touch device on the flexible screen detects a first operation of the user on the main screen and a second operation on the secondary screen, then, similarly to the above embodiment, the state monitoring service may acquire the current physical form and display state of the flexible screen and the touch data on the main screen and the secondary screen, and send them to the input management service in the application framework layer.
Furthermore, the input management service can determine whether to trigger the display interface of the main screen to enter the multi-selection mode according to the physical form, the display state, and the touch data of the flexible screen. For example, when the flexible screen is in the folded state, at least two selectable controls are displayed on the display interface of the main screen, the secondary screen is in the black screen state, the operation time difference between a first operation on the main screen and a second operation on the secondary screen is smaller than the preset time difference, the touch positions of the first operation and the second operation correspond, and both operations are click operations, the input management service may determine that the display interface of the main screen needs to enter the multi-selection mode. The input management service then sends an event to the upper-layer application instructing the display interface of the main screen to enter the multi-selection mode. In some embodiments, the input management service may further determine whether the first operation on the main screen corresponds to a selectable control on the display interface of the main screen. As shown in fig. 10, the first operation corresponds to the picture control in the second row and first column of the picture browsing interface; the upper-layer application triggers the display interface of the main screen to enter the multi-selection mode according to the event indication from the input management service and displays the picture selected by the user on the main screen as selected.
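The reporting chain described above (touch device, state monitoring service, input management service, upper-layer application) can be pictured with the following hypothetical Kotlin sketch. All interface and class names are assumptions; the real services in the application framework layer are not disclosed at this level of detail.

```kotlin
// Illustrative sketch only: models the chain in which a state monitoring
// service forwards the screen form, display state and gesture result to an
// input management service, which then notifies the foreground application.

enum class ScreenForm { UNFOLDED, FOLDED, SUPPORT }

data class ScreenSnapshot(
    val form: ScreenForm,
    val mainScreenOn: Boolean,
    val secondaryScreenOn: Boolean
)

interface UpperLayerApp {
    fun onEnterMultiSelectMode(selectedControlId: String?)
}

class InputManagementService(private val app: UpperLayerApp) {
    fun onDoubleSidedGesture(snapshot: ScreenSnapshot, hitControlId: String?) {
        // In this simplified model, only the folded form with the main screen
        // lit triggers the multi-selection mode of the main screen.
        if (snapshot.form == ScreenForm.FOLDED && snapshot.mainScreenOn) {
            app.onEnterMultiSelectMode(hitControlId)
        }
    }
}

class StateMonitoringService(private val input: InputManagementService) {
    fun report(snapshot: ScreenSnapshot, hitControlId: String?) {
        // Forward the collected state; the input service decides the response.
        input.onDoubleSidedGesture(snapshot, hitControlId)
    }
}
```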
Thus, when the flexible screen is in the folded state, at least two selectable controls are displayed on the display interface of the main screen (the main screen is in the bright screen state), and the secondary screen is in the black screen state, if the mobile phone detects a first operation of the user on the main screen and a second operation on the secondary screen and determines that these operations constitute a double-sided screen gesture operation, the display interface of the main screen can quickly enter the multi-selection mode. If the first operation on the main screen also corresponds to a selectable control, the mobile phone can at the same time display that selectable control as selected. This makes it convenient for the user to perform the multi-selection operation quickly on the folded flexible screen and improves the user experience of using a foldable mobile phone.
In a second scenario, the flexible screen of the mobile phone is in the folded state, at least two selectable controls are displayed on the display interface of the secondary screen (the secondary screen is in the bright screen state), and the main screen is in the black screen state; the user can perform the multi-selection operation on the selectable controls of the bright secondary screen through a double-sided screen gesture operation. The implementation principle and technical effects are similar to those of the first scenario of this embodiment and are not described here again.
In a third scenario, the flexible screen of the mobile phone is in the folded state and at least two selectable controls are displayed on the display interfaces of both the main screen and the secondary screen; in response to the double-sided screen gesture operation, the mobile phone controls the display interface of the screen facing the user to enter the multi-selection mode, in which the user can perform the multi-selection operation.
Illustratively, as shown in fig. 11, the flexible screen is currently in the folded state. The main screen 11 of the flexible screen faces the user and currently displays a main interface containing 6 application icon controls; the secondary screen 12 faces away from the user and currently displays an album browsing interface containing 12 picture controls. Based on this, after receiving the user's touch operations on the main screen and the secondary screen, the state monitoring service sends the current physical form, display state, and touch data of the flexible screen, as well as the specific orientation of the mobile phone, to the input management service in the application framework layer. For how the state monitoring service determines the specific orientation of the mobile phone, see the above embodiments. Furthermore, the input management service can determine, according to the physical form, the display state, the touch data of the flexible screen, and the specific orientation of the mobile phone, which screen's display interface should enter the multi-selection mode. As shown in fig. 11, the screen currently facing the user is the main screen 11, so the mobile phone controls the display interface of the main screen 11 to enter the multi-selection mode; because the first operation on the main screen 11 corresponds to the application icon 4, the application icon 4 is displayed as selected when the display interface of the main screen 11 enters the multi-selection mode, while the secondary screen 12 facing away from the user does not respond (its display interface is unchanged).
Illustratively, Table 2 shows how the mobile phone determines whether to trigger the interface to enter the multi-selection mode according to the physical form of the flexible screen, the display state, the touch data on the main screen and the secondary screen, and the specific orientation of the mobile phone.
TABLE 2
It should be noted that, for both "the display interface of the main screen enters the multi-selection mode" and "the display interface of the secondary screen enters the multi-selection mode" in Table 2, the touch operations on the main screen and the secondary screen are not required to correspond to a selectable control on the interface (that is, the touch operations may fall on a blank position of the display interface). If the touch operation on the main screen or the secondary screen does correspond to a selectable control on the interface, that selectable control can be directly displayed as selected when the display interface of the main screen or the secondary screen enters the multi-selection mode.
It should be noted that the precondition for triggering the display interface of a screen (the main screen and/or the secondary screen) to enter the multi-selection mode in this embodiment of the present application is that at least two selectable controls are displayed on the display interface of that screen. The display interface of the screen may be a specific interface of an application, for example, a picture browsing interface of an album application on which a plurality of picture controls are displayed, or it may be the main interface of the mobile phone, on which a plurality of application icon controls are displayed.
Example three
The following describes in detail a control method for a double-sided screen icon moving operation provided in an embodiment of the present application with reference to the drawings by taking a mobile phone as an example of an electronic device.
In this embodiment, when the flexible screen of the mobile phone is in a folded state, the display interfaces of the main screen and the secondary screen are both interfaces capable of accommodating icon controls (the main screen and the secondary screen are both in the bright screen state), and an icon control is displayed on the display interface of the main screen, the mobile phone detects a double-sided screen gesture operation for triggering an icon move. The double-sided screen gesture operation includes a first operation on the main screen and a second operation on the secondary screen, where the operation time difference between the first operation and the second operation is smaller than a preset time difference, the touch position of the first operation corresponds to the touch position of the second operation, and the first operation corresponds to a first icon control on the main screen. In response to the double-sided screen gesture operation, the mobile phone moves the first icon control from the main screen to the secondary screen.
It should be noted that an icon control here refers to an icon control that can be moved, and the icon controls on the display interface may be arranged in forms such as a nine-square grid or a sixteen-square grid. Specifically, the icon control may be an application icon control on the main interface of the mobile phone, or an icon control within a certain application, which is not limited in this application.
Specifically, as shown in fig. 12, when the flexible screen of the mobile phone is in a folded state, the display interfaces of the main screen and the sub-screen of the flexible screen are both interfaces capable of accommodating the icon control, and the icon control is displayed on the display interface of the main screen, after the mobile phone detects a first operation on the main screen and a second operation on the sub-screen, it can be determined whether the touch operations on the main screen and the sub-screen at least satisfy the following three conditions according to the touch positions and the operation times of the first operation and the second operation on the screen:
the operation time difference between the first operation and the second operation is smaller than the preset time difference;
the touch positions of the first operation and the second operation correspond to each other;
the first operation corresponds to a first icon control on the home screen.
And if the three conditions are met, the mobile phone moves the first icon control from the main screen to the auxiliary screen.
Optionally, in some embodiments, the touch operations on the main screen and the secondary screen should further satisfy the condition that the operation types of the first operation and the second operation are the same, as shown in fig. 12. That is, the mobile phone triggers the first icon control to be moved from the main screen to the secondary screen only when it determines that all four of the above conditions are satisfied. It should be noted that the four conditions may be judged simultaneously or sequentially; when they are judged sequentially, the order is not limited to the order described above, and this embodiment does not limit the order in which the four conditions are judged.
Illustratively, the operation type includes any of the following: a pressing operation, a clicking operation, a double-clicking operation, or a long-pressing operation. For the definition and judgment of each operation type, see the description above; details are not repeated here.
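For illustration, the additional condition that the first operation must correspond to an icon control amounts to a hit test of the touch position against the icon bounds, as in the hypothetical Kotlin sketch below; the geometry representation and all names are assumptions made for the example.

```kotlin
// Illustrative sketch only: checks whether the first operation on the main
// screen lands on a movable icon control. Names are assumptions.

data class IconControl(
    val id: String,
    val left: Float, val top: Float,
    val right: Float, val bottom: Float
)

// Returns the icon under the touch position, or null if the touch falls on a
// blank area, in which case the icon-move gesture is not triggered.
fun hitTestIcon(x: Float, y: Float, icons: List<IconControl>): IconControl? =
    icons.firstOrNull { x in it.left..it.right && y in it.top..it.bottom }
```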
In some embodiments, in response to the dual-sided screen gesture operation, the cell phone moving the first icon control from the primary screen to the secondary screen includes: responding to the gesture operation of the double-sided screen, and if the main screen faces the user, the mobile phone moves the first icon control from the main screen to the auxiliary screen. In this implementation, the mobile phone further needs to detect a specific orientation of the current mobile phone, and further determines whether to move the first icon control according to the specific orientation of the mobile phone. That is to say, if the mobile phone determines that the screen where the first icon control to be moved is located faces away from the user, the mobile phone does not trigger the icon to move, so that misoperation of the user is avoided.
It should be noted that the situation in which the display interfaces of the main screen and the secondary screen are both interfaces capable of accommodating icon controls and icon controls are displayed on the display interface of the main screen includes the following two cases. In the first case, icon controls are displayed on the display interface of the main screen (i.e., the main screen is in the bright screen state), and the display interface of the secondary screen is empty (i.e., contains no icon control). In this case, when the mobile phone detects the double-sided screen gesture operation for triggering an icon move, the mobile phone can move the first icon control from the main screen to the secondary screen in response to that operation. In the second case, icon controls are displayed on the display interfaces of both the main screen and the secondary screen. In this case, when the mobile phone detects the double-sided screen gesture operation for triggering an icon move, if only the first operation on the main screen corresponds to the first icon control, the mobile phone moves the first icon control from the main screen to the secondary screen; if the first operation on the main screen corresponds to the first icon control and the second operation on the secondary screen corresponds to a second icon control, the mobile phone moves the icon control on the screen facing the user to the screen facing away from the user. That is, the mobile phone needs to determine, according to its current specific orientation, which screen's icon control is moved across screens: if the screen facing the user is the main screen, the mobile phone moves the first icon control from the main screen to the secondary screen, and if the screen facing the user is the secondary screen, the mobile phone moves the second icon control from the secondary screen to the main screen.
In some embodiments, when the display interfaces of the main screen and the sub-screen are both interfaces capable of accommodating icon controls, at least one icon control is displayed on the display interface of the main screen, and at least one icon control is displayed on the display interface of the sub-screen, in response to a double-sided screen gesture operation, the mobile phone moves the first icon control from the main screen to the front of the at least one icon control on the sub-screen, and sequentially moves the at least one icon control on the sub-screen backwards; or in response to the double-sided screen gesture operation, the mobile phone moves the first icon control from the main screen to the back of the at least one icon control on the auxiliary screen.
This implementation defines the position to which the first icon control is moved on the secondary screen: if multiple icon controls already exist on the secondary screen, the first icon control can be moved to the front or to the back of those icon controls. For example, suppose two icon controls A and B are originally arranged in sequence on the secondary screen, with A in front of B. The mobile phone can move icon control C from the main screen to the position of A, with A and B moving back in sequence; or C can be moved behind B, with the positions of A and B on the secondary screen unchanged.
In some embodiments, the mobile phone moving the first icon control from the main screen to the secondary screen includes: the mobile phone moves the first icon control from the main screen to the touch position of the second operation on the secondary screen. It should be noted that there may or may not already be an icon control at that touch position. If there is no icon control there, the mobile phone can move the first icon control directly to the touch position of the second operation on the secondary screen. If there is an icon control there, see the scheme below.
In some embodiments, if a third icon control exists at the touch position on the secondary screen, the mobile phone moving the first icon control from the main screen to the touch position of the second operation on the secondary screen includes: the mobile phone moves the first icon control from the main screen to the touch position of the second operation on the secondary screen and moves the third icon control on the secondary screen backwards; or the mobile phone moves the first icon control from the main screen to the touch position of the second operation on the secondary screen and merges the third icon control and the first icon control on the secondary screen.
That is, if an icon control already exists at the touch position of the second operation on the secondary screen, the moved first icon control is still placed at that touch position: the mobile phone either moves all the icon controls from that position onwards backward in sequence, or directly merges the moved first icon control and the icon control at that position into the same folder, as sketched below.
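The two placement strategies described above (shift the later icons backward, or merge into a folder) can be illustrated with the following Kotlin sketch. It is a hypothetical illustration only; the grid representation, the folder model, and all names are assumptions made for the example.

```kotlin
// Illustrative sketch only: places the moved icon at the target slot on the
// secondary screen. If the slot is occupied, either shift the later icons back
// by one slot or merge the two icons into a folder. Names are assumptions.

sealed class GridItem {
    data class Icon(val id: String) : GridItem()
    data class Folder(val ids: List<String>) : GridItem()
}

enum class CollisionPolicy { SHIFT_BACKWARD, MERGE_INTO_FOLDER }

fun placeMovedIcon(
    secondaryScreen: MutableList<GridItem>,
    moved: GridItem.Icon,
    targetIndex: Int,
    policy: CollisionPolicy
) {
    val index = targetIndex.coerceIn(0, secondaryScreen.size)
    val occupant = secondaryScreen.getOrNull(index)
    if (occupant == null) {
        // The target slot is past the last item: simply append the moved icon.
        secondaryScreen.add(moved)
    } else if (policy == CollisionPolicy.SHIFT_BACKWARD) {
        // Keep the occupant and all later icons, shifted back by one slot.
        secondaryScreen.add(index, moved)
    } else {
        // Merge the moved icon with the icon (or folder) already at the slot.
        val merged = when (occupant) {
            is GridItem.Icon -> GridItem.Folder(listOf(occupant.id, moved.id))
            is GridItem.Folder -> GridItem.Folder(occupant.ids + moved.id)
        }
        secondaryScreen[index] = merged
    }
}
```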
Based on this scheme, when an icon control on the display interface of the flexible screen needs to be moved across screens, if the flexible screen is currently in the folded state and the display interfaces of both screens can accommodate icon controls, the mobile phone can move the icon control corresponding to the operation across screens according to the user's double-sided screen gesture operation on the flexible screen. In this way, the user can move the icon control corresponding to the operation to the other screen without unfolding the mobile phone. The double-sided screen gesture operation is more convenient and faster and provides a better use experience for users.
The following describes in detail a usage scenario of the double-sided screen icon moving operation with reference to fig. 13a to 13 c.
In the first scenario, the flexible screen of the mobile phone is in a folded state, the display interfaces of the main screen and the auxiliary screen are both interfaces capable of containing the icon control, and when the icon control is displayed on the display interface of the main screen, a user can move the icon control on the main screen to the auxiliary screen through double-sided screen gesture operation.
Illustratively, as shown in fig. 13a, the flexible screen of the mobile phone is divided into a main screen 11 and a secondary screen 12, and the flexible screen is currently in the folded state. The main screen 11 faces the user and is in the bright screen state; it currently displays a first main interface of the mobile phone, which includes 4 application icons. The secondary screen 12 faces away from the user and is also in the bright screen state; it displays a second main interface of the mobile phone, which includes 3 application icons. Based on this, if the touch device on the flexible screen detects a first touch operation of the user on the main screen 11 and a second touch operation on the secondary screen 12, then, similarly to the above embodiment, the state monitoring service may acquire the current physical form and display state of the flexible screen and the touch data on the main screen 11 and the secondary screen 12, and send them to the input management service in the application framework layer.
Further, the input management service may determine whether to trigger a cross-screen icon move based on the physical form, the display state, and the touch data of the flexible screen. For example, when the flexible screen is in the folded state, icon controls are displayed on the display interfaces of the main screen and the secondary screen, the operation time difference between a first operation on the main screen and a second operation on the secondary screen is smaller than the preset time difference, the touch positions of the first operation and the second operation correspond to each other, the first operation corresponds to an icon control on the main screen (for example, the application icon 4 on the main screen in fig. 13 a), and the operation types of the first operation and the second operation are the same (for example, a long-press operation in fig. 13 a), the input management service may determine that the icon control corresponding to the first operation on the main screen needs to be moved to the secondary screen, and may send an event for moving the icon on the main screen to the upper-layer application. As shown in fig. 13a, the mobile phone can move the application icon 4 on the main screen 11 to the corresponding position on the secondary screen 12. If the application icon 8 already occupies the corresponding position on the secondary screen 12, then as shown in fig. 13b, the mobile phone may merge the application icon 4 from the main screen 11 and the application icon 8 into one folder, or, as shown in fig. 13c, the mobile phone may move the application icon 4 from the main screen 11 to the position of the application icon 8, with the application icon 8 and the icons behind it moving backward in sequence.
In this way, when the flexible screen is in the folded state and icon controls are displayed on the display interfaces of both the main screen and the secondary screen, if the mobile phone detects a first operation of the user on the main screen and a second operation on the secondary screen, and the first operation corresponds to an icon control on the main screen, then once the mobile phone confirms that these operations constitute a double-sided screen gesture operation, it can quickly move that icon control from the main screen to the corresponding position on the secondary screen. This makes it convenient for the user to move icon controls across screens while the mobile phone is in the folded state and improves the user experience of using a foldable mobile phone.
In the second scenario, the flexible screen of the mobile phone is in a folded state, the display interfaces of the main screen and the auxiliary screen are both interfaces capable of containing the icon controls, and when the icon controls are displayed on the display interface of the auxiliary screen, a user can move the icon controls on the auxiliary screen to the main screen through double-sided screen gesture operation. The implementation principle and technical effect are similar to those of the first scenario of this embodiment, and are not described herein again.
In a third scenario, the flexible screen of the mobile phone is in the folded state, the display interfaces of the main screen and the secondary screen are both interfaces capable of accommodating icon controls, and icon controls are displayed on the display interfaces of both the main screen and the secondary screen; the user can move an icon control on the main screen (or the secondary screen) to the secondary screen (or the main screen) through a double-sided screen gesture operation. In this scenario, after receiving the user's touch operations on the main screen and the secondary screen, the state monitoring service sends the current physical form, display state, and touch data of the flexible screen, as well as the specific orientation of the mobile phone, to the input management service in the application framework layer. For how the state monitoring service determines the specific orientation of the mobile phone, see the above embodiments.
Furthermore, the input management service may determine whether to trigger the icon to move across the screen according to the physical form, display state, touch data of the flexible screen, and the specific orientation of the mobile phone. For example, when the flexible screen is in a folded state, the display interfaces of the main screen and the sub-screen are both interfaces capable of accommodating icon controls, the touch operations on the main screen and the sub-screen both correspond to the icon controls, the operation time difference between a first operation on the main screen and a second operation on the sub-screen is smaller than a preset time difference, the touch position of the first operation corresponds to the touch position of the second operation, and the operation types of the first operation and the second operation are long-press operations, the input management service can further determine whether to move the icon control of the main screen to the sub-screen or move the icon control of the sub-screen to the main screen according to the specific orientation of the mobile phone. If the current main screen faces the user, the mobile phone moves the icon control of the main screen across screens; and if the current auxiliary screen faces the user, the mobile phone moves the icon control of the auxiliary screen across screens.
Illustratively, Table 3 shows how the mobile phone determines whether to trigger an icon move according to the physical form of the flexible screen, the display state, the touch data on the main screen and the secondary screen, and the specific orientation of the mobile phone.
TABLE 3
Example four
The following describes in detail a control method for a double-sided screen deletion operation provided in an embodiment of the present application, with reference to the accompanying drawings, by taking a mobile phone as an example of an electronic device.
In this embodiment, when the flexible screen of the mobile phone is in a folded state and a selectable control is displayed on the main screen of the flexible screen, the mobile phone detects a double-sided screen gesture operation for triggering deletion of a control. The double-sided screen gesture operation includes a first operation on the main screen and a second operation on the secondary screen, where the operation time difference between the first operation and the second operation is smaller than a preset time difference, the touch position of the first operation corresponds to the touch position of the second operation, the first operation corresponds to a first selectable control on the main screen, and the operation types of the first operation and the second operation are both sliding operations with the same sliding direction. In response to the double-sided screen gesture operation, the mobile phone deletes the first selectable control on the main screen.
It should be noted that the selectable control refers to a control that can be deleted on a display interface of the screen, for example, the selectable control may be a session box control in a chat list interface, or a folder control in a file list interface, or an application icon control in a main interface of the mobile phone, and the like. The sliding direction may be a horizontal direction, a vertical direction, or other directions having a certain included angle with the horizontal direction or the vertical direction, which is not limited in this application.
Specifically, as shown in fig. 14, when the flexible screen of the mobile phone is in a folded state and a selectable control is displayed on the display interface of the main screen of the flexible screen, after detecting a first operation on the main screen and a second operation on the sub-screen, the mobile phone may determine whether the touch operations on the main screen and the sub-screen satisfy the following four conditions according to the touch positions and the operation times of the first operation and the second operation on the screen:
the operation time difference between the first operation and the second operation is smaller than the preset time difference;
the touch positions of the first operation and the second operation correspond to each other;
the first operation corresponds to a first selectable control on the main screen;
the operation types of the first operation and the second operation are sliding operations, and the sliding directions are consistent.
If the four conditions are all met, the mobile phone deletes the first selectable control on the main screen; otherwise, the mobile phone does not trigger deletion of the first selectable control.
It should be noted that the four conditions may be judged simultaneously or sequentially; when they are judged sequentially, the order is not limited to the order described above, and this embodiment does not limit the order in which the four conditions are judged.
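For illustration, the fourth condition, that both operations are sliding operations in the same direction, can be checked by comparing the movement vectors of the two slides, as in the hypothetical Kotlin sketch below. The tolerance angle and all names are assumptions, and possible mirroring between the coordinate systems of the two back-to-back screens is ignored for simplicity.

```kotlin
// Illustrative sketch only: decides whether two slides move in the same
// direction by comparing the angles of their movement vectors.

import kotlin.math.PI
import kotlin.math.abs
import kotlin.math.atan2

data class Slide(val startX: Float, val startY: Float, val endX: Float, val endY: Float)

// Two slides count as "the same direction" when the angle between their
// movement vectors is below a small tolerance (30 degrees assumed here).
fun sameSlideDirection(a: Slide, b: Slide, toleranceDeg: Double = 30.0): Boolean {
    val angleA = atan2((a.endY - a.startY).toDouble(), (a.endX - a.startX).toDouble())
    val angleB = atan2((b.endY - b.startY).toDouble(), (b.endX - b.startX).toDouble())
    var diff = abs(angleA - angleB)
    if (diff > PI) diff = 2 * PI - diff      // wrap around +/-180 degrees
    return diff * 180.0 / PI <= toleranceDeg // convert to degrees and compare
}
```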
In some embodiments, the mobile phone deleting the first selectable control on the main screen in response to the double-sided screen gesture operation includes: in response to the double-sided screen gesture operation, if the main screen is the screen facing the user, the mobile phone deletes the first selectable control on the main screen. In this implementation, the mobile phone also needs to detect its current specific orientation and then determines, according to that orientation, whether to delete the first selectable control on the main screen. That is, if the mobile phone determines that the screen on which the first selectable control to be deleted is located faces away from the user, the mobile phone does not trigger deletion of the control, thereby avoiding misoperation by the user.
It should be noted that the situation in which selectable controls are displayed on the display interface of the main screen of the flexible screen includes the following cases. In the first case, selectable controls are displayed on the display interface of the main screen (i.e., the main screen is in the bright screen state), and the secondary screen is in the black screen state. In this case, when the mobile phone detects the double-sided screen gesture operation for triggering deletion of a control, it can delete the selectable control on the main screen in response to that operation. In the second case, selectable controls are displayed on the display interface of the main screen, the secondary screen is in the bright screen state, and the display interface of the secondary screen has no selectable control. In the third case, selectable controls are displayed on the display interfaces of both the main screen and the secondary screen. In this case, when the mobile phone detects the double-sided screen gesture operation for triggering deletion of a control, if the first operation on the main screen corresponds to the first selectable control and the second operation on the secondary screen corresponds to a second selectable control, the mobile phone deletes the selectable control corresponding to the operation on the screen facing the user. That is, the mobile phone needs to determine, according to its current specific orientation, which screen's selectable control should be deleted: if the screen facing the user is the main screen, the mobile phone deletes the first selectable control on the main screen, and if the screen facing the user is the secondary screen, the mobile phone deletes the second selectable control on the secondary screen.
In some embodiments, after the mobile phone deletes the first selectable control on the home screen in response to the double-sided screen gesture operation, the method further includes: if one or more third selectable controls are arranged below the first selectable control, the mobile phone moves the one or more third selectable controls upwards in sequence.
Based on this scheme, when a selectable control on the display interface of the flexible screen needs to be deleted, if the flexible screen is currently in the folded state, the mobile phone can delete the selectable control corresponding to the operation according to the user's double-sided screen gesture operation on the flexible screen. In this way, the user can quickly delete a selectable control by sliding it toward the edge of the screen or toward a designated area with a double-sided screen gesture, and accidental deletion caused by a single-screen slide is avoided.
The following describes in detail a usage scenario of the double-sided screen deletion operation with reference to fig. 15a, 15b, and 16.
In the first scenario, the flexible screen of the mobile phone is in a folded state, the selectable control is displayed on the display interface of the main screen (the main screen is in a bright screen state), and the auxiliary screen is in a black screen state, so that the user can delete the selectable control on the main screen through double-sided screen gesture operation.
Illustratively, as shown in fig. 15a or fig. 15b, the flexible screen of the mobile phone is divided into a main screen 11 and a secondary screen 12, and the flexible screen is currently in the folded state. The main screen 11 faces the user and is in the bright screen state, and it currently displays a WeChat chat list interface that includes 3 session box controls; the secondary screen 12 faces away from the user and is in the black screen state. Based on this, if the touch device on the flexible screen detects a first operation of the user on the main screen 11 and a second operation on the secondary screen 12, then, similarly to the above embodiment, the state monitoring service may acquire the current physical form and display state of the flexible screen and the touch data on the main screen 11 and the secondary screen 12, and send them to the input management service in the application framework layer.
Furthermore, the input management service can determine whether to trigger the deletion operation according to the physical form, the display state, and the touch data of the flexible screen. For example, when the flexible screen is in the folded state, selectable controls are displayed on the display interface of the main screen, the secondary screen is in the black screen state, the operation time difference between a first operation on the main screen and a second operation on the secondary screen is smaller than the preset time difference, the touch positions of the first operation and the second operation correspond to each other, the first operation corresponds to the first selectable control on the main screen, the operation types of the first operation and the second operation are both sliding operations, and the sliding directions are consistent, the input management service may determine that the first selectable control on the main screen needs to be deleted. The input management service then sends an event to the upper-layer application indicating deletion of the first selectable control on the main screen. As shown in fig. 15a, the user may use a double-sided screen gesture operation to slide session box 2 on the main screen a first distance from left to right, slide it a second distance from top to bottom, or slide it to the lower edge of the screen, as shown in fig. 15b. In response to the double-sided screen gesture operation, the mobile phone may delete session box 2 selected by the user from the display interface of the main screen and move the session boxes below it (for example, session box 3) upward in sequence.
In this way, when the flexible screen is in the folded state, selectable controls are displayed on the display interface of the main screen (the main screen is in the bright screen state), and the secondary screen is in the black screen state, if the mobile phone detects a first operation of the user on the main screen and a second operation on the secondary screen, and the first operation corresponds to a selectable control on the main screen, then once the mobile phone confirms that these operations constitute a double-sided screen gesture operation, it can delete the selectable control selected by the user on the main screen. This makes it convenient for the user to perform the deletion operation quickly on the folded flexible screen and improves the user experience of using a foldable mobile phone.
In a second scenario, the flexible screen of the mobile phone is in the folded state, selectable controls are displayed on the display interface of the secondary screen (the secondary screen is in the bright screen state), and the main screen is in the black screen state; the user can delete a selectable control on the bright secondary screen through a double-sided screen gesture operation. The implementation principle and technical effects are similar to those of the first scenario of this embodiment and are not described here again.
In a third scenario, the flexible screen of the mobile phone is in a folded state, selectable controls are displayed on the display interfaces of the main screen and the auxiliary screen, and the mobile phone responds to the gesture operation of the double-sided screen to delete the selectable controls on the screen facing the user.
Illustratively, as shown in fig. 16, the flexible screen is currently in the folded state. The main screen 11 of the flexible screen faces the user and currently displays a file list interface that includes 2 folder controls; the secondary screen 12 faces away from the user and currently displays a WeChat chat list interface that includes 3 session box controls. Based on this, after receiving the user's touch operations on the main screen and the secondary screen, the state monitoring service sends the current physical form, display state, and touch data of the flexible screen, as well as the specific orientation of the mobile phone, to the input management service in the application framework layer. For how the state monitoring service determines the specific orientation of the mobile phone, see the above embodiments. Furthermore, the input management service can determine, according to the physical form, the display state, the touch data of the flexible screen, and the specific orientation of the mobile phone, the selectable control on which screen should be deleted. As shown in fig. 16, the screen currently facing the user is the main screen 11, so the mobile phone deletes folder 2 on the main screen 11.
Illustratively, Table 4 shows how the mobile phone determines whether to trigger deletion of a control according to the physical form of the flexible screen, the display state, the touch data on the main screen and the secondary screen, and the specific orientation of the mobile phone.
TABLE 4
An embodiment of the present application discloses an electronic device, which includes a processor, and a memory, an input device, and an output device that are connected to the processor. The input device and the output device may be integrated into one device; for example, the touch device of the flexible screen may serve as the input device, and the display of the flexible screen may serve as the output device.
At this time, as shown in fig. 17, the electronic device may include: a flexible screen 1701, the flexible screen 1701 including a touch device 1706 and a display 1707; one or more processors 1702; one or more memories 1703; and one or more sensors 1708. The memory 1703 stores one or more application programs (not shown) and one or more programs 1704, and the foregoing components may communicate over one or more communication buses 1705. The one or more programs 1704 are stored in the memory 1703 and configured to be executed by the one or more processors 1702, so that the electronic device performs the steps in the foregoing embodiments. For all relevant content of the steps in the foregoing method embodiments, reference may be made to the functional descriptions of the corresponding physical devices; details are not repeated here.
For example, the processor 1702 may specifically be the processor 110 shown in fig. 1, the memory 1703 may specifically be the internal memory 121 and/or the external memory 120 shown in fig. 1, the flexible screen 1701 may specifically be the display screen 301 shown in fig. 1, and the sensor 1708 may specifically be the gyroscope sensor 180B, the acceleration sensor 180E, the proximity light sensor 180G, and the infrared sensor in the sensor module 180 shown in fig. 1, which is not limited in this embodiment of the present application.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such an understanding, the part of the technical solutions of the embodiments of the present application that essentially contributes to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes any medium that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.
The above description is only a specific implementation of the embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the embodiments of the present application should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (26)

1. A control method of an electronic device having a flexible screen, wherein a physical form of the flexible screen includes an unfolded state and a folded state, and when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen, the method comprising:
when the flexible screen is in a bright screen state, the electronic equipment detects double-sided screen gesture operation for triggering screen capture, the double-sided screen gesture operation comprises first operation on the first screen and second operation on the second screen, the operation time difference between the first operation and the second operation is smaller than a preset time difference, and the touch position of the first operation corresponds to the touch position of the second operation;
responding to the double-sided screen gesture operation, and performing screen capture on the flexible screen in the bright screen state by the electronic equipment.
2. The method of claim 1, wherein the electronic device performing a screen capture of the flexible screen in the bright screen state in response to the double-sided screen gesture operation comprises:
if the first screen is in a bright screen state and the second screen is in a black screen state, the electronic device performs a screen capture of the first screen in response to the double-sided screen gesture operation;
if the first screen is in a black screen state and the second screen is in a bright screen state, the electronic device performs a screen capture of the second screen in response to the double-sided screen gesture operation;
and if both the first screen and the second screen are in a bright screen state, the electronic device performs a screen capture of the whole flexible screen in response to the double-sided screen gesture operation.
3. The method of claim 1 or 2, further comprising, after the electronic device performs the screen capture of the flexible screen in the bright screen state: displaying a screen capture preview interface, wherein the screen capture preview interface comprises at least one of the following processing controls: a save control, an edit control, a share control, or a cancel control.
4. The method of claim 3, wherein displaying the screen capture preview interface comprises:
if the first screen is in a bright screen state and the second screen is in a black screen state, displaying the screen capture preview interface on the first screen;
if the first screen is in a black screen state and the second screen is in a bright screen state, displaying the screen capture preview interface on the second screen;
and if both the first screen and the second screen are in a bright screen state, displaying the screen capture preview interface on the screen facing the user, wherein the screen facing the user is the first screen or the second screen.
5. The method of claim 3 or 4, further comprising, after displaying the screen capture preview interface:
the electronic device detects a third operation for triggering selection of one processing control on the screen capture preview interface;
and in response to the third operation, the electronic device performs, on the screenshot, the processing corresponding to the selected processing control.
6. The method according to any one of claims 1-5, wherein the first operation and the second operation are of the same operation type, the operation type comprising any one of: a press operation, a click operation, a double-click operation, or a long-press operation.
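To make the gesture of claims 1-6 easier to follow, the Python sketch below illustrates one possible reading of it: two touches, one per panel, count as a double-sided screen gesture when they arrive within a preset time window and at corresponding (mirrored) positions, and the capture target depends on which panels are lit. The threshold values, the mirroring rule, and all names used here (Touch, is_double_sided_gesture, screenshot_target) are assumptions made for illustration, not the claimed implementation.

```python
from dataclasses import dataclass
from typing import Optional

MAX_TIME_DIFF_MS = 300   # assumed value for the "preset time difference"
MAX_POS_DIFF_PX = 80     # assumed tolerance for "corresponding" touch positions

@dataclass
class Touch:
    screen: str          # "first" or "second"
    x: float
    y: float
    t_ms: int            # timestamp of the operation in milliseconds

def is_double_sided_gesture(a: Touch, b: Touch, panel_width: float) -> bool:
    """True if the two touches form the double-sided gesture of claim 1."""
    if {a.screen, b.screen} != {"first", "second"}:
        return False
    if abs(a.t_ms - b.t_ms) >= MAX_TIME_DIFF_MS:
        return False
    # The two panels sit back to back when folded, so a "corresponding" position
    # is compared after mirroring one touch horizontally (an assumption of this sketch).
    mirrored_x = panel_width - b.x
    return abs(a.x - mirrored_x) <= MAX_POS_DIFF_PX and abs(a.y - b.y) <= MAX_POS_DIFF_PX

def screenshot_target(first_lit: bool, second_lit: bool) -> Optional[str]:
    """Pick the capture target along the lines of claim 2."""
    if first_lit and second_lit:
        return "whole_flexible_screen"
    if first_lit:
        return "first_screen"
    if second_lit:
        return "second_screen"
    return None   # both panels dark: nothing is captured

if __name__ == "__main__":
    a = Touch("first", x=120.0, y=400.0, t_ms=1000)
    b = Touch("second", x=960.0, y=410.0, t_ms=1120)   # 1080-px-wide panel assumed
    if is_double_sided_gesture(a, b, panel_width=1080.0):
        print("capture:", screenshot_target(first_lit=True, second_lit=False))
```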
7. A control method of an electronic device having a flexible screen, wherein a physical form of the flexible screen includes an unfolded state and a folded state, and when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen, the method comprising:
when at least two selectable controls are displayed on a display interface of the first screen, the electronic device detects a double-sided screen gesture operation for triggering multiple selection, wherein the double-sided screen gesture operation comprises a first operation on the first screen and a second operation on the second screen, an operation time difference between the first operation and the second operation is smaller than a preset time difference, and a touch position of the first operation corresponds to a touch position of the second operation;
in response to the double-sided screen gesture operation, the electronic device controls the display interface of the first screen to enter a multi-selection mode, wherein in the multi-selection mode, the at least two selectable controls on the display interface of the first screen can be selected.
8. The method of claim 7, wherein the electronic device controlling the display interface of the first screen to enter the multi-selection mode in response to the double-sided screen gesture operation comprises:
in response to the double-sided screen gesture operation, if the first screen is the screen facing the user, the electronic device controls the display interface of the first screen to enter the multi-selection mode.
9. The method of claim 7, wherein when at least two selectable controls are displayed on the display interface of the first screen and at least two selectable controls are displayed on the display interface of the second screen, the electronic device controls the display interface of the user-facing screen to enter a multi-selection mode in response to the double-sided screen gesture operation, the user-facing screen being the first screen or the second screen.
10. The method according to any one of claims 7-9, wherein the electronic device controlling the display interface of the first screen to enter the multi-selection mode in response to the double-sided screen gesture operation comprises:
if the touch position of the first operation on the first screen corresponds to a first selectable control, the first selectable control being one of the selectable controls displayed on the first screen, the electronic device, in response to the double-sided screen gesture operation, controls the display interface of the first screen to enter the multi-selection mode and displays the first selectable control as selected.
11. The method according to any one of claims 7-10, wherein after the electronic device controls the display interface of the first screen to enter the multi-selection mode in response to the double-sided screen gesture operation, the method further comprises:
the electronic device detects a third operation for triggering selection of a selectable control on the display interface of the first screen, wherein the operation type of the third operation is a click operation or a slide operation;
in response to the third operation, the electronic device displays the one or more controls corresponding to the third operation as selected.
12. The method according to any one of claims 7-11, wherein the first operation and the second operation are of the same operation type, the operation type comprising any one of: a press operation, a click operation, a double-click operation, or a long-press operation.
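As an illustration of claims 7-12, the sketch below models a display interface whose selectable controls can be put into a multi-selection mode by the double-sided gesture; the control under the first touch, if any, starts out selected, and later taps select further controls. The data model and the rectangle hit-testing are assumptions made for this example rather than anything specified in the claims.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SelectableControl:
    name: str
    x: float
    y: float
    w: float
    h: float
    selected: bool = False

    def hit(self, x: float, y: float) -> bool:
        """True if the touch lands inside this control's bounds."""
        return self.x <= x <= self.x + self.w and self.y <= y <= self.y + self.h

@dataclass
class DisplayInterface:
    controls: List[SelectableControl] = field(default_factory=list)
    multi_select: bool = False

def enter_multi_select(ui: DisplayInterface, touch_x: float, touch_y: float) -> None:
    """Handle a recognised double-sided gesture on the user-facing screen."""
    if len(ui.controls) < 2:      # claim 7 requires at least two selectable controls
        return
    ui.multi_select = True
    hit: Optional[SelectableControl] = next(
        (c for c in ui.controls if c.hit(touch_x, touch_y)), None)
    if hit is not None:           # claim 10: the touched control starts out selected
        hit.selected = True

def select_by_tap(ui: DisplayInterface, touch_x: float, touch_y: float) -> None:
    """Claim 11: a later click or slide in multi-select mode selects further controls."""
    if not ui.multi_select:
        return
    for c in ui.controls:
        if c.hit(touch_x, touch_y):
            c.selected = True

if __name__ == "__main__":
    ui = DisplayInterface([SelectableControl("msg1", 0, 0, 100, 40),
                           SelectableControl("msg2", 0, 50, 100, 40)])
    enter_multi_select(ui, touch_x=10.0, touch_y=10.0)
    print(ui.multi_select, [c.selected for c in ui.controls])   # True [True, False]
```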
13. A control method of an electronic device having a flexible screen, wherein a physical form of the flexible screen includes an unfolded state and a folded state, and when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen, the method comprising:
when the display interfaces of the first screen and the second screen are both interfaces capable of accommodating icon controls and an icon control is displayed on the display interface of the first screen, the electronic device detects a double-sided screen gesture operation for triggering an icon move, wherein the double-sided screen gesture operation comprises a first operation on the first screen and a second operation on the second screen, an operation time difference between the first operation and the second operation is smaller than a preset time difference, a touch position of the first operation corresponds to a touch position of the second operation, and the first operation corresponds to a first icon control on the first screen;
in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to the second screen.
14. The method of claim 13, wherein the moving, by the electronic device, the first icon control from the first screen to the second screen in response to the double-sided screen gesture operation comprises:
in response to the double-sided screen gesture operation, if the first screen is the screen facing the user, the electronic device moves the first icon control from the first screen to the second screen.
15. The method of claim 13, wherein when the display interfaces of the first screen and the second screen are both interfaces capable of accommodating icon controls, at least one icon control is displayed on the display interface of the first screen, and at least one icon control is displayed on the display interface of the second screen,
if the first operation corresponds to the first icon control on the display interface of the first screen and the second operation corresponds to a second icon control on the display interface of the second screen, then in response to the double-sided screen gesture operation: if the first screen faces the user, the electronic device moves the first icon control from the first screen to the second screen; and if the second screen faces the user, the electronic device moves the second icon control from the second screen to the first screen.
16. The method according to any one of claims 13-15, wherein when the display interfaces of the first screen and the second screen are both interfaces capable of accommodating icon controls, at least one icon control is displayed on the display interface of the first screen, and at least one icon control is displayed on the display interface of the second screen,
in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to the front of the at least one icon control on the second screen and sequentially moves the at least one icon control on the second screen backwards;
or,
in response to the double-sided screen gesture operation, the electronic device moves the first icon control from the first screen to behind at least one icon control on the second screen.
17. The method of any of claims 13-15, wherein the electronic device moving the first icon control from the first screen to the second screen comprises:
the electronic device moves the first icon control from the first screen to the touch position of the second operation on the second screen.
18. The method of claim 17, wherein if a third icon control is located at the touch position of the second operation on the second screen, the moving, by the electronic device, the first icon control from the first screen to the touch position of the second operation on the second screen comprises:
the electronic device moves the first icon control from the first screen to the touch position of the second operation on the second screen, and moves the third icon control on the second screen backwards; or
the electronic device moves the first icon control from the first screen to the touch position of the second operation on the second screen, and merges the third icon control and the first icon control on the second screen.
19. The method according to any one of claims 13-18, wherein the first operation and the second operation are of the same operation type, the operation type comprising any one of: a press operation, a click operation, a double-click operation, or a long-press operation.
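The icon-move variants of claims 13-19 can be pictured with the short sketch below, which treats each screen's icons as an ordered list: the moved icon can go to the front of the other screen's icons (pushing them back), behind them, or into the slot of the second touch, where an occupying icon is either shifted backwards or merged. The list representation and the folder(...) merge notation are hypothetical simplifications, not part of the claims.

```python
from typing import List

def move_to_front(source: List[str], target: List[str], icon: str) -> None:
    """Claim 16, first branch: insert at the front and push the existing icons back."""
    source.remove(icon)
    target.insert(0, icon)

def move_to_back(source: List[str], target: List[str], icon: str) -> None:
    """Claim 16, second branch: place the icon after the existing icons."""
    source.remove(icon)
    target.append(icon)

def move_to_slot(source: List[str], target: List[str], icon: str,
                 slot: int, merge: bool = False) -> None:
    """Claims 17-18: drop the icon at the slot of the second touch; if that slot is
    occupied, either shift the occupant backwards or merge the two icons."""
    source.remove(icon)
    if slot >= len(target):
        target.append(icon)
    elif merge:
        target[slot] = f"folder({target[slot]},{icon})"  # hypothetical folder merge
    else:
        target.insert(slot, icon)                        # the occupant shifts backwards

if __name__ == "__main__":
    first_screen = ["Mail", "Maps", "Camera"]
    second_screen = ["Clock", "Notes"]
    move_to_slot(first_screen, second_screen, "Camera", slot=1)
    print(first_screen, second_screen)   # ['Mail', 'Maps'] ['Clock', 'Camera', 'Notes']
```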
20. A control method of an electronic device having a flexible screen, wherein a physical form of the flexible screen includes an unfolded state and a folded state, and when the flexible screen is in the folded state, the flexible screen is divided into a first screen and a second screen, the method comprising:
when a selectable control is displayed on a display interface of the first screen, the electronic device detects a double-sided screen gesture operation for triggering deletion of a control, wherein the double-sided screen gesture operation comprises a first operation on the first screen and a second operation on the second screen, an operation time difference between the first operation and the second operation is smaller than a preset time difference, a touch position of the first operation corresponds to a touch position of the second operation, the first operation corresponds to a first selectable control on the first screen, the operation types of the first operation and the second operation are slide operations, and the sliding directions of the two operations are consistent;
in response to the double-sided screen gesture operation, the electronic device deletes the first selectable control on the first screen.
21. The method of claim 20, wherein the electronic device deleting the first selectable control on the first screen in response to the double-sided screen gesture operation comprises:
in response to the double-sided screen gesture operation, if the first screen is the screen facing the user, the electronic device deletes the first selectable control on the first screen.
22. The method according to claim 20, wherein when the display interface of the first screen displays a selectable control and the display interface of the second screen displays a selectable control, and the first operation corresponds to the first selectable control on the first screen and the second operation corresponds to a second selectable control on the second screen:
in response to the double-sided screen gesture operation, if the first screen faces the user, the electronic device deletes the first selectable control on the first screen; and if the second screen faces the user, the electronic device deletes the second selectable control on the second screen.
23. The method according to any one of claims 20-22, further comprising, after the electronic device deletes the first selectable control on the first screen in response to the double-sided screen gesture operation:
if one or more third selectable controls are arranged below the first selectable control, the electronic device moves the one or more third selectable controls upwards in sequence.
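For claims 20-23, the sketch below checks that the two slide operations point in roughly the same direction and, if so, deletes the control under the touch on the user-facing screen, with the controls below it closing up. The angle tolerance and the Slide data class are assumptions of this sketch; the claims do not define how "consistent" sliding directions are measured.

```python
import math
from dataclasses import dataclass
from typing import List, Optional

MAX_ANGLE_DIFF_DEG = 30.0   # assumed tolerance for "consistent" sliding directions

@dataclass
class Slide:
    screen: str   # "first" or "second"
    dx: float     # horizontal displacement of the slide
    dy: float     # vertical displacement of the slide

def directions_consistent(a: Slide, b: Slide) -> bool:
    """Rough check that the two slides point the same way."""
    ang_a = math.degrees(math.atan2(a.dy, a.dx))
    ang_b = math.degrees(math.atan2(b.dy, b.dx))
    diff = abs(ang_a - ang_b) % 360.0
    return min(diff, 360.0 - diff) <= MAX_ANGLE_DIFF_DEG

def delete_under_touch(controls: List[str], index: Optional[int]) -> List[str]:
    """Claims 20 and 23: remove the touched control; the ones below it move up."""
    if index is None or not 0 <= index < len(controls):
        return controls
    return controls[:index] + controls[index + 1:]

if __name__ == "__main__":
    front = Slide("first", dx=0.0, dy=-200.0)
    back = Slide("second", dx=0.0, dy=-195.0)
    items = ["Message A", "Message B", "Message C"]
    if directions_consistent(front, back):
        print(delete_under_touch(items, index=0))   # ['Message B', 'Message C']
```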
24. An electronic device, comprising:
a flexible screen, wherein the flexible screen comprises a display and a touch device, a physical form of the flexible screen comprises an unfolded state and a folded state, and when the flexible screen is in the folded state, the flexible screen is divided into a main screen and an auxiliary screen;
one or more processors;
one or more memories;
one or more sensors;
wherein the one or more memories store one or more application programs and one or more programs, the one or more programs comprising instructions that, when executed by the electronic device, cause the electronic device to perform the control method according to any one of claims 1-23.
25. A computer-readable storage medium having instructions stored therein, which when run on an electronic device, cause the electronic device to perform the control method of any one of claims 1-23.
26. A computer program product comprising instructions for causing an electronic device to perform the control method of any one of claims 1-23 when the computer program product is run on the electronic device.
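Claims 24-26 cover a device that can execute any of the methods of claims 1-23, but they do not say how the device decides which behaviour a given double-sided gesture should trigger. The sketch below shows one speculative routing based on the operation type and the current display context; the context fields and the ordering of the checks are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class GestureContext:
    operation_type: str        # "press", "click", "double_click", "long_press" or "slide"
    screen_lit: bool           # at least one panel is in the bright-screen state
    selectable_controls: int   # selectable controls on the user-facing screen
    touched_icon: bool         # the first operation landed on an icon control
    panels_hold_icons: bool    # both display interfaces can accommodate icon controls

def dispatch(ctx: GestureContext) -> str:
    """Route a recognised double-sided gesture to one of the claimed behaviours."""
    if ctx.operation_type == "slide":
        return "delete_control"          # claims 20-23
    if ctx.touched_icon and ctx.panels_hold_icons:
        return "move_icon"               # claims 13-19
    if ctx.selectable_controls >= 2:
        return "enter_multi_select"      # claims 7-12
    if ctx.screen_lit:
        return "screenshot"              # claims 1-6
    return "ignore"

if __name__ == "__main__":
    ctx = GestureContext("click", screen_lit=True, selectable_controls=0,
                         touched_icon=False, panels_hold_icons=False)
    print(dispatch(ctx))   # screenshot
```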
CN201910935961.4A 2019-09-29 2019-09-29 Control method of electronic equipment with flexible screen and electronic equipment Pending CN112578981A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910935961.4A CN112578981A (en) 2019-09-29 2019-09-29 Control method of electronic equipment with flexible screen and electronic equipment
PCT/CN2020/116714 WO2021057699A1 (en) 2019-09-29 2020-09-22 Method for controlling electronic device with flexible screen, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910935961.4A CN112578981A (en) 2019-09-29 2019-09-29 Control method of electronic equipment with flexible screen and electronic equipment

Publications (1)

Publication Number Publication Date
CN112578981A true CN112578981A (en) 2021-03-30

Family

ID=75111130

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910935961.4A Pending CN112578981A (en) 2019-09-29 2019-09-29 Control method of electronic equipment with flexible screen and electronic equipment

Country Status (2)

Country Link
CN (1) CN112578981A (en)
WO (1) WO2021057699A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114047870A (en) * 2021-11-15 2022-02-15 珠海读书郎软件科技有限公司 Screen capturing method, storage medium and equipment of double-screen watch
CN114756165A (en) * 2022-04-24 2022-07-15 维沃移动通信有限公司 Equipment control method and device
WO2023116521A1 (en) * 2021-12-21 2023-06-29 维沃移动通信有限公司 Control method and apparatus for electronic device, and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016031755A (en) * 2014-07-25 2016-03-07 上海逗屋▲網▼絡科技有限公司 Touch control method and apparatus for multi-touch terminal
CN109981839A (en) * 2019-02-02 2019-07-05 华为技术有限公司 A kind of display methods and electronic equipment of the electronic equipment with flexible screen
CN110262690A (en) * 2019-06-18 2019-09-20 Oppo广东移动通信有限公司 Double-screen display method and device, mobile terminal, computer readable storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102064965B1 (en) * 2013-01-04 2020-01-10 엘지전자 주식회사 Method for controlling using double touch jesture and the terminal thereof
CN109358793A (en) * 2018-09-27 2019-02-19 维沃移动通信有限公司 A kind of screenshotss method and mobile terminal
CN109542325B (en) * 2018-11-29 2024-03-29 努比亚技术有限公司 Double-sided screen touch method, double-sided screen terminal and readable storage medium
CN109710168B (en) * 2018-12-27 2021-07-23 努比亚技术有限公司 Screen touch method and device and computer readable storage medium


Also Published As

Publication number Publication date
WO2021057699A1 (en) 2021-04-01

Similar Documents

Publication Publication Date Title
US11722449B2 (en) Notification message preview method and electronic device
CN109981839B (en) Display method of electronic equipment with flexible screen and electronic equipment
CN110489043B (en) Management method and related device for floating window
WO2021103981A1 (en) Split-screen display processing method and apparatus, and electronic device
CN113645351B (en) Application interface interaction method, electronic device and computer-readable storage medium
CN112671976B (en) Control method and device of electronic equipment, electronic equipment and storage medium
WO2021036571A1 (en) Desktop editing method and electronic device
CN110602315B (en) Electronic device with foldable screen, display method and computer-readable storage medium
CN111669459B (en) Keyboard display method, electronic device and computer readable storage medium
CN110764673A (en) Method for scrolling screen capture and electronic equipment
CN112751954B (en) Operation prompting method and electronic equipment
CN110633043A (en) Split screen processing method and terminal equipment
WO2021057699A1 (en) Method for controlling electronic device with flexible screen, and electronic device
CN110806831A (en) Touch screen response method and electronic equipment
US20230205417A1 (en) Display Control Method, Electronic Device, and Computer-Readable Storage Medium
CN114089932A (en) Multi-screen display method and device, terminal equipment and storage medium
CN112449101A (en) Shooting method and electronic equipment
CN114816200A (en) Display method and electronic equipment
CN115016697A (en) Screen projection method, computer device, readable storage medium, and program product
CN115904160A (en) Icon moving method, related graphical interface and electronic equipment
WO2021037034A1 (en) Method for switching state of application, and terminal device
CN111142767B (en) User-defined key method and device of folding device and storage medium
WO2024046179A1 (en) Interaction event processing method and apparatus
US20220317841A1 (en) Screenshot Method and Related Device
WO2022217969A1 (en) Method and apparatus for enabling function in application

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210330