WO2021223560A1 - Screen state control method and electronic device - Google Patents

Screen state control method and electronic device

Info

Publication number
WO2021223560A1
WO2021223560A1 · PCT/CN2021/085662 · CN2021085662W
Authority
WO
WIPO (PCT)
Prior art keywords
touch screen
user
sensing area
preset
electronic device
Prior art date
Application number
PCT/CN2021/085662
Other languages
English (en)
Chinese (zh)
Inventor
涂永峰
付滇
胡燕
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2021223560A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0414Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • This application relates to the technical field of smart terminals, and in particular to a method for controlling the state of a screen and an electronic device.
  • Folding-screen technology has begun to become a reality.
  • As the size of the touch screen continues to grow and the bezel continues to shrink, when the user opens or closes the touch screen of an electronic device with a foldable touch screen, such as a mobile phone or a tablet computer, while the touch screen is in the activated state, the user's large contact area and frequent contact with the touch screen easily cause accidental touch operations on it.
  • In view of this, the present application provides a method for controlling a screen state and an electronic device, which can reduce the user's accidental touch operations on the touch screen when opening or closing a touch screen that is in the active state.
  • this application provides a method for controlling the screen state, including:
  • When the opening and closing state of the foldable touch screen begins to change and the user's operation is detected in a preset sensing area, the touch screen is set to the disabled state.
  • In the disabled state, the touch screen does not respond to the user's touch operations.
  • When the effective time of the operation ends or the change of the opening and closing state terminates, the disabled state of the touch screen ends. The effective time of the operation is the duration of the disabled state preset for the operation.
  • The user's operations include, but are not limited to: pressing, single-finger touch, two-finger touch, double-tap, pressing for a preset time, single-finger touch for a preset time, or two-finger touch for a preset time, etc.
  • Thus, the method for controlling the screen state of the present application can reduce the user's accidental touch operations on the touch screen when opening or closing a touch screen that is in the active state.
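The control flow described above can be sketched as a small state machine. This is an illustrative sketch only, not the patented implementation; the class and method names are hypothetical.

```python
class ScreenStateController:
    """Sketch of the disable/enable logic: a preset operation detected
    while the fold state is changing disables the touch screen; the
    disabled state ends when the operation's effective time expires or
    the opening/closing change terminates."""

    def __init__(self):
        self.disabled = False
        self.fold_changing = False

    def on_fold_change_start(self):
        self.fold_changing = True

    def on_sensing_area_operation(self):
        # Only an operation performed while the opening and closing
        # state is changing puts the touch screen into the disabled state.
        if self.fold_changing:
            self.disabled = True

    def on_effective_time_end(self):
        # Ending condition 1: the operation's effective time expires.
        self.disabled = False

    def on_fold_change_end(self):
        # Ending condition 2: the opening/closing change terminates.
        self.fold_changing = False
        self.disabled = False
```

Either ending condition on its own ends the disabled state, matching the three situations enumerated later in the description.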
  • Electronic devices may include mobile terminals (mobile phones), wearable devices, smart screens, drones, Intelligent Connected Vehicles (hereinafter: ICV), smart/intelligent cars, or in-vehicle equipment.
  • The preset sensing area includes a physical sensing area and/or a virtual sensing area. The physical sensing area is a physical area, set outside the touch screen, in which user operations can be detected; the virtual sensing area is an area on the touch screen.
  • The physical sensing area may be arranged on the side of the electronic device, and/or its upper surface, and/or its lower surface, and/or its back, and/or its front, etc.
  • the physical sensing area is preferably set in an area that conforms to the user's usage or holding habits.
  • The virtual sensing area may be set at the edge of the touch screen. It should be noted that the virtual sensing area can normally be hidden: it is imperceptible to the user while the touch screen displays content normally, and it performs its sensing function only while the opening and closing state of the touch screen is changing, detecting whether the user performs a preset operation within it.
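A hidden edge strip like the one described can be checked with a simple hit test. The strip width and pixel coordinate convention below are assumptions chosen for illustration:

```python
def in_edge_sensing_area(x, y, screen_w, screen_h, edge=60):
    """Return True if the touch point (x, y) falls inside a hidden
    sensing strip of `edge` pixels along any border of the screen.
    Coordinates are in pixels, origin at the top-left corner."""
    return (
        x < edge or x > screen_w - edge or
        y < edge or y > screen_h - edge
    )
```

During a fold-state change the device would run such a test on incoming touches; at other times the strip stays inert, so normal display and interaction are unaffected.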
  • The preset sensing area includes: a virtual sensing area. After the opening and closing state of the foldable touch screen starts to change, the method also includes:
  • the operation of detecting the user in the preset sensing area includes:
  • the user's operation is detected in the first virtual sensing area.
  • This implementation can distinguish the virtual sensing area from UI controls, so that the electronic device can tell whether the user's operation targets the virtual sensing area or a UI control.
  • the preset method of the virtual sensing area includes:
  • presenting the drawing interface to the user includes:
  • When presenting the drawing interface to the user, show the user a heat map or an area map.
  • The heat map or area map can be displayed to the user as the background image of the drawing interface.
  • the preset method of the virtual sensing area includes:
  • A preset number of sub-areas are selected as the virtual sensing area, in descending order of touch count.
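The frequency-based selection just described amounts to picking the top-N sub-areas from a touch-count histogram. The function name and data shape below are illustrative assumptions:

```python
def pick_virtual_sensing_areas(touch_counts, n=2):
    """Select the `n` sub-areas the user touched most often, in
    descending order of touch count, as virtual sensing areas.

    touch_counts: dict mapping sub-area id -> number of touches
    recorded for that sub-area (e.g. from a usage heat map)."""
    ranked = sorted(touch_counts, key=touch_counts.get, reverse=True)
    return ranked[:n]
```

With counts gathered over time, the chosen areas naturally land where the user's fingers tend to rest while folding the device.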
  • The preset sensing area includes: a virtual sensing area. After the opening and closing state of the foldable touch screen starts to change, the method also includes:
  • the operation of detecting the user in the preset sensing area includes:
  • the user's operation is detected in the virtual sensing area corresponding to the display scene.
  • This implementation can distinguish the virtual sensing area from the UI controls in the display scene, so that the electronic device can tell whether the user's operation targets the virtual sensing area or a UI control.
  • After the touch screen is set to the disabled state, the method also includes:
  • the preset user interface UI effect is displayed on the touch screen, and the preset UI effect is used to remind the user that the touch screen is in the disabled state.
  • After the touch screen is set to the disabled state, the method also includes:
  • the UI effect is kept displayed on the touch screen.
  • it also includes:
  • this application provides an electronic device, including:
  • a touch screen; one or more processors; a memory; and one or more computer programs stored in the memory, the one or more computer programs including instructions which, when executed by the device, cause the device to perform the following steps:
  • When the opening and closing state of the foldable touch screen begins to change and the user's operation is detected in a preset sensing area, the touch screen is set to the disabled state.
  • In the disabled state, the touch screen does not respond to the user's touch operations.
  • When the effective time of the operation ends or the change of the opening and closing state terminates, the disabled state of the touch screen ends. The effective time of the operation is the duration of the disabled state preset for the operation.
  • The preset sensing area includes a physical sensing area and/or a virtual sensing area. The physical sensing area is a physical area, set outside the touch screen, in which user operations can be detected; the virtual sensing area is an area on the touch screen.
  • the preset sensing area includes: a virtual sensing area.
  • the step of causing the device to perform the operation of detecting the user in the preset sensing area includes:
  • the user's operation is detected in the first virtual sensing area.
  • When the instructions are executed by the device, the device is caused to perform the following steps to preset the virtual sensing area:
  • the step of causing the device to perform the display of the drawing interface to the user includes:
  • When the instructions are executed by the device, the device is caused to perform the following steps to preset the virtual sensing area:
  • A preset number of sub-areas are selected as the virtual sensing area, in descending order of touch count.
  • the preset sensing area includes: a virtual sensing area.
  • the step of causing the device to perform the operation of detecting the user in the preset sensing area includes:
  • the user's operation is detected in the virtual sensing area corresponding to the display scene.
  • the preset user interface UI effect is displayed on the touch screen, and the preset UI effect is used to remind the user that the touch screen is in the disabled state.
  • the UI effect is kept displayed on the touch screen.
  • When the instructions are executed by the device, the device further executes the following steps:
  • The present application provides a computer-readable storage medium in which a computer program is stored which, when run on a computer, causes the computer to execute the method of any one of the above-mentioned first aspects.
  • the present application provides a computer program, when the computer program is executed by a computer, it is used to execute any one of the methods in the first aspect.
  • The program in the fourth aspect may be stored in whole or in part on a storage medium packaged together with the processor, or may be stored in whole or in part in a memory not packaged with the processor.
  • FIG. 1 is an example diagram of an electronic device folded inward and folded outward according to an embodiment of the application;
  • FIG. 2A is a flowchart of an embodiment of the method for controlling the screen state of this application;
  • FIG. 2B is an example diagram of the relationship between the foldable touch screen and the rotating mechanism according to an embodiment of the application;
  • FIG. 2C is an example diagram of a user's operation during the change of the opening and closing state of the electronic device according to an embodiment of the application;
  • FIG. 3A is a flowchart of another embodiment of the method for controlling the screen state of this application;
  • FIG. 3B to FIG. 3E are example diagrams of setting positions of virtual sensing areas according to an embodiment of the application;
  • FIG. 3F is an example diagram of possible finger positions during the folding process according to an embodiment of this application;
  • FIG. 4A to FIG. 4E are example diagrams of UI effects according to embodiments of the application;
  • FIG. 5 is an example diagram of the opening and closing state changing in the direction opposite to the current change according to an embodiment of the application;
  • FIG. 6A is a flowchart of another embodiment of the method for controlling the screen state of this application;
  • FIG. 6B to FIG. 6D are example diagrams of the position of the virtual sensing area according to an embodiment of the application;
  • FIG. 6E is an example diagram of a virtual sensing area setting interface according to an embodiment of the application;
  • FIG. 7A is a flowchart of another embodiment of the method for controlling the screen state of this application;
  • FIG. 7B is an example diagram of user habits in an embodiment of the application;
  • FIG. 7C is an example diagram of a drawing interface of an embodiment of the application;
  • FIG. 8A is a flowchart of another embodiment of the method for controlling the screen state of this application;
  • FIG. 8B is an example diagram showing a heat map on a drawing interface according to an embodiment of the application;
  • FIG. 8C is an example diagram of sub-area division of a touch screen according to an embodiment of the application;
  • FIG. 8D is an example diagram of displaying the area map on the drawing interface according to an embodiment of the application;
  • FIG. 9 is a flowchart of another embodiment of the method for controlling the screen state of this application;
  • FIG. 10A is a flowchart of another embodiment of the method for controlling the screen state of this application;
  • FIG. 10B is an example diagram of UI icons and positions of virtual sensing areas on the interface according to an embodiment of the application;
  • FIG. 11 is a flowchart of another embodiment of the method for controlling the screen state of this application;
  • FIG. 12 is a structural diagram of an embodiment of a device for controlling screen states of this application;
  • FIG. 13 is a schematic structural diagram of an embodiment of an electronic device of this application.
  • An electronic device with a touch screen specifically refers to an electronic device whose display screen can be touched.
  • the touch screen in the electronic device can detect a user's finger or other objects placed on its surface, and can identify the location of the finger or object.
  • An electronic device with a foldable touch screen specifically refers to an electronic device whose touch screen can be folded.
  • The foldable touch screen in the electronic device may adopt an integral flexible screen; a spliced screen composed of multiple flexible screens with a hinge between every two flexible screens; a spliced screen composed of multiple rigid screens with one flexible screen located between every two rigid screens; or a spliced screen composed of multiple rigid screens with a hinge between every two rigid screens, etc. This is not limited in the embodiments of the present application.
  • The folded form of the touch screen can include inward folding, in which the touch screen is on the inside after folding, and outward folding, in which the touch screen is on the outside after folding.
  • the folding of the electronic device and the folding of the touch screen are generally the same, as shown in FIG. 1 for example.
  • An electronic device with a foldable touch screen has different states depending on whether the touch screen is folded; when it is folded, it is in what is called the fully folded state (see part 13 or 16 in FIG. 1).
  • The change of the folding angle of the touch screen from the unfolded state toward the fully folded state is referred to as closing of the touch screen.
  • The closing of the touch screen can be the full change from the unfolded state to the fully folded state, as shown in parts 11 to 13 or parts 14 to 16 in FIG. 1, or it can be a change of a certain folding angle in that direction, for example from part 11 to part 12 in FIG. 1, from part 12 to part 13, from part 14 to part 15, or from part 15 to part 16, and so on.
  • Correspondingly, the change of the folding angle of the touch screen from the fully folded state toward the unfolded state is referred to as opening of the touch screen. Similar to the closing of the touch screen, the opening can be the full change from the fully folded state to the unfolded state, or a change of a certain folding angle in that direction.
  • Opening and closing are collectively referred to in the embodiments of the present application as a change in the opening and closing state of the touch screen.
  • the embodiments of the present application provide a method for controlling the screen state and an electronic device, which can reduce the user's erroneous touch operations on the touch screen when the user opens or closes the electronic device with the touch screen in an active state.
  • the embodiments of the present application can be applied to electronic devices with foldable touch screens, and the electronic devices can include, but are not limited to, mobile phones, tablet computers, and the like.
  • FIG. 2A is a flowchart of an embodiment of a method for controlling the screen state of this application. As shown in FIG. 2A, the method for controlling the screen state may include:
  • Step 201: The electronic device detects that the opening and closing state of the touch screen starts to change and that the user's operation occurs in the preset sensing area, and sets the touch screen to the disabled state, in which the touch screen does not respond to the user's touch operations.
  • The foldable part of the touch screen generally has a rotating mechanism that supports the opening or closing of the touch screen.
  • For a spliced screen, the rotating structure can be the above-mentioned hinge; when the rotating structure moves, this indicates that the opening and closing state of the foldable touch screen, and thus of the electronic device, has changed.
  • The electronic device can determine whether the opening and closing state of the touch screen has changed by detecting whether the rotating mechanism of the touch screen moves; for example, a Hall sensor can be used to detect whether the rotating mechanism moves.
  • detecting the user's operation in the preset sensing area is generally performed during the process of changing the opening and closing state.
  • the preset sensing area is an area that can detect the user's operation.
  • the preset sensing area may include: a physical sensing area, and/or a virtual sensing area; the physical sensing area is a physical area set outside the touch screen and capable of detecting user operations, and the virtual sensing The area is the area on the touch screen.
  • The user's operation may include, but is not limited to: pressing, single-finger touch, two-finger touch, double-tap, pressing for a preset time, single-finger touch for a preset time, or two-finger touch for a preset time, etc.
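Mapping raw touch input onto the preset operations listed above can be sketched as a simple classifier. The function and the 500 ms long-press threshold are hypothetical illustrations:

```python
def classify_operation(fingers, duration_ms, long_press_ms=500):
    """Classify a touch in the sensing area as one of the preset
    operations: single- or two-finger touch, optionally held for a
    preset time. Returns None for unrecognized input."""
    kind = {1: "single-finger touch", 2: "two-finger touch"}.get(fingers)
    if kind is None:
        return None
    if duration_ms >= long_press_ms:
        kind += " for a preset time"
    return kind
```

Only touches that classify to a preset operation would put the screen into the disabled state; anything else is ignored by the sensing area.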
  • Step 202: When the effective time of the user's operation ends or the change of the opening and closing state of the touch screen terminates, the disabled state of the touch screen is ended.
  • That is, when the effective time of the user's operation ends or the change of the opening and closing state terminates, the touch screen is no longer in the disabled state that was set for the change of the opening and closing state.
  • The embodiment of the application does not limit the state that follows. For example, it can be set according to the state of the touch screen when the change of the opening and closing state terminates: if, when the change terminates, the touch screen has been folded inward into the fully folded state (see part 16 in FIG. 1), the user cannot use the touch screen, and the touch screen can still be set to the disabled state.
  • If the touch screen is in any of the states shown in parts 11 to 15 in FIG. 1, the user can continue to use the touch screen, and the touch screen can be set to the active state.
  • the above-mentioned disabling state of the touch screen can be ended by setting the touch screen to an activated state or other states.
  • The activated state and the disabled state are two opposite states: in the activated state, the touch screen responds to the user's touch operations.
  • This step covers at least the following three situations: in the first, the effective time of the user's operation ends while the change of the opening and closing state has not yet terminated, and the disabled state of the touch screen ends; in the second, the effective time of the user's operation has not ended but the change of the opening and closing state terminates, and the disabled state of the touch screen ends; in the third, the effective time of the user's operation ends and the change of the opening and closing state terminates, and the disabled state of the touch screen ends.
  • the effective time of the user's operation refers to the duration of the disabled state preset for the user's operation.
  • For example, the effective time of the user's operation can be the time during which the user performs the operation: the electronic device detects the user's operation in the preset sensing area while the opening and closing state of the touch screen is changing and sets the touch screen to the disabled state; when it detects that the user's operation has ended, it ends the disabled state of the touch screen. From the user's point of view, the user performs an operation in the preset sensing area and the touch screen is disabled; when the user's operation in the preset sensing area ends, the disabled state of the touch screen also ends.
  • For example, the entire change of the opening and closing state of the touch screen is from the unfolded state shown in section 21 to the fully folded state;
  • during part of the process, the user presses the preset sensing area 211;
  • during the rest of the process, the user does not press the preset sensing area.
  • Alternatively, the effective time of the user's operation can be: a preset time period after the end of the user's operation, or the interval from the end of the user's operation to the end of the change of the opening and closing state, etc.
  • In the latter case, the electronic device detects the user's operation in the preset sensing area during the change of the opening and closing state and sets the touch screen to the disabled state until the change of the opening and closing state terminates, at which point the touch screen is set to the active state. From the user's point of view, the user performs an operation in the preset sensing area, and the touch screen is not enabled until the change of the opening and closing state ends.
  • For example, the entire change of the opening and closing state of the touch screen is from the unfolded state shown in section 21 to the fully folded state. The user presses the preset sensing area 211 while the device folds from the unfolded state shown in section 21 to the folding angle shown in section 22; the duration of this process is 1.5 s. The user does not press the preset sensing area 211 while the device folds from the folding angle shown in section 22 to the fully folded state. Then, while the electronic device folds from the unfolded state shown in section 21 to the folding angle shown in section 22, the user's pressing operation is detected in the preset sensing area for a duration of 1.5 s, and the touch screen is set to the disabled state; while the electronic device folds from the folding angle shown in section 22 to the fully folded state, no pressing operation is detected in the preset sensing area, and the electronic device still keeps the touch screen in the disabled state.
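The effective-time variants described above (disabled while the operation is held, for a grace period after it ends, or until the fold change terminates) can be compared with a small helper. The function is hypothetical; the 1.5 s press and 3.0 s fold end mirror the example:

```python
def is_disabled(variant, t, op_start, op_end, fold_end, grace=0.0):
    """Whether the touch screen is disabled at time t (seconds) under
    one of the three effective-time variants. `grace` is the preset
    period after the operation ends (only used by "grace_period")."""
    if t < op_start:
        return False                      # nothing triggered yet
    if variant == "while_held":
        return t <= op_end                # disabled only during the press
    if variant == "grace_period":
        return t <= op_end + grace        # lingers for a preset period
    if variant == "until_fold_ends":
        return t <= fold_end              # lasts until the change ends
    raise ValueError(f"unknown variant: {variant}")
```

In the 1.5 s example the "until_fold_ends" variant keeps the screen disabled after the press is released, which matches the behavior described in the preceding paragraph.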
  • the embodiment of the present application may further include:
  • a preset user interface UI effect is displayed on the touch screen, and the preset UI effect is used to prompt the user that the touch screen is in a disabled state.
  • the aforementioned UI effect may be kept displayed on the touch screen.
  • the process triggered by the user during the current change is ended.
  • the electronic device detects a change in the opening and closing state of the foldable touch screen.
  • the touch screen on the electronic device is set to the disabled state.
  • the disabled state of the touch screen is ended, so that the user can open or close the touch screen.
  • Fig. 3A is a flowchart of another embodiment of the method for controlling the screen state of this application. As shown in Fig. 3A, the method for controlling the screen state may include:
  • Step 301 Pre-set a physical sensing area on the electronic device.
  • the physical sensing area is generally set on the electronic device when the electronic device is manufactured.
  • The physical sensing area may be set in the non-touch-screen area of the electronic device; for example, as shown in FIG. 3B and FIG. 3C, the physical sensing area may be set on the left side of the electronic device, and/or the right side, and/or the upper side, and/or the lower side, and/or the back, and/or the front, etc.
  • the front of the electronic device is the surface where the touch screen is located.
  • the position, shape, and size of the entity sensing area in FIG. 3B and FIG. 3C are only examples, and are not used to limit the entity sensing area in the embodiment of the present application.
  • the number of physical sensing areas can be one or more.
  • the physical sensing area can be set in any one or more of the 6 areas on the front of the electronic device close to the touch screen, and/or in any one or more of the 3 areas on the left side of the electronic device, and/or in any one or more of the 3 areas on the right side of the electronic device, and/or in any one or more of the 3 areas on the upper side of the electronic device, and/or in any one or more of the 3 areas on the lower side of the electronic device, and/or, as shown in FIG. 3C, in any one or more of the 8 areas on the back of the electronic device.
  • the number of physical sensing areas may be one or two in consideration of cost and simple design style.
  • for example, the number of physical sensing areas 211 may be set to one, arranged on the upper side of the electronic device, thereby saving cost and keeping a simple design style on the front of the electronic device.
  • the physical sensing area may be set in an area that conforms to the user's usage habit or holding habit. For example, as shown in FIG. 3E, if the user is used to holding the middle of the right side of the electronic device with the right thumb while the opening and closing state of the touch screen changes, the physical sensing area can be set in the middle area 212 on the front of the electronic device close to the touch screen area.
  • the setting of the physical sensing area may also consider whether the electronic device is folded inwardly or outwardly. For example, if the electronic device is folded inward, the physical sensing area can be set on the back of the electronic device. If the electronic device is folded outward, the physical sensing area can be set on the front of the electronic device.
  • the above setting can prevent the following problem: if the touch screen could be disabled only by continuously pressing the physical sensing area, the angle of the touch screen would become too small during the folding process, so that, as shown in FIG. 3F, the finger pressing the physical sensing area would have to leave before the folding is completed, which would affect the effectiveness of the physical sensing area.
  • the physical sensing area may be a push switch, or a button, or a touch-sensitive area.
  • the physical sensing area can be combined with a fingerprint recognizer, so that the electronic device can incorporate a fingerprint authentication process in the subsequent steps, increasing the security of the electronic device.
  • Step 302 The electronic device detects whether the opening and closing state of the touch screen has changed, and if there is a change, execute step 303; otherwise, continue to execute step 302.
  • Step 303 During the change of the opening and closing state of the touch screen, the electronic device detects the user's operation in the preset physical sensing area; if the user's operation is detected, execute step 304; if the user's operation is not detected, go to step 305.
  • when multiple physical sensing areas are set, step 304 can generally be executed as long as the user's operation is detected in any one physical sensing area; however, in practical applications it can also be preset that two or more physical sensing areas are required, so that only when the user's operation is detected in each of those physical sensing areas is it determined that the user's operation is detected in the physical sensing area and step 304 executed; this is not limited in the embodiment of the present application.
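When two or more physical sensing areas must all report a press before step 304 is executed, the check reduces to a subset test. A minimal sketch, with hypothetical area identifiers chosen only for illustration:

```python
def all_required_areas_pressed(required_areas, pressed_areas):
    """Return True only if every required physical sensing area is
    currently pressed; step 304 is executed only in that case."""
    return set(required_areas).issubset(set(pressed_areas))
```

If only one area is required, the same function degenerates to checking a single press.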
  • Step 304 The electronic device sets the touch screen to the disabled state until the effective time of the user's operation ends or the change of the opening and closing state of the touch screen is terminated; a preset user interface (UI) effect is displayed on the touch screen until the disabled state of the touch screen ends, or until a preset time period after the end of the disabled state ends; after that, this branch of the process ends.
  • the touch screen can be set to the active state.
  • the preset UI effect is used to prompt the user that the touch screen is in a disabled state.
  • the function of limiting the display to "until the end of the disabled state of the touch screen, or until a preset time period after the end of the disabled state" is that the display of the UI effect can either disappear immediately when the disabled state of the touch screen ends, or disappear after a further preset time period. The specific value of the preset time period is not limited in the embodiment of this application.
  • the UI effect may include but is not limited to:
  • a pop-up notification message effect and a corresponding notification message disappearance effect: for example, as shown in FIG. 4A, a notification message pops up in the middle of the touch screen, and after a certain period of time, the notification message disappears;
  • a water-surface ripple effect and a corresponding ripple disappearance effect: for example, as shown in FIG. 4B, the ripple gradually spreads from the middle of the touch screen to the surroundings, and after a certain period of time, gradually shrinks back to the middle and disappears;
  • a lock icon effect and a corresponding lock icon disappearance effect: the lock icon effect can be displaying a locked lock in one or more positions of the touch screen;
  • the lock icon disappearance effect can be the lock icon disappearing directly, or an open lock being displayed in one or more positions of the touch screen;
  • the positions where the lock appears can include but are not limited to: the sides, corners, or center of the touch screen, and/or the position where the user's finger touches the touch screen, etc.; for example, as shown in FIG. 4C, a lock icon appears in the middle of the touch screen and disappears after a certain period of time; or, as shown in FIG. 4D, locked locks are displayed in the lower left and upper right corners of the touch screen, after a certain period of time open locks are displayed in those two corners, and after a further period of time the open locks disappear;
  • a simulated drawer pull-out effect and a corresponding simulated drawer close effect: the drawer pull-out effect can be pulling out a semi-transparent bar frame from each of the two sides of the touch screen; correspondingly, the drawer close effect can be retracting the semi-transparent bar frames back to the two side edges; the semi-transparent bar frame is drawn as a bar frame filled with diagonal lines in FIG. 4E.
  • when the aforementioned UI effect is displayed on the touch screen, the UI effect can be displayed on the entire touch screen, as shown in FIG. 4B, or on part of the touch screen, as shown in FIG. 4C and FIG. 4D; the embodiment of the present application is not limited in this respect.
  • Step 305 During the change of the opening and closing state of the touch screen, if the electronic device detects that the opening and closing state of the touch screen changes in the direction opposite to the current change, the process triggered by the user during the current change of the opening and closing state is ended.
  • the opening and closing of the touch screen are two processes with opposite directions of change.
  • the electronic device detects that the opening and closing state of the touch screen has a change opposite to the current change direction, it means that it detects that the touch screen changes from open to closed, or from closed to open.
  • in a possible implementation, the electronic device detecting that the opening and closing state of the touch screen changes in the direction opposite to the current change may include: the electronic device detecting that the rotating mechanism of the touch screen produces a movement in the opposite direction.
  • step 303 may also include: if the electronic device detects the user's operation, obtaining the user's fingerprint in the physical sensing area and comparing it with a pre-stored fingerprint; correspondingly, if the electronic device detects the user's operation and the comparison results are consistent, step 304 is executed; otherwise, step 305 is executed. How to obtain the user's fingerprint and how to perform the fingerprint comparison are not described in detail in this application. Through fingerprint comparison, it can be ensured that the folding operation and the disabling operation on the physical sensing area are both executed by the owner of the electronic device, which improves the security of the electronic device.
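The fingerprint-gated branch of step 303 can be sketched as below. The function name and the use of simple equality between fingerprint templates are illustrative assumptions; a real device would use its own fingerprint matcher:

```python
def handle_sensing_area_press(captured_fp, stored_fp, disable_screen, end_process):
    """Sketch of step 303 with fingerprint authentication:
    only a press whose fingerprint matches the pre-stored one
    disables the touch screen (step 304); otherwise the
    user-triggered process is ended (step 305)."""
    # NOTE: real devices compare fingerprint templates with a
    # dedicated matcher, not raw equality; equality is a stand-in.
    if captured_fp is not None and captured_fp == stored_fp:
        disable_screen()   # step 304: set the touch screen to disabled
        return True
    end_process()          # step 305: end the triggered process
    return False
```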
  • a virtual sensing area may also be preset in the method for controlling the screen state of the embodiment of the present application shown in FIG. 6A; for the step flow, please refer to FIG. 3A and the related descriptions, which are not repeated here.
  • the main difference is that the physical sensing area is replaced with a virtual sensing area.
  • the virtual sensing area in the embodiment of the present application is located on the touch screen.
  • the virtual sensing area can be set in any area of the touch screen, and the size, shape, and number of the virtual sensing area are not limited in the embodiment of the present application.
  • the virtual sensing area is hidden in the normal state: it may not be perceived by the user when the touch screen is displayed normally, and the electronic device does not detect the user's operation in the virtual sensing area.
  • only when the electronic device detects a change in the opening and closing state of the touch screen does the electronic device enable the sensing function of the virtual sensing area and detect whether the user has triggered the disabling operation of the touch screen in the virtual sensing area.
  • the virtual sensing area may be set at the edge position of the touch screen.
  • for example, as shown in FIG. 6B, the virtual sensing area may be set in one or more of the four corners of the touch screen; and/or, as shown in FIG. 6C, the virtual sensing area may be set in one or both of the two edge areas near the left and right edges of the touch screen, shown as the hatched areas; and/or, as shown in FIG. 6D, the virtual sensing area can be set in one or more of the 6 edge regions near the left and right edges of the touch screen.
  • the position, shape, and size of the virtual sensing area in FIGS. 6B to 6D are only examples, and are not used to limit the virtual sensing area in the embodiment of the present application.
  • in one possible implementation, one or more virtual sensing areas may be set in the electronic device in advance by the designer and production personnel of the electronic device, for example, by presetting all of the 6 areas shown in the figure as virtual sensing areas; or, in another possible implementation, the designer and production personnel of the electronic device can preset multiple virtual sensing area options in the electronic device, as shown in FIG. 6E,
  • which displays 8 virtual sensing area options; the user selects one or more virtual sensing area options that match his own habits on the virtual sensing area setting interface of the electronic device,
  • and the electronic device sets the virtual sensing area according to the virtual sensing area options selected by the user. It should be noted that the user can modify the preset virtual sensing area of the electronic device on the virtual sensing area setting interface, which is not limited in this embodiment of the application.
  • when the UI effect is displayed on the touch screen, besides displaying the UI effect on the entire touch screen or on part of the touch screen, similarly to the embodiment shown in FIG. 3A, in this embodiment of the application it is also possible to display the UI effect only in the preset virtual sensing area, or even only in the virtual sensing area where the user's operation is detected. For example, if the preset virtual sensing areas on the electronic device are areas 1 to 6 shown in the figure,
  • and the user's operation is detected in area 1 and area 5,
  • the UI effect may be displayed in all of areas 1 to 6, or only in area 1 and area 5, which is not limited in the embodiment of the present application.
  • the UI effect may refer to the description in the embodiment shown in FIG. 3A; alternatively, the UI effect displayed in the virtual sensing area may include but is not limited to a special display of the virtual sensing area, for example, changing the color, brightness, transparency, or contrast of the virtual sensing area, or superimposing dynamic effects on the virtual sensing area.
  • step 701 may be further included before step 601:
  • Step 701 The electronic device shows the drawing interface to the user, and obtains the area drawn by the user in the drawing interface;
  • step 601 may include: the electronic device presets a virtual sensing area according to the area drawn by the user.
  • the area drawn by the user in the drawing interface may be one or more, which is not limited in the embodiment of the present application.
  • the electronic device can provide the user with a virtual sensing area drawing interface,
  • which allows the user to independently draw a virtual sensing area similar to that shown in FIG. 7C according to his own usage habits.
  • the user can redraw the virtual sensing area on the drawing interface provided by the electronic device, or modify the previously drawn virtual sensing area, and then modify the virtual sensing area pre-stored in the electronic device. Therefore, the virtual sensing area pre-stored in the electronic device can better match the user's usage habits.
  • in the embodiment of the present application shown in FIG. 8A, the drawing interface further shows the user a touch operation heat map or area map; as shown in FIG. 8A, before step 701 shown in FIG. 7A, the following steps may be further included:
  • Step 801 The electronic device samples the touch operation of the user on the touch screen during the process of opening or closing the electronic device (that is, during the process of changing the opening and closing state of the touch screen).
  • the electronic device may sample the user's touch operations during the opening and closing of the touch screen at all times;
  • however, this consumes a relatively large amount of power, so it is preferable to limit the sampling time period or the number of samples in order to reduce the power consumed by the sampling operation.
  • the sampling time period can be preset in the electronic device, for example, within one month after the electronic device starts to be used, or a certain fixed time period every day, etc.; or, the number of samples can be preset in the electronic device, for example, 1000 times.
  • alternatively, the sampling time period or the number of sampling operations can be set by the user.
  • the user can set the electronic device to start sampling for one month from a certain point in time, or set the electronic device to sample at a certain fixed time period every day, or set the electronic device to sample 100 times a day, and so on.
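Capping sampling by both a time window and a sample count, as described above, can be sketched as a small budget object. The class name and the injected clock are illustrative assumptions; the one-month window and 1000-sample cap are the example values from the text:

```python
import time

class SamplingBudget:
    """Stop sampling touch operations once either a preset time
    window or a preset sample-count cap is exhausted, to reduce
    the power consumed by the sampling operation."""

    def __init__(self, window_seconds, max_samples, now=time.time):
        self.now = now                          # injectable clock for testing
        self.deadline = now() + window_seconds  # end of the sampling window
        self.remaining = max_samples            # samples left in the budget

    def should_sample(self):
        return self.remaining > 0 and self.now() < self.deadline

    def record_sample(self):
        # Consume one unit of budget; returns False once exhausted.
        if self.should_sample():
            self.remaining -= 1
            return True
        return False
```

For a one-month window with a 1000-sample cap, this would be `SamplingBudget(30 * 24 * 3600, 1000)`.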
  • Step 802 The electronic device forms a heat map or an area map according to the sampled touch operation.
  • when the heat map is formed, areas touched more often by the user have a higher heat value.
  • a heat map generally superimposes areas of different colors to describe the heat levels of different areas.
  • in FIG. 8B, areas with different heat values are displayed with different filling effects: the area touched most often by the user, that is, the area with the highest heat, down to the area touched least often, that is, the area with the lowest heat, are each given a distinct filling; for example, the area with the second highest number of touches, that is, the area with the second highest heat, is displayed as a diagonal-striped area, and the area with the fewest touches, that is, the area with the lowest heat, is displayed as a horizontal-striped area.
  • when forming the area map, the touch screen can be divided into several sub-areas, and the sub-areas whose number of touches reaches a preset threshold are drawn to form the area map.
  • the specific threshold can be set independently in practical applications. For example, as shown in FIG. 8C, if the touch screen is divided into 16 sub-areas and the sub-areas whose number of touches reaches the preset threshold are the 4 areas 5, 8, 9, and 12, the area map drawn is as shown in FIG. 8D.
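Building the area map from per-sub-area touch counts is a simple threshold filter. A sketch reproducing the FIG. 8C/8D example, where sub-areas 5, 8, 9, and 12 reach the threshold (the counts themselves are made up for illustration, since the text gives only the selected areas):

```python
def area_map(touch_counts, threshold):
    """Return the sub-areas whose touch count reaches the preset
    threshold; these are the sub-areas drawn in the area map."""
    return sorted(area for area, count in touch_counts.items()
                  if count >= threshold)
```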
  • step 701 may further include: when the electronic device shows the drawing interface to the user, it also shows the heat map or the area map to the user.
  • the heat map or the area map may be displayed to the user as a background image of the drawing interface, for example, as shown in FIG. 8B and FIG. 8D.
  • the user can more intuitively understand their own habits, so that in step 701, the virtual sensing area that conforms to their own habits can be drawn more accurately and effectively.
  • different from the embodiment shown in FIG. 6A, in which the virtual sensing area is preset by the electronic device in step 601 or the user selects a preset virtual sensing area option, the embodiment shown in FIG. 9 provides another way for the electronic device to set the virtual sensing area. As shown in FIG. 9, based on the embodiment shown in FIG. 6A, the following steps may be further included before step 601:
  • Step 901 The electronic device divides the area of the touch screen to obtain several sub-areas.
  • Step 902 The electronic device samples the user's touch operations on the touch screen during the process of opening or closing the electronic device, and counts the number of times that each sub-region obtained by the division is sampled by the touch operations.
  • for the setting of the sampling timing, please refer to the related description in step 801, which is not repeated here.
  • Step 903 According to the number of touches of each sub-area, the electronic device selects a preset number of sub-areas in descending order of the number of touches.
  • for example, the electronic device divides the touch screen into 16 sub-areas, and samples the user's touch operations on the touch screen during a preset time period or a preset number of opening or closing operations,
  • for example, 100 opening or closing operations, and counts the number of times each sub-area was touched.
  • suppose the counts of the sub-areas from high to low are: area 5, 80 times; area 12, 59 times; area 9, 40 times; area 8, 15 times; the other sub-areas are not repeated here.
  • if the preset number is 2, the electronic device can select sub-area 5 and sub-area 12 as virtual sensing areas.
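Step 903's selection (take the preset number of most-touched sub-areas) is a top-k pick over the counts; the sketch below uses the example counts from the text:

```python
def select_virtual_areas(touch_counts, preset_number):
    """Pick the `preset_number` sub-areas with the highest touch
    counts, in descending order of the number of touches (step 903)."""
    ranked = sorted(touch_counts, key=touch_counts.get, reverse=True)
    return ranked[:preset_number]
```

With the counts above (area 5: 80, area 12: 59, area 9: 40, area 8: 15) and a preset number of 2, this yields sub-areas 5 and 12, matching the example.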
  • a machine learning method can also be used to identify the area that the user touches the most times during the process of opening and closing the electronic device.
  • step 601 in the embodiment of the present application may include: the electronic device presets a virtual sensing area according to the selected sub-area.
  • the preset virtual sensing area shown in FIG. 6A, the user-drawn virtual sensing area shown in FIG. 7A and FIG. 8A, and the virtual sensing area set by sampling in the embodiment shown in FIG. 9 are three
  • virtual sensing area setting methods that can be combined; the user can independently decide which virtual sensing area setting method to use, and can choose to delete the virtual sensing areas set by some of the methods,
  • which is not limited in the embodiment of this application.
  • in the embodiment shown in FIG. 10A, step 601 is replaced with step 1001,
  • and step 1002 is added between step 602 and step 603; specifically:
  • Step 1001 The electronic device presets virtual sensing regions corresponding to each display scene.
  • the setting method of the virtual sensing area of each scene can refer to the setting method of the virtual sensing area in FIGS. 6A-9.
  • the main difference is that in the embodiment shown in FIG. 10A, the virtual sensing area corresponds to the display scene. Different display scenes may correspond to the same or different virtual sensing areas.
  • the user may open or close the electronic device under any application interface of the electronic device, such as a video or picture playback interface, a web browsing interface, or a game interface, etc.
  • therefore, different virtual sensing areas can be set for different display scenes, so as to distinguish the virtual sensing area in a scene from the UI controls in the display scene, such as icons, images, buttons, and other elements that carry display or hide control functions, so that the electronic device can distinguish whether a user operation during the opening and closing of the touch screen is a disabling operation on the virtual sensing area or an operation on the UI controls in the display scene.
  • for example, if the interface shown in FIG. 10B is being displayed and the square parts of the interface are UI icons, then, if the preset virtual sensing areas were areas 1 to 4 as shown in FIG. 10B, area 1 and area 4 would obviously overlap with the UI icons; in this case, the virtual sensing areas of the display scene can be set to area 2 and area 3.
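Selecting virtual sensing areas that do not overlap the scene's UI controls, as in the FIG. 10B example where areas 1 and 4 collide with icons, amounts to a rectangle-intersection filter. In the sketch below, rectangles are `(left, top, right, bottom)` tuples and every coordinate is made up purely for illustration:

```python
def rects_overlap(a, b):
    # Axis-aligned rectangles as (left, top, right, bottom);
    # touching edges do not count as overlap.
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def non_overlapping_areas(candidate_areas, ui_controls):
    """Keep only candidate virtual sensing areas that overlap no
    UI control of the current display scene."""
    return {name: rect for name, rect in candidate_areas.items()
            if not any(rects_overlap(rect, c) for c in ui_controls)}
```

With four candidate corner areas and two icons covering the top-left and top-right corners, only the bottom two areas survive the filter, mirroring the area 2/area 3 selection in the example.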
  • Step 1002 During the change of the opening and closing state of the touch screen, obtain the display scene on the touch screen when the opening and closing state of the touch screen starts to change, and obtain the virtual sensing area corresponding to the display scene; after that, go to step 603.
  • correspondingly, in step 603 the user's operation is detected in the virtual sensing area acquired in step 1002.
  • in another embodiment, step 1101 is further included before step 603:
  • Step 1101 During the change of the opening and closing state of the touch screen, the electronic device obtains the preset virtual sensing area, and obtains the display scene on the touch screen when the opening and closing state starts to change; from the virtual sensing area Select a virtual sensing area that does not overlap with the UI controls in the display scene.
  • note that steps 601 to 605 are shown based only on the step flow shown in FIG. 6A; other possible implementation steps can be seen in FIG. 7A to FIG. 9 and are not repeated here.
  • the selected virtual sensing area can be specially displayed through the UI effect, thereby prompting the user of the virtual sensing area in the scene.
  • for the specific display of the UI effect, please refer to the related UI effect descriptions for FIG. 3A and FIG. 6A, which are not repeated here.
  • correspondingly, in step 603 the electronic device detects the user's operation in the virtual sensing area selected according to the display scene.
  • in the above embodiments, the electronic device sets either a physical sensing area or a virtual sensing area;
  • in practical applications, the electronic device may also set both a physical sensing area and a virtual sensing area.
  • the implementation method refer to the above-mentioned embodiment, which will not be repeated here.
  • FIG. 12 is a structural diagram of an embodiment of an apparatus for controlling screen states of this application. As shown in FIG. 12, the apparatus 1200 may include:
  • the detection unit 1210 is used to detect the user's operation in the preset sensing area after the opening and closing state of the foldable touch screen starts to change;
  • the state setting unit is used to set the touch screen to the disabled state when the detection unit detects the user's operation in the preset sensing area.
  • the disabled state means that the touch screen does not respond to the user's touch operation; when the effective time of the operation ends or the change of the opening and closing state ends, the disabled state of the touch screen is ended; the effective time of the operation is the duration of the disabled state preset for the operation.
  • the preset sensing area includes: a physical sensing area, and/or a virtual sensing area; the physical sensing area is a physical area that is set outside the touch screen and can detect user operations,
  • the virtual sensing area is the area on the touch screen.
  • the preset sensing area includes: a virtual sensing area
  • the device 1200 may further include:
  • the first acquiring unit 1220 is used to acquire the preset virtual sensing area after the opening and closing state of the foldable touch screen starts to change, and to acquire the display scene on the touch screen when the opening and closing state of the touch screen starts to change;
  • the detection unit 1210 may be specifically configured to: select, from the acquired virtual sensing areas, a first virtual sensing area that does not overlap with the user interface (UI) controls in the display scene, and detect the user's operation in the first virtual sensing area.
  • the apparatus 1200 may further include:
  • the first area setting unit is used for showing the drawing interface to the user; acquiring the area drawn by the user in the drawing interface, and setting the area as a virtual sensing area.
  • the first area setting unit may be specifically used to: sample the user's touch operations on the touch screen during the change of the opening and closing state of the touch screen; form a heat map or area map according to the sampled touch operations; and, when showing the drawing interface to the user, also show the user the heat map or area map.
  • the apparatus 1200 may further include:
  • the second area setting unit is used to: divide the area of the touch screen to obtain multiple sub-areas; during the change of the opening and closing state of the touch screen, sample the user's touch operations on the touch screen and count
  • the number of times each sub-area obtained by the division is touched; and, according to the number of times each sub-area is touched, select a preset number of sub-areas as virtual sensing areas in descending order of the number of times.
  • the preset sensing area includes: a virtual sensing area
  • the device 1200 may further include:
  • the second acquiring unit is used to acquire the display scene on the touch screen when the opening and closing state of the touch screen starts to change after the opening and closing state of the foldable touch screen starts to change, and to acquire the virtual sensing corresponding to the display scene Area; where the virtual sensing area corresponding to the display scene is preset;
  • the detection unit 1210 may be specifically configured to: detect the user's operation in the virtual sensing area corresponding to the display scene.
  • the device 1200 may further include: an effect display unit for displaying a preset user interface UI effect on the touch screen after the touch screen is set to a disabled state, and the preset UI effect is used to prompt the user The touch screen is disabled.
  • the effect display unit may also be used to keep displaying the UI effect on the touch screen within a preset time period after the disabling state of the touch screen ends.
  • the apparatus 1200 may further include:
  • the process control unit is used to end the process triggered by the user during the current change process if the open and close state of the touch screen is detected to change in the opposite direction to the current change direction during the change of the opening and closing state.
  • the apparatus 1200 for controlling the screen state provided in the embodiment shown in FIG. 12 can be used to implement the technical solutions of the method embodiments shown in FIGS. 2A to 11 of this application. For its implementation principles and technical effects, further reference may be made to the related descriptions in the method embodiments.
  • the division of the various units of the screen state control device shown in FIG. 12 is only a division of logical functions, and may be fully or partially integrated into one physical entity in actual implementation, or may be physically separated.
  • these units can all be implemented in the form of software called by processing elements; they can also be implemented in the form of hardware; part of the units can also be implemented in the form of software called by the processing elements, and some of the units can be implemented in the form of hardware.
  • the state setting unit may be a separately established processing element, or it may be integrated in a certain chip of the electronic device.
  • the implementation of other units is similar.
  • all or part of these units can be integrated together or implemented independently.
  • each step of the above method or each of the above units can be completed by an integrated logic circuit of hardware in the processor element or instructions in the form of software.
  • for example, the above units may be one or more integrated circuits configured to implement the above methods, such as: one or more Application Specific Integrated Circuits (hereinafter referred to as ASIC), or one or more Digital Signal Processors (hereinafter referred to as DSP), or one or more Field Programmable Gate Arrays (hereinafter referred to as FPGA), etc.
  • these units can be integrated together and implemented in the form of a System-On-a-Chip (hereinafter referred to as SOC).
  • FIG. 13 is a schematic structural diagram of an embodiment of an electronic device of this application. As shown in FIG. 13, the above-mentioned electronic device may include: a foldable touch screen; one or more processors; a memory; and one or more computer programs.
  • the above-mentioned one or more computer programs are stored in the above-mentioned memory, and the above-mentioned one or more computer programs include instructions.
  • when the above-mentioned instructions are executed by the above-mentioned device, the device is caused to perform the following steps:
  • after the opening and closing state of the foldable touch screen begins to change and the user's operation is detected in the preset sensing area, the touch screen is set to the disabled state;
  • the disabled state means that the touch screen does not respond to the user's touch operation;
  • when the effective time of the operation ends or the change of the opening and closing state ends, the disabled state of the touch screen ends; the effective time of the operation is the duration of the disabled state preset for the operation.
  • the preset sensing area includes: a physical sensing area, and/or a virtual sensing area; the physical sensing area is set outside the touch screen and can perform user operations The physical area to be detected, and the virtual sensing area is the area on the touch screen.
  • the preset sensing area includes: a virtual sensing area.
  • the step of causing the device to perform the detection of the user's operation in the preset sensing area includes:
  • the user's operation is detected in the first virtual sensing area.
  • when the instruction is executed by the device, the device is caused to perform the following steps to preset the virtual sensing area:
  • the step of causing the device to perform the display of the drawing interface to the user includes:
  • when the instruction is executed by the device, the device is caused to perform the following steps to preset the virtual sensing area:
  • a preset number of sub-areas are selected as the virtual sensing area in descending order of the number of times they were touched.
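Selecting the most frequently touched sub-areas, as described above, amounts to sorting sub-areas by touch count and taking the top entries. A minimal sketch, assuming touch counts have already been collected per sub-area (the function name and the dictionary representation are invented for illustration):

```python
def select_virtual_sensing_areas(touch_counts, preset_number):
    """Pick the `preset_number` most-touched sub-areas as the virtual
    sensing area, in descending order of touch count.

    touch_counts: mapping of sub-area identifier -> number of touches.
    """
    ranked = sorted(touch_counts.items(), key=lambda kv: kv[1], reverse=True)
    return [area for area, _count in ranked[:preset_number]]
```

For instance, with counts `{"A": 3, "B": 10, "C": 7}` and a preset number of 2, the selection is `["B", "C"]`.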
  • the preset sensing area includes: a virtual sensing area.
  • the step of causing the device to detect the user's operation in the preset sensing area includes:
  • the user's operation is detected in the virtual sensing area corresponding to the display scene.
  • the preset user interface UI effect is displayed on the touch screen, and the preset UI effect is used to remind the user that the touch screen is in the disabled state.
  • the UI effect is kept displayed on the touch screen.
  • when the instruction is executed by the device, the device is further caused to execute the following steps:
  • the electronic device shown in FIG. 13 may be a terminal device or a circuit device built in the aforementioned terminal device.
  • the device can be used to execute the functions/steps in the methods provided in the embodiments shown in FIG. 2A to FIG. 11 of the present application.
  • the electronic device 1300 may include a processor 1310, an external memory interface 1320, an internal memory 1321, a universal serial bus (USB) interface 1330, a charging management module 1340, a power management module 1341, a battery 1342, an antenna 1, an antenna 2, a mobile communication module 1350, a wireless communication module 1360, an audio module 1370, a speaker 1370A, a receiver 1370B, a microphone 1370C, an earphone jack 1370D, a sensor module 1380, buttons 1390, a motor 1391, an indicator 1392, a camera 1393, a display screen 1394, a subscriber identification module (SIM) card interface 1395, etc.
  • the sensor module 1380 can include a pressure sensor 1380A, a gyroscope sensor 1380B, an air pressure sensor 1380C, a magnetic sensor 1380D, an acceleration sensor 1380E, a distance sensor 1380F, a proximity light sensor 1380G, a fingerprint sensor 1380H, a temperature sensor 1380J, a touch sensor 1380K, an ambient light sensor 1380L, a bone conduction sensor 1380M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 1300.
  • the electronic device 1300 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 1310 may include one or more processing units.
  • the processor 1310 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching instructions and executing instructions.
  • a memory may also be provided in the processor 1310 to store instructions and data.
  • the memory in the processor 1310 is a cache memory.
  • the memory can store instructions or data that the processor 1310 has just used or used cyclically. If the processor 1310 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 1310, and improves system efficiency.
  • the processor 1310 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, which includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 1310 may include multiple sets of I2C buses.
  • the processor 1310 may be coupled to the touch sensor 1380K, charger, flash, camera 1393, etc., through different I2C bus interfaces.
  • the processor 1310 may couple the touch sensor 1380K through an I2C interface, so that the processor 1310 and the touch sensor 1380K communicate through an I2C bus interface to realize the touch function of the electronic device 1300.
  • the I2S interface can be used for audio communication.
  • the processor 1310 may include multiple sets of I2S buses.
  • the processor 1310 may be coupled with the audio module 1370 through an I2S bus to implement communication between the processor 1310 and the audio module 1370.
  • the audio module 1370 can transmit audio signals to the wireless communication module 1360 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 1370 and the wireless communication module 1360 may be coupled through a PCM bus interface.
  • the audio module 1370 may also transmit audio signals to the wireless communication module 1360 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 1310 and the wireless communication module 1360.
  • the processor 1310 communicates with the Bluetooth module in the wireless communication module 1360 through the UART interface to realize the Bluetooth function.
  • the audio module 1370 may transmit audio signals to the wireless communication module 1360 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 1310 with the display 1394, camera 1393 and other peripheral devices.
  • the MIPI interface includes a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and so on.
  • the processor 1310 and the camera 1393 communicate through a CSI interface to implement the shooting function of the electronic device 1300.
  • the processor 1310 and the display screen 1394 communicate through the DSI interface to realize the display function of the electronic device 1300.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 1310 with the camera 1393, the display screen 1394, the wireless communication module 1360, the audio module 1370, the sensor module 1380, and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 1330 is an interface that complies with the USB standard specifications, and specifically can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 1330 can be used to connect a charger to charge the electronic device 1300, and can also be used to transfer data between the electronic device 1300 and peripheral devices. It can also be used to connect earphones and play audio through earphones. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is merely a schematic illustration, and does not constitute a structural limitation of the electronic device 1300.
  • the electronic device 1300 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the charging management module 1340 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 1340 may receive the charging input of the wired charger through the USB interface 1330.
  • the charging management module 1340 may receive the wireless charging input through the wireless charging coil of the electronic device 1300. While the charging management module 1340 charges the battery 1342, it can also supply power to the electronic device through the power management module 1341.
  • the power management module 1341 is used to connect the battery 1342, the charging management module 1340 and the processor 1310.
  • the power management module 1341 receives input from the battery 1342 and/or the charging management module 1340, and supplies power to the processor 1310, internal memory 1321, display screen 1394, camera 1393, and wireless communication module 1360.
  • the power management module 1341 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 1341 may also be provided in the processor 1310.
  • the power management module 1341 and the charging management module 1340 may also be provided in the same device.
  • the wireless communication function of the electronic device 1300 can be implemented by the antenna 1, the antenna 2, the mobile communication module 1350, the wireless communication module 1360, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 1300 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 1350 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 1300.
  • the mobile communication module 1350 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • the mobile communication module 1350 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 1350 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic wave radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 1350 may be provided in the processor 1310.
  • at least part of the functional modules of the mobile communication module 1350 and at least part of the modules of the processor 1310 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 1370A, a receiver 1370B, etc.), or displays an image or video through a display screen 1394.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 1310 and be provided in the same device as the mobile communication module 1350 or other functional modules.
  • the wireless communication module 1360 can provide wireless communication solutions applied to the electronic device 1300, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
  • the wireless communication module 1360 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 1360 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 1310.
  • the wireless communication module 1360 may also receive a signal to be sent from the processor 1310, perform frequency modulation, amplify, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the electronic device 1300 is coupled with the mobile communication module 1350, and the antenna 2 is coupled with the wireless communication module 1360, so that the electronic device 1300 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 1300 implements a display function through a GPU, a display screen 1394, and an application processor.
  • the GPU is a microprocessor for image processing, which connects the display 1394 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations and is used for graphics rendering.
  • the processor 1310 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display 1394 is used to display images, videos, etc.
  • the display screen 1394 includes a display panel.
  • the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 1300 may include one or N display screens 1394, and N is a positive integer greater than one.
  • the electronic device 1300 can realize a shooting function through an ISP, a camera 1393, a video codec, a GPU, a display screen 1394, and an application processor.
  • the ISP is used to process the data fed back from the camera 1393. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 1393.
  • the camera 1393 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 1300 may include 1 or N cameras 1393, and N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 1300 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 1300 may support one or more video codecs. In this way, the electronic device 1300 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • NPU is a neural-network (NN) computing processor.
  • through the NPU, applications such as intelligent cognition of the electronic device 1300 can be realized, for example image recognition, face recognition, voice recognition, text understanding, etc.
  • the external memory interface 1320 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 1300.
  • the external memory card communicates with the processor 1310 through the external memory interface 1320 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 1321 may be used to store computer executable program code, where the executable program code includes instructions.
  • the internal memory 1321 may include a program storage area and a data storage area.
  • the storage program area can store an operating system, an application program (such as a sound playback function, an image playback function, etc.) required by at least one function, and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 1300.
  • the internal memory 1321 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the processor 1310 executes various functional applications and data processing of the electronic device 1300 by running instructions stored in the internal memory 1321 and/or instructions stored in a memory provided in the processor.
  • the electronic device 1300 can implement audio functions through an audio module 1370, a speaker 1370A, a receiver 1370B, a microphone 1370C, a headphone interface 1370D, and an application processor. For example, music playback, recording, etc.
  • the audio module 1370 is used to convert digital audio information into an analog audio signal for output, and also used to convert an analog audio input into a digital audio signal.
  • the audio module 1370 can also be used to encode and decode audio signals.
  • the audio module 1370 may be provided in the processor 1310, or part of the functional modules of the audio module 1370 may be provided in the processor 1310.
  • the speaker 1370A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 1300 can listen to music through the speaker 1370A, or listen to a hands-free call.
  • the receiver 1370B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the electronic device 1300 answers a call or plays a voice message, the user can receive the voice by bringing the receiver 1370B close to the ear.
  • the microphone 1370C, also called a "mike" or "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can input a sound signal into the microphone 1370C by speaking close to it.
  • the electronic device 1300 may be provided with at least one microphone 1370C. In other embodiments, the electronic device 1300 may be provided with two microphones 1370C, which can implement noise reduction functions in addition to collecting sound signals. In other embodiments, the electronic device 1300 may also be provided with three, four or more microphones 1370C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 1370D is used to connect wired earphones.
  • the earphone interface 1370D may be a USB interface 1330, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 1380A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 1380A may be disposed on the display screen 1394.
  • the capacitive pressure sensor may include at least two parallel plates with conductive material. When a force is applied to the pressure sensor 1380A, the capacitance between the electrodes changes.
  • the electronic device 1300 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 1394, the electronic device 1300 detects the intensity of the touch operation according to the pressure sensor 1380A.
  • the electronic device 1300 may also calculate the touched position according to the detection signal of the pressure sensor 1380A.
  • touch operations that act on the same touch position but have different touch operation strengths may correspond to different operation instructions. For example: when a touch operation whose intensity is less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
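The threshold behavior above, where the same icon triggers different instructions depending on touch intensity, can be sketched as a simple comparison. The function name and the threshold value below are illustrative assumptions, not values from the patent:

```python
def instruction_for_touch(pressure, first_pressure_threshold=0.5):
    """Sketch of mapping touch intensity on the short-message application
    icon to an instruction: below the first pressure threshold, view the
    short message; at or above it, create a new short message."""
    if pressure < first_pressure_threshold:
        return "view_short_message"
    return "new_short_message"
```

A light press (e.g. intensity 0.2 on a 0-to-1 scale) would open the message, while a firm press (e.g. 0.9) would start a new one.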
  • the gyroscope sensor 1380B may be used to determine the movement posture of the electronic device 1300.
  • in some embodiments, the angular velocity of the electronic device 1300 around three axes (i.e., the x, y, and z axes) can be determined by the gyroscope sensor 1380B.
  • the gyro sensor 1380B can be used for image stabilization.
  • the gyro sensor 1380B detects the shake angle of the electronic device 1300, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the electronic device 1300 through reverse movement to achieve anti-shake.
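A simplified model of the compensation described above: for a shake angle detected by the gyroscope, the image shift on the sensor is roughly the focal length times the tangent of the angle, and the lens module moves the opposite way by the same amount. This is an illustrative sketch under that small-angle assumption; the function name and the specific formula are not taken from the patent:

```python
import math

def lens_compensation_mm(shake_angle_deg, focal_length_mm):
    """Simplified anti-shake model: image shift ~ focal_length * tan(angle);
    the lens moves in the reverse direction by the same distance to
    counteract the shake."""
    shift = focal_length_mm * math.tan(math.radians(shake_angle_deg))
    return -shift  # reverse movement cancels the detected shake
```

For a 1-degree shake with a 26 mm equivalent focal length, the model suggests a compensating lens movement of roughly -0.45 mm.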
  • the gyro sensor 1380B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 1380C is used to measure air pressure.
  • the electronic device 1300 calculates the altitude based on the air pressure value measured by the air pressure sensor 1380C, and assists positioning and navigation.
  • the magnetic sensor 1380D includes a Hall sensor.
  • the electronic device 1300 may use the magnetic sensor 1380D to detect the opening and closing of the flip holster.
  • the electronic device 1300 can detect the opening and closing of the flip according to the magnetic sensor 1380D, and then set features such as automatic unlocking of the flip cover according to the detected opening and closing state of the holster.
  • the acceleration sensor 1380E can detect the magnitude of the acceleration of the electronic device 1300 in various directions (generally along three axes). When the electronic device 1300 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of the electronic device and applied to applications such as horizontal/vertical screen switching, pedometers, etc.
  • the electronic device 1300 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 1300 may use the distance sensor 1380F to measure the distance to achieve fast focusing.
  • the proximity light sensor 1380G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 1300 emits infrared light to the outside through the light emitting diode.
  • the electronic device 1300 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 1300. When insufficient reflected light is detected, the electronic device 1300 may determine that there is no object near the electronic device 1300.
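The proximity decision above reduces to comparing the measured reflected light against a "sufficient" threshold. A minimal sketch; the function name and threshold representation are assumptions for illustration:

```python
def object_nearby(reflected_light, sufficient_threshold):
    """Sketch of the proximity decision: the device emits infrared light via
    the LED and reads the reflection with the photodiode; reflected light at
    or above the threshold means an object is near the device."""
    return reflected_light >= sufficient_threshold
```

With a threshold of 5 (arbitrary units), a reading of 10 indicates a nearby object (e.g. the user's ear during a call) and a reading of 2 indicates none.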
  • the electronic device 1300 can use the proximity light sensor 1380G to detect that the user holds the electronic device 1300 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 1380G can also be used in leather case mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 1380L is used to sense the brightness of the ambient light.
  • the electronic device 1300 can adaptively adjust the brightness of the display screen 1394 according to the perceived brightness of the ambient light.
  • the ambient light sensor 1380L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 1380L can also cooperate with the proximity light sensor 1380G to detect whether the electronic device 1300 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 1380H is used to collect fingerprints.
  • the electronic device 1300 can use the collected fingerprint characteristics to implement fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 1380J is used to detect temperature.
  • the electronic device 1300 uses the temperature detected by the temperature sensor 1380J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 1380J exceeds a threshold, the electronic device 1300 reduces the performance of a processor located near the temperature sensor 1380J in order to reduce power consumption and implement thermal protection.
  • the electronic device 1300 when the temperature is lower than another threshold, the electronic device 1300 heats the battery 1342 to avoid abnormal shutdown of the electronic device 1300 due to low temperature.
  • the electronic device 1300 boosts the output voltage of the battery 1342 to avoid abnormal shutdown caused by low temperature.
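The temperature processing strategy described in the three points above can be sketched as a two-threshold policy: throttle when hot, heat and boost the battery when cold. The threshold values and action names below are illustrative assumptions, not values from the patent:

```python
def thermal_policy(temp_c, high_threshold=45.0, low_threshold=0.0):
    """Sketch of the temperature strategy: above the high threshold, reduce
    the performance of the processor near the sensor; below the low
    threshold, heat the battery and boost its output voltage to avoid
    abnormal shutdown caused by low temperature."""
    actions = []
    if temp_c > high_threshold:
        actions.append("reduce_processor_performance")
    elif temp_c < low_threshold:
        actions.append("heat_battery")
        actions.append("boost_battery_voltage")
    return actions
```

Within the normal range no action is taken; at 50 °C the processor is throttled, and at -5 °C the battery is heated and its output voltage boosted.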
  • the touch sensor 1380K is also called a "touch device".
  • the touch sensor 1380K can be set on the display screen 1394, and the touch screen is composed of the touch sensor 1380K and the display screen 1394, which is also called a "touch screen”.
  • the touch sensor 1380K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 1394.
  • the touch sensor 1380K may also be disposed on the surface of the electronic device 1300, which is different from the position of the display screen 1394.
  • the bone conduction sensor 1380M can acquire vibration signals.
  • the bone conduction sensor 1380M can obtain the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 1380M can also contact the human pulse and receive the blood pressure pulse signal.
  • the bone conduction sensor 1380M may also be provided in the earphone, combined with the bone conduction earphone.
  • the audio module 1370 can parse the voice signal based on the vibration signal of the vibrating bone block of the voice obtained by the bone conduction sensor 1380M, and realize the voice function.
  • the application processor may analyze the heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 1380M, and realize the heart rate detection function.
  • the button 1390 includes a power button, a volume button, and so on.
  • the button 1390 may be a mechanical button. It can also be a touch button.
  • the electronic device 1300 may receive key input, and generate key signal input related to user settings and function control of the electronic device 1300.
  • the motor 1391 can generate vibration prompts.
  • the motor 1391 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations applied to different applications can correspond to different vibration feedback effects.
  • for touch operations acting on different areas of the display screen 1394, the motor 1391 can also produce different vibration feedback effects.
  • different application scenarios (for example: time reminding, receiving information, alarm clock, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 1392 can be an indicator light, which can be used to indicate the charging status, power change, and can also be used to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 1395 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 1395 or pulled out from the SIM card interface 1395 to achieve contact and separation with the electronic device 1300.
  • the electronic device 1300 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 1395 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 1395 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 1395 can also be compatible with different types of SIM cards.
  • the SIM card interface 1395 can also be compatible with external memory cards.
  • the electronic device 1300 interacts with the network through the SIM card to implement functions such as calls and data communication.
  • the electronic device 1300 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 1300 and cannot be separated from the electronic device 1300.
  • the electronic device 1300 shown in FIG. 13 can implement various processes of the methods provided in the embodiments shown in FIGS. 2A to 11 of this application.
  • the operations and/or functions of each module in the electronic device 1300 are used to implement the corresponding processes in the foregoing method embodiments.
  • the device includes a storage medium and a central processing unit.
  • the storage medium may be a non-volatile storage medium.
  • a computer executable program is stored in the storage medium.
  • the central processing unit is connected to the non-volatile storage medium, and executes the computer executable program to implement the methods provided by the embodiments shown in FIG. 2A to FIG. 11 of this application.
  • an embodiment of the present application also provides a computer-readable storage medium that stores a computer program; when the computer program runs on a computer, the computer is caused to execute the methods provided by the embodiments shown in FIG. 2A to FIG. 11 of the present application.
  • the embodiments of the present application also provide a computer program product.
  • the computer program product includes a computer program that, when running on a computer, causes the computer to execute the method provided by the embodiments shown in FIG. 2A to FIG. 11 of the present application.
  • "at least one" refers to one or more.
  • "multiple" refers to two or more.
  • "and/or" describes an association between associated objects and indicates that three relationships are possible; for example, "A and/or B" can mean that A exists alone, that both A and B exist, or that B exists alone, where A and B can be singular or plural.
  • the character “/” generally indicates that the associated objects before and after are in an “or” relationship.
  • "at least one of the following items" and similar expressions refer to any combination of these items, including any combination of a single item or of plural items.
  • "at least one of a, b, and c" can mean: a; b; c; a and b; a and c; b and c; or a, b, and c; where each of a, b, and c can be singular or plural.
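The enumeration above lists every non-empty subset of {a, b, c}. As a quick sanity check of that reading, the short sketch below (illustrative only, not part of the patent) generates those combinations:

```python
from itertools import combinations

# "At least one of a, b, and c" covers every non-empty subset of
# {a, b, c}: 3 single items + 3 pairs + 1 triple = 7 cases in total.
items = ["a", "b", "c"]
subsets = [set(c) for r in range(1, len(items) + 1)
           for c in combinations(items, r)]
print(len(subsets))  # → 7
```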
  • if any function is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the technical solutions of the present application, or the part that contributes to the prior art, can essentially be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and so on.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a screen state control method and an electronic device. According to the method, when the folded/unfolded state of a foldable touch screen starts to change and a user operation is detected in a preset sensing area, the touch screen is set to a disabled state, the disabled state being one in which the touch screen does not respond to the user's touch operations; when the valid duration of the operation ends or the change in the folded/unfolded state ends, the disabled state of the touch screen is ended, the valid duration of the operation being a disabled-state duration preset for the operation, so that erroneous touch operations on the touch screen by the user can be reduced when the user folds/unfolds the touch screen while it is in an enabled state.
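The control flow summarized in the abstract can be sketched as a small state machine: a touch in the preset sensing area while the fold state is changing disables the screen for a preset valid duration, and either the duration expiring or the fold change ending re-enables it. The sketch below is a hypothetical illustration only; the class, method names, event model, and the 0.5 s default duration are assumptions, not taken from the patent claims.

```python
# Hypothetical sketch (not the patented implementation) of the screen-state
# control flow described in the abstract.
class FoldableScreenController:
    def __init__(self, valid_duration_s: float = 0.5):
        self.valid_duration_s = valid_duration_s  # preset disabled-state duration
        self.disabled_until = 0.0                 # timestamp when disable ends
        self.folding = False                      # fold/unfold state is changing

    def on_fold_state_change_start(self) -> None:
        self.folding = True

    def on_fold_state_change_end(self) -> None:
        # Ending the fold/unfold change also ends the disabled state.
        self.folding = False
        self.disabled_until = 0.0

    def on_touch_in_sensing_area(self, now: float) -> None:
        # A user operation in the preset sensing area while the fold state
        # is changing disables the touch screen for the valid duration.
        if self.folding:
            self.disabled_until = now + self.valid_duration_s

    def touch_enabled(self, now: float) -> bool:
        # The screen responds to touches only outside the disabled window.
        return now >= self.disabled_until
```

With these assumptions, a touch at t = 10.0 s during folding disables the screen until t = 10.5 s, while calling `on_fold_state_change_end()` re-enables it immediately regardless of the remaining duration.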
PCT/CN2021/085662 2020-05-07 2021-04-06 Screen state control method and electronic device WO2021223560A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010379242.1 2020-05-07
CN202010379242.1A CN113625865B (zh) 2020-05-07 2020-05-07 Screen state control method and electronic device

Publications (1)

Publication Number Publication Date
WO2021223560A1 true WO2021223560A1 (fr) 2021-11-11

Family

ID=78376944

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/085662 WO2021223560A1 (fr) 2020-05-07 2021-04-06 Screen state control method and electronic device

Country Status (2)

Country Link
CN (1) CN113625865B (fr)
WO (1) WO2021223560A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116301424A (zh) * 2023-03-02 2023-06-23 瑞态常州高分子科技有限公司 Touch recognition system based on a pressure touch sensor
US12032789B1 (en) 2023-08-07 2024-07-09 Motorola Mobility Llc Extendable electronic device that mitigates inadvertent touch input during movement of a flexible display

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110157057A1 (en) * 2009-12-24 2011-06-30 Kyocera Corporation Mobile device, display control program, and display control method
CN106527818A (zh) * 2016-12-16 2017-03-22 广东欧珀移动通信有限公司 Method and device for controlling touch operations on a mobile terminal, and mobile terminal
CN107636573A (zh) * 2016-07-27 2018-01-26 深圳市柔宇科技有限公司 Display interface control method and device for preventing misoperation, and terminal
CN109871147A (zh) * 2019-02-22 2019-06-11 华为技术有限公司 Touch screen response method and electronic device
CN110658936A (zh) * 2018-06-29 2020-01-07 中兴通讯股份有限公司 Edge suppression area control method and device, mobile terminal, and storage medium
CN110837318A (zh) * 2019-10-29 2020-02-25 捷开通讯(深圳)有限公司 Anti-mistouch method and device for a foldable screen of a mobile terminal, and storage medium
WO2020257979A1 (fr) * 2019-06-24 2020-12-30 深圳市柔宇科技有限公司 Method for avoiding accidental touches, electronic apparatus, and computer-readable non-volatile storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2770418A1 (fr) * 2013-02-25 2014-08-27 Samsung Electronics Co., Ltd Electronic apparatus including a touch screen and control method therefor
CN105700709B (zh) * 2016-02-25 2019-03-01 努比亚技术有限公司 Mobile terminal and method for controlling a non-touchable area of the mobile terminal
CN106527799B (zh) * 2016-10-26 2020-01-24 南通奥拓自控设备有限公司 Method and device for preventing keys from being operated by mistake
CN109710111B (zh) * 2018-12-30 2021-05-18 联想(北京)有限公司 Anti-mistouch method and electronic device
CN109840061A (zh) * 2019-01-31 2019-06-04 华为技术有限公司 Method for controlling screen display and electronic device
CN109992189B (zh) * 2019-02-22 2021-05-11 华为技术有限公司 Screen control method, electronic device, and storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110157057A1 (en) * 2009-12-24 2011-06-30 Kyocera Corporation Mobile device, display control program, and display control method
CN107636573A (zh) * 2016-07-27 2018-01-26 深圳市柔宇科技有限公司 Display interface control method and device for preventing misoperation, and terminal
CN106527818A (zh) * 2016-12-16 2017-03-22 广东欧珀移动通信有限公司 Method and device for controlling touch operations on a mobile terminal, and mobile terminal
CN110658936A (zh) * 2018-06-29 2020-01-07 中兴通讯股份有限公司 Edge suppression area control method and device, mobile terminal, and storage medium
CN109871147A (zh) * 2019-02-22 2019-06-11 华为技术有限公司 Touch screen response method and electronic device
WO2020257979A1 (fr) * 2019-06-24 2020-12-30 深圳市柔宇科技有限公司 Method for avoiding accidental touches, electronic apparatus, and computer-readable non-volatile storage medium
CN110837318A (zh) * 2019-10-29 2020-02-25 捷开通讯(深圳)有限公司 Anti-mistouch method and device for a foldable screen of a mobile terminal, and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116301424A (zh) * 2023-03-02 2023-06-23 瑞态常州高分子科技有限公司 Touch recognition system based on a pressure touch sensor
CN116301424B (zh) * 2023-03-02 2023-10-31 瑞态常州高分子科技有限公司 Touch recognition system based on a pressure touch sensor
US12032789B1 (en) 2023-08-07 2024-07-09 Motorola Mobility Llc Extendable electronic device that mitigates inadvertent touch input during movement of a flexible display

Also Published As

Publication number Publication date
CN113625865A (zh) 2021-11-09
CN113625865B (zh) 2023-06-06

Similar Documents

Publication Publication Date Title
WO2021213120A1 (fr) Screen projection method and apparatus, and electronic device
WO2020259452A1 (fr) Full-screen display method and apparatus for a mobile terminal
WO2021052290A1 (fr) Volume adjustment method and electronic device
WO2020168965A1 (fr) Method for controlling a foldable-screen electronic device, and electronic device
CN110536004B (zh) Method for applying multiple sensors to an electronic device having a flexible screen, and electronic device
WO2021052279A1 (fr) Foldable-screen display method and electronic device
WO2020224449A1 (fr) Split-screen display operation method and electronic device
WO2021213164A1 (fr) Method for interaction between application interfaces, electronic device, and computer-readable storage medium
WO2021036771A1 (fr) Electronic device comprising a foldable screen, and display method
WO2021208723A1 (fr) Full-screen display method and apparatus, and electronic device
US20240073305A1 (en) Touchscreen, Electronic Device, and Display Control Method
WO2021036585A1 (fr) Flexible-screen display method and electronic device
CN110286972A (zh) Method for displaying applications on a foldable screen, and electronic device
CN110798568B (zh) Display control method for an electronic device having a foldable screen, and electronic device
WO2021082564A1 (fr) Operation prompting method and electronic device
WO2021180089A1 (fr) Interface switching method and apparatus, and electronic device
WO2020118490A1 (fr) Automatic screen-splitting method, graphical user interface, and electronic device
CN111124201A (zh) One-handed operation method and electronic device
WO2021238370A1 (fr) Display control method, electronic device, and computer-readable storage medium
WO2021052407A1 (fr) Electronic device control method and electronic device
WO2023131070A1 (fr) Electronic device management method, electronic device, and readable storage medium
WO2022143180A1 (fr) Collaborative display method, terminal device, and computer-readable storage medium
WO2021223560A1 (fr) Screen state control method and electronic device
CN110658975A (zh) Mobile terminal control method and apparatus
US20240338163A1 (en) Multi-screen unlocking method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21799453

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21799453

Country of ref document: EP

Kind code of ref document: A1