WO2022037408A1 - Display method and electronic device - Google Patents

Display method and electronic device

Info

Publication number
WO2022037408A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
display screen
area
card
display
Prior art date
Application number
PCT/CN2021/110431
Other languages
English (en)
Chinese (zh)
Inventor
高凌云
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022037408A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1637 - Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1652 - Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being flexible, e.g. mimicking a sheet of paper, or rollable
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 - Structure of client; Structure of client peripherals
    • H04N 21/4104 - Peripherals receiving signals from specially adapted client devices
    • H04N 21/4126 - The peripheral being portable, e.g. PDAs or mobile phones
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/1615 - Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G06F 1/1616 - Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/1626 - Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1637 - Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1641 - Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being formed by a plurality of foldable display components
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1675 - Miscellaneous details related to the relative movement between the different enclosures or enclosure parts
    • G06F 1/1677 - Miscellaneous details related to the relative movement between the different enclosures or enclosure parts for detecting open or closed state or particular intermediate positions assumed by movable parts of the enclosure, e.g. detection of display lid position with respect to main body in a laptop, detection of opening of the cover of battery compartment
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1684 - Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675
    • G06F 1/1694 - Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 - User authentication
    • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0483 - Interaction with page-structured environments, e.g. book metaphor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 9/00 - Individual registration on entry or exit
    • G07C 9/20 - Individual registration on entry or exit involving the use of a pass
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/02 - Constructional features of telephone sets
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/02 - Constructional features of telephone sets
    • H04M 1/0202 - Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M 1/0206 - Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings
    • H04M 1/0208 - Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings, characterized by the relative motions of the body parts
    • H04M 1/0214 - Foldable telephones, i.e. with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • H04M 1/0216 - Foldable in one direction, i.e. using a one degree of freedom hinge
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/7243 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 - Structure of client; Structure of client peripherals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/441 - Acquiring end-user identification, e.g. using personal code sent by the remote control or by inserting a card
    • H04N 21/4415 - Acquiring end-user identification, e.g. using personal code sent by the remote control or by inserting a card, using biometric characteristics of the user, e.g. by voice recognition or fingerprint scanning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 - Indexing scheme relating to G06F 3/048
    • G06F 2203/04803 - Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • the present application relates to the technical field of electronic devices, and in particular, to a display method and electronic device.
  • When a user needs to open a specific interface such as a payment code or a bus code, the user often needs to perform multiple operations. For example, if a user needs to use a payment code while watching a video through a video application on a mobile phone, the user must exit the video application or move it to the background, find the entrance of the payment application and open it, look for the control in the application for viewing the payment code, and finally select that control to view the payment code.
  • As a result, the user's operation is cumbersome and the user experience is poor.
  • The embodiments of the present application disclose a display method and an electronic device, which can conveniently and quickly open a preset interface of an electronic card such as a payment code or a bus code, greatly simplifying the user's operations and improving the user experience.
  • In a first aspect, an embodiment of the present application provides a display method applied to a foldable electronic device. The electronic device includes a first display screen and a second display screen; when the electronic device is in an unfolded state, the light-emitting surface of the first display screen is opposite to the light-emitting surface of the second display screen. The first display screen includes a first display area and a second display area; when the electronic device is in the unfolded state, the first display area and the second display area are in the same plane, and when the electronic device is in a bent state, the angle between the plane where the first display area is located and the plane where the second display area is located is less than 180 degrees. The method includes: detecting a touch operation acting on the first display screen while the electronic device is in the bent state, wherein the touch operation acts on a first touch area of the first display area and also on a second touch area of the second display area; and, in response to the touch operation, displaying a preset interface on the second display screen.
  • When the user needs to open the preset interface, there is no need to search for a control for opening the preset interface, and no need to perform multiple user operations for opening it.
  • When the electronic device is in the bent state, the user can quickly open the preset interface directly through the above-mentioned touch operation acting on the first display screen.
  • The user's operation process is therefore greatly simplified, the device is convenient to use, and the practicability of the electronic device is higher.
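  • The core trigger described above can be summarized as: while the device is bent, a single touch operation must cover one touch area in each of the two display areas of the first display screen before the preset interface is shown on the second screen. The following is a minimal, illustrative sketch of that decision logic; the function and field names (is_bent, touch_areas_in, show, preset_interface) are hypothetical and not taken from the patent.

```python
# Illustrative sketch only; names are assumptions, not the patent's API.

def handle_touch_event(device, touch_event):
    """Show the preset interface on the second screen when a touch operation
    spans both display areas of the first screen while the device is bent."""
    if not device.is_bent():                       # physical state check (0 < angle < 180)
        return
    areas = device.touch_areas_in(touch_event)     # which display areas the touch covers
    if "first_display_area" in areas and "second_display_area" in areas:
        # The touch acts on a first touch area and a second touch area simultaneously.
        device.second_screen.show(device.preset_interface())
```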
  • the angle between the plane where the first display area is located and the plane where the second display area is located is greater than 15 degrees and less than 30 degrees.
  • The intersection line of the plane where the first display area is located and the plane where the second display area is located is the central axis of the first display screen; the projection of the first touch area on the central axis and the projection of the second touch area on the central axis at least partially overlap.
  • Defining the first touch area and the second touch area in this way avoids the situation where the user opens the preset interface by mistake (for example, when the user's thumb touches the first display area to operate the electronic device and the index finger accidentally touches the second display area), thereby improving the user experience.
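  • As a rough illustration of the two constraints above (a bend angle between 15 and 30 degrees, and touch areas whose projections overlap on the central axis), the sketch below checks both conditions before allowing the trigger. Representing each touch-area projection as a 1-D interval along the central axis is an assumption made for brevity.

```python
def projections_overlap(interval_a, interval_b):
    """1-D intervals (start, end) along the central axis overlap at least partially."""
    return max(interval_a[0], interval_b[0]) <= min(interval_a[1], interval_b[1])

def is_valid_trigger(bend_angle_deg, first_touch_proj, second_touch_proj):
    """Both conditions must hold before the preset interface is opened."""
    angle_ok = 15 < bend_angle_deg < 30
    overlap_ok = projections_overlap(first_touch_proj, second_touch_proj)
    return angle_ok and overlap_ok

# Example: a 20-degree bend with overlapping projections triggers the preset interface.
assert is_valid_trigger(20, (10, 40), (30, 60))
assert not is_valid_trigger(20, (10, 20), (30, 60))   # projections do not overlap
```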
  • When the first touch area is located at a first position of the first display area and the second touch area is located at a second position of the second display area, the preset interface displayed on the second display screen is a first interface; when the first touch area is located at a third position of the first display area and the second touch area is located at a fourth position of the second display area, the preset interface displayed on the second display screen is a second interface.
  • That is, for touch operations acting on different positions of the first display screen, the preset interface displayed by the electronic device may be different. The user can therefore quickly open different preset interfaces through touch operations at different positions on the first display screen; the operation is simple, the user's choice is wider, and the practicability of the electronic device is higher.
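  • One way to picture this position-dependent behaviour is a simple lookup from the pair of touch positions to the preset interface shown on the second screen; the region names and the two-entry table below are purely illustrative.

```python
# Hypothetical mapping from (position on the first display area, position on the
# second display area) to the preset interface shown on the second display screen.
PRESET_BY_POSITION = {
    ("first_position", "second_position"): "first_interface",
    ("third_position", "fourth_position"): "second_interface",
}

def preset_for(position_in_area1, position_in_area2):
    return PRESET_BY_POSITION.get((position_in_area1, position_in_area2))

print(preset_for("first_position", "second_position"))   # -> "first_interface"
```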
  • the above-mentioned preset interface displays one or more electronic cards.
  • The electronic cards are preset information such as two-dimensional codes (for example, payment codes, travel codes, or personal business cards), barcodes (for example, payment codes, order codes, or express tracking numbers), personal documents (for example, ID cards and social security cards), transportation tickets (for example, air tickets and train tickets), movie tickets, attraction tickets, bank cards, membership cards, or a record in the memo.
  • In a case where the preset interface displays multiple electronic cards, the multiple electronic cards include a first card and a second card; the preset interface displays the entire content of the first card and displays part of the content of the second card.
  • Alternatively, where the preset interface displays multiple electronic cards including the first card and the second card, the preset interface displays the entire contents of both the first card and the second card, with the first card in a fifth position and the second card in a sixth position, the fifth position and the sixth position being different.
  • The first card may be one or more cards. When the first card is multiple cards, displaying the entire content of the first card on the preset interface specifically means displaying a preset one of the multiple cards of the first card on the preset interface.
  • Likewise, the second card may be one or more cards; displaying the entire content of the second card on the preset interface specifically means displaying a preset one of the multiple cards of the second card on the preset interface.
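  • A preset interface that shows one card in full and only part of the others can be modelled as an ordered stack with a "preferred" card on top. The sketch below is a minimal data-structure illustration; the class and method names are assumptions, not the patent's terminology.

```python
from dataclasses import dataclass, field

@dataclass
class CardStack:
    """Cards shown on the preset interface: the first card is displayed in full,
    the remaining cards are only partially visible behind it."""
    cards: list = field(default_factory=list)   # e.g. ["payment_code", "bus_code"]

    def fully_visible(self):
        return self.cards[0] if self.cards else None

    def partially_visible(self):
        return self.cards[1:]
```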
  • The method further includes: when the preset interface displays the entire content of the first card and part of the content of the second card, detecting a third operation; and, in response to the third operation, displaying the entire content of the second card and part of the content of the first card on the preset interface.
  • the third operation acts on the first display screen.
  • the third operation acts on the second display screen.
  • the third operation acts on the first card displayed on the second display screen.
  • the third operation acts on the second card displayed on the second display screen.
  • Through the third operation, the user may select, according to usage requirements, the electronic card to be preferentially displayed on the preset interface (i.e., the card whose entire content the preset interface displays). The operation is simple, the user's choice is wider, and the practicability of the electronic device is higher.
  • The method further includes: when the first card is displayed at the fifth position of the preset interface and the second card is displayed at the sixth position of the preset interface, detecting a fourth operation; and, in response to the fourth operation, displaying the first card at the sixth position of the preset interface and the second card at the fifth position of the preset interface.
  • the fourth operation acts on the first display screen.
  • the fourth operation acts on the second display screen.
  • the fourth operation acts on the first card displayed on the second display screen.
  • the fourth operation acts on the second card displayed on the second display screen.
  • Through the fourth operation, the user can switch the positions of the electronic cards displayed on the preset interface according to usage requirements and habits. The operation is simple, the user's choice is wider, and the practicability of the electronic device is higher.
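  • The third and fourth operations described above amount to two small state changes on that card layout: bringing a partially shown card to the front, and exchanging the display positions of two fully shown cards. A hedged sketch follows, continuing the hypothetical CardStack example from the earlier snippet.

```python
def bring_to_front(stack, card):
    """Third operation: the selected card is shown in full; the previous front card
    becomes the partially visible one."""
    if card in stack.cards:
        stack.cards.remove(card)
        stack.cards.insert(0, card)

def swap_positions(layout, pos_a, pos_b):
    """Fourth operation: exchange the cards shown at two positions of the preset
    interface (e.g. the fifth and sixth positions). 'layout' is a position -> card dict."""
    layout[pos_a], layout[pos_b] = layout[pos_b], layout[pos_a]
```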
  • Before the preset interface is displayed on the second display screen, the method further includes: collecting biometric information of the user and verifying the biometric information; displaying the preset interface on the second display screen then includes: in response to the verification of the biometric information passing, displaying the preset interface on the second display screen.
  • the biometric information includes at least one of the following: face information, fingerprint information, voiceprint information, or iris information.
  • The user is not aware of the process in which the electronic device collects the user's biometric information and verifies it.
  • The preset interface is displayed on the second display screen only when the biometric verification passes, which ensures the security of the information while not requiring the user to manually perform identity verification, so the user experience is better.
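  • The biometric gate described above can be read as "verify silently, then show": collection and verification happen without explicit user interaction, and the preset interface is rendered only on a successful match. A minimal sketch, with hypothetical collect_biometric, verify_biometric and show calls:

```python
def open_preset_interface(device):
    """Display the preset interface on the second screen only after the user's
    biometric information (face, fingerprint, voiceprint or iris) is verified."""
    sample = device.collect_biometric()          # e.g. a face image from the front camera
    if device.verify_biometric(sample):          # verification passes
        device.second_screen.show(device.preset_interface())
    # On failure the preset interface is simply not shown; the patent text does not
    # prescribe a particular error path here.
```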
  • Before the touch operation acting on the first display screen is detected, the method further includes: displaying the interface of a first application on the first display screen and/or the second display screen. After the preset interface is displayed on the second display screen, the method further includes: receiving a fifth operation; and, in response to the fifth operation, displaying the interface of the first application on the first display screen and/or the second display screen.
  • After the fifth operation, for example, the electronic device is no longer in the bent state and/or no longer detects the above-mentioned touch operation acting on the first display screen.
  • In one example, before the touch operation acting on the first display screen is detected, the electronic device is in the unfolded state and the interface of the first application is displayed on the first display screen.
  • The fifth operation is an unfolding operation. After the user performs the fifth operation, the electronic device is not in a bent state; in response to the fifth operation, the electronic device does not display the preset interface on the second display screen but displays the interface of the first application on the first display screen.
  • When the electronic device is not in a bent state and/or the above-mentioned touch operation acting on the first display screen is no longer detected, the electronic device can resume the display that was presented before the preset interface was displayed.
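  • Putting the fifth operation together with the earlier state: when the device leaves the bent state (or the triggering touch is no longer present), the display falls back to whatever the first application was showing before. The sketch below is only one plausible arrangement of that behaviour; the names are hypothetical.

```python
def on_state_change(device):
    """Fall back to the first application's interface once the trigger no longer holds."""
    if device.showing_preset and (not device.is_bent() or not device.trigger_touch_present()):
        device.showing_preset = False
        # Resume what was shown before the preset interface, e.g. the first application.
        device.first_screen.show(device.previous_interface)
```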
  • In a second aspect, embodiments of the present application provide a method for displaying a graphical user interface (GUI), applied to a foldable electronic device. The electronic device includes a first display screen and a second display screen; when the electronic device is in the unfolded state, the light-emitting surface of the first display screen is opposite to the light-emitting surface of the second display screen. The first display screen includes a first display area and a second display area; when the electronic device is in the unfolded state, the first display area and the second display area are in the same plane, and when the electronic device is in a bent state, the angle between the plane where the first display area is located and the plane where the second display area is located is less than 180 degrees. The method includes: the electronic device detects a touch operation acting on the first display screen, the touch operation acting on a first touch area of the first display area and also on a second touch area of the second display area; and, in response to the touch operation, a preset GUI is displayed on the second display screen.
  • When the user needs to open the preset GUI, there is no need to search for a control for opening the preset GUI, and no need to perform multiple user operations for opening it.
  • When the electronic device is in the bent state, the user can quickly open the preset GUI directly through the above-mentioned touch operation acting on the first display screen.
  • The user's operation process is therefore greatly simplified, the device is convenient to use, and the practicability of the electronic device is higher.
  • When the first touch area is located at the first position of the first display area and the second touch area is located at the second position of the second display area, the preset GUI displayed on the second display screen is a first GUI; when the first touch area is located at the third position of the first display area and the second touch area is located at the fourth position of the second display area, the preset GUI displayed on the second display screen is a second GUI.
  • That is, for touch operations acting on different positions of the first display screen, the preset GUI displayed by the electronic device may be different. The user can therefore quickly open different preset GUIs through touch operations at different positions; the operation is simple, the user's choice is wider, and the practicability of the electronic device is higher.
  • the above-mentioned preset GUI displays one or more electronic cards.
  • The electronic cards are preset information such as two-dimensional codes (for example, payment codes, travel codes, or personal business cards), barcodes (for example, payment codes, order codes, or express tracking numbers), personal documents (for example, ID cards and social security cards), transportation tickets (for example, air tickets and train tickets), movie tickets, attraction tickets, bank cards, membership cards, or a record in the memo.
  • In a case where the preset GUI displays multiple electronic cards, the multiple electronic cards include a first card and a second card; the preset GUI displays the entire content of the first card and displays part of the content of the second card.
  • Alternatively, the multiple electronic cards include the first card and the second card; the preset GUI displays the entire contents of both the first card and the second card, with the first card in the fifth position and the second card in the sixth position, the fifth position and the sixth position being different.
  • The first card may be one or more cards. When the first card is multiple cards, displaying the entire content of the first card in the preset GUI specifically means displaying a preset one of the multiple cards of the first card in the preset GUI.
  • Likewise, the second card may be one or more cards; displaying the entire content of the second card in the preset GUI specifically means displaying a preset one of the multiple cards of the second card in the preset GUI.
  • The method further includes: when the preset GUI displays the entire content of the first card and part of the content of the second card, detecting a third operation; and, in response to the third operation, displaying the entire content of the second card and part of the content of the first card in the preset GUI.
  • Through the third operation, the user may select, according to usage requirements, the electronic card to be preferentially displayed in the preset GUI (i.e., the card whose entire content the preset GUI displays). The operation is simple, the user's choice is wider, and the practicability of the electronic device is higher.
  • The method further includes: when the first card is displayed at the fifth position of the preset GUI and the second card is displayed at the sixth position of the preset GUI, detecting a fourth operation; and, in response to the fourth operation, displaying the first card at the sixth position of the preset GUI and the second card at the fifth position of the preset GUI.
  • Through the fourth operation, the user can switch the positions of the electronic cards displayed in the preset GUI according to usage requirements and habits. The operation is simple, the user's choice is wider, and the practicability of the electronic device is higher.
  • the above-mentioned displaying the preset GUI on the second display screen includes: in response to passing the verification of the user's biometric information, displaying the preset GUI on the second display screen.
  • The user is not aware of the process in which the electronic device collects the user's biometric information and verifies it.
  • The preset GUI is displayed on the second display screen only when the biometric verification passes, which ensures the security of the information while not requiring the user to manually perform identity verification, so the user experience is better.
  • Before the touch operation acting on the first display screen is detected, the method further includes: displaying the GUI of a first application on the first display screen and/or the second display screen. After the preset GUI is displayed on the second display screen, the method further includes: receiving a fifth operation; and, in response to the fifth operation, displaying the GUI of the first application on the first display screen and/or the second display screen.
  • After the fifth operation, for example, the electronic device is no longer in the bent state and/or no longer detects the above-mentioned touch operation acting on the first display screen.
  • In one example, before the touch operation acting on the first display screen is detected, the electronic device is in the unfolded state and the GUI of the first application is displayed on the first display screen.
  • The fifth operation is an unfolding operation. After the user performs the fifth operation, the electronic device is not in a bent state; in response to the fifth operation, the electronic device does not display the preset GUI on the second display screen but displays the GUI of the first application on the first display screen.
  • When the electronic device is not in a bent state and/or the above-mentioned touch operation acting on the first display screen is no longer detected, the electronic device can resume the display that was presented before the preset GUI was displayed.
  • In a third aspect, embodiments of the present application provide a foldable electronic device including a first display screen, a second display screen, a memory, and one or more processors. When the electronic device is in the unfolded state, the light-emitting surface of the first display screen is opposite to the light-emitting surface of the second display screen; the first display screen includes a first display area and a second display area; when the electronic device is in the unfolded state, the first display area and the second display area are in the same plane; and when the electronic device is in a bent state, the angle between the plane where the first display area is located and the plane where the second display area is located is less than 180 degrees. The memory is used to store one or more computer programs, and the one or more processors are used to call the one or more computer programs; the one or more computer programs include instructions that, when executed by the one or more processors, cause the electronic device to perform the display method provided by the first aspect, the second aspect, or any implementation manner of the first aspect or the second aspect.
  • In a fourth aspect, embodiments of the present application provide a computer-readable storage medium including instructions that, when executed on an electronic device, cause the electronic device to perform the display method provided by the first aspect, the second aspect, or any implementation manner of the first aspect or the second aspect.
  • In a fifth aspect, an embodiment of the present application provides a computer program product that, when run on an electronic device, enables the electronic device to perform the display method provided by the first aspect, the second aspect, or any implementation manner of the first aspect or the second aspect.
  • In a sixth aspect, an embodiment of the present application provides a chip that includes at least one processor and an interface circuit and, optionally, a memory; the memory, the interface circuit, and the at least one processor are interconnected through lines, and a computer program is stored in the memory; when the computer program is executed by the processor, the display method provided by the first aspect, the second aspect, or any implementation manner of the first aspect or the second aspect is implemented.
  • The electronic device provided in the third aspect, the computer storage medium provided in the fourth aspect, the computer program product provided in the fifth aspect, and the chip provided in the sixth aspect are all used to execute the display method provided by the first aspect, the second aspect, or any implementation manner of the first aspect or the second aspect. Therefore, for the beneficial effects that can be achieved, reference may be made to the beneficial effects of the display methods provided in the first aspect and the second aspect, which are not repeated here.
  • FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of another electronic device provided by an embodiment of the present application.
  • FIGS. 3-5 are schematic diagrams of physical states of some electronic devices provided by embodiments of the present application.
  • FIGS. 6-9 are schematic diagrams of some human-computer interactions provided by embodiments of the present application.
  • FIGS. 10A-10D are schematic diagrams of some first display screens provided by embodiments of the present application.
  • FIG. 26 is a schematic flowchart of a display method provided by an embodiment of the present application.
  • FIG. 27 is a schematic flowchart of another display method provided by an embodiment of the present application.
  • the electronic device is configured with a foldable display screen (which may be referred to as a folding screen), and the electronic device may be referred to as a foldable electronic device.
  • the folding of the folding screen can also be called the folding of the electronic device, and the physical state of the folding screen can also be called the physical state of the electronic device.
  • the following embodiments refer to the foldable electronic device simply as an electronic device for description.
  • the electronic device may include a first display screen and a second display screen, and the first display screen may include a first display area and a second display area.
  • For the first display screen 200, the second display screen 300, the first display area 201, and the second display area 202, reference may be made to FIGS. 3-5 below; they will not be described in detail for now.
  • the physical states of the electronic device may include three types: an unfolded state, a bent state, and a folded state.
  • the unfolded state may be a state in which the bending angle of the first display screen is equal to 180 degrees.
  • the bent state may be a state in which the bending angle of the first display screen is greater than 0 degrees and less than 180 degrees.
  • the folded state may be a state in which the bending angle of the first display screen is equal to 0 degrees.
  • the bending angle of the first display screen is specifically an angle between the plane where the first display area is located and the plane where the second display area is located.
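  • Since the three physical states are defined purely by this bending angle, they can be classified with a simple comparison. The small helper below reflects the thresholds given above (180 degrees unfolded, 0 degrees folded, anything strictly in between bent); it is an illustration, not code from the patent.

```python
def physical_state(bending_angle_deg):
    """Classify the folding-screen state from the angle between the two display-area planes."""
    if bending_angle_deg >= 180:
        return "unfolded"
    if bending_angle_deg <= 0:
        return "folded"
    return "bent"    # 0 < angle < 180 degrees

assert physical_state(180) == "unfolded"
assert physical_state(25) == "bent"
assert physical_state(0) == "folded"
```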
  • The electronic devices involved in the embodiments of the present application may be, but are not limited to, mobile phones, tablet computers, personal digital assistants (PDAs), handheld computers, wearable electronic devices (such as smart watches and smart bracelets), augmented reality (AR) terminal devices (such as AR glasses), virtual reality (VR) devices (such as VR glasses), smart home devices (such as smart TVs), or other devices such as desktop computers, laptop computers, notebook computers, ultra-mobile personal computers (UMPCs), and netbooks.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may include a processor 110 , an external memory interface 120 , an internal memory 121 , a universal serial bus (USB) interface 130 , a charging management module 140 , a power management module 141 , and a battery 142 , Antenna 1, Antenna 2, Mobile Communication Module 150, Wireless Communication Module 160, Audio Module 170, Speaker 170A, Receiver 170B, Microphone 170C, Headphone Interface 170D, Sensor Module 180, Key 190, Motor 191, Indicator 192, Camera 193 , a display screen 194, and a subscriber identification module (subscriber identification module, SIM) card interface 195 and the like.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, an angle sensor 180M, and the like.
  • the structures illustrated in the embodiments of the present invention do not constitute a specific limitation on the electronic device 100 .
  • The electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or have a different arrangement of components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural-network processing unit (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • The wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared technology (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code Division Multiple Access (WCDMA), Time Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • The display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the display screen 194 of the electronic device 100 is a folding screen.
  • display screen 194 may be an integrally formed flexible folding screen.
  • the first display screen and the second display screen may be different display areas on the flexible folding screen.
  • the display screen 194 may also be a spliced display screen composed of two flexible display screens.
  • the first display screen and the second display screen may be the two flexible display screens.
  • the display screen 194 may also be a spliced screen composed of multiple rigid screens and a flexible screen and other connecting components located between any two rigid screens.
  • For example, the electronic device 100 may include four rigid screens and two flexible screens, wherein two of the rigid screens and the flexible screen between them constitute the first display screen of the electronic device 100, and the other two rigid screens and the flexible screen between them constitute the second display screen of the electronic device 100.
  • For the first display screen and the second display screen, reference may be made to the first display screen 200 and the second display screen 300 shown in FIGS. 3-5 below, which will not be described in detail for now.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the electronic device 100 can realize face unlock, access application lock, etc. through the face information obtained by the photographing function.
  • the ISP is used to process the data fed back by the camera 193 .
  • When the shutter is opened, light is transmitted through the lens to the camera photosensitive element, the light signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • The external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • The speaker 170A is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • The microphone 170C is used to convert sound signals into electrical signals.
  • The user can make a sound with the mouth close to the microphone 170C, thereby inputting the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • The earphone interface 170D can be the USB interface 130, or can be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may be comprised of at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 can obtain the corresponding touch operation intensity according to the detection signal of the pressure sensor 180A.
  • the electronic device 100 may also calculate the position of the touch area on the display screen 194 (referred to as the touch position for short) by the touch operation according to the detection signal of the pressure sensor 180A.
  • the electronic device 100 may also calculate the shape of the above-mentioned touch area according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation commands. For example, when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, the instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
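  • As a minimal, non-limiting sketch in Kotlin of the behaviour described above: the same touch position yields different operation commands depending on how the touch operation intensity compares with the first pressure threshold. The threshold value and command names are hypothetical.

```kotlin
// Illustrative sketch only: dispatch different operation commands for touch
// operations at the same position but with different intensities. The
// threshold value and command names are hypothetical.
const val FIRST_PRESSURE_THRESHOLD = 0.5f // hypothetical normalized intensity

fun commandForShortMessageIconTouch(touchIntensity: Float): String =
    if (touchIntensity < FIRST_PRESSURE_THRESHOLD) {
        "VIEW_SHORT_MESSAGE"        // lighter press: view the short message
    } else {
        "CREATE_NEW_SHORT_MESSAGE"  // firmer press: create a new short message
    }
```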
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
  • the gyro sensor 180B may be disposed on the display screen 194 for detecting the bending angle of the display screen 194 .
  • The angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc. In some embodiments, the acceleration sensor 180E may be disposed on the display screen 194 for detecting the bending angle of the display screen 194 .
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • The touch sensor 180K is also called a "touch panel".
  • The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also referred to as a "touchscreen".
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • The touch sensor can transmit the detected touch operation to the application processor, which determines the position and shape of the touch area on the display screen 194 on which the touch operation acts, so as to determine the type of the touch event.
  • Electronic device 100 may provide visual output related to touch operations through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
  • the angle sensor 180M can acquire the angle information and convert it into a usable electrical signal output.
  • the angle sensor 180M may be disposed in the display screen 194 for detecting the bending angle of the display screen 194 .
  • the processor 110 may determine the physical state of the electronic device 100 (ie, the unfolded state, the bent state or the folded state) according to the bending angle of the display screen 194 detected by the angle sensor 180M.
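  • The following Kotlin sketch illustrates, under the assumption that the bending angle is available in degrees, how the physical state could be classified; the boundary handling is an assumption, not part of the claims.

```kotlin
// Illustrative sketch: classify the physical state of the electronic device
// from the bending angle of the display screen 194 (the angle between the
// planes of the two display areas). Boundary handling is an assumption.
enum class PhysicalState { UNFOLDED, BENT, FOLDED }

fun classifyPhysicalState(bendingAngleDegrees: Float): PhysicalState = when {
    bendingAngleDegrees >= 180f -> PhysicalState.UNFOLDED
    bendingAngleDegrees <= 0f -> PhysicalState.FOLDED
    else -> PhysicalState.BENT // 0 degrees < angle < 180 degrees
}
```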
  • The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • Different application scenarios for example: time reminder, receiving information, alarm clock, games, etc.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be contacted and separated from the electronic device 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to realize functions such as call and data communication.
  • the electronic device 100 employs an eSIM, ie: an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • When the bending angle of the display screen 194 detected by the angle sensor 180M falls within a preset interval, the processor 110 may determine that the electronic device 100 is in a bent state.
  • When the electronic device 100 is in the bent state, if the pressure sensor 180A and/or the touch sensor 180K detects a touch operation on the first display screen, a corresponding hardware interrupt may be reported to the processor 110.
  • In response to the hardware interrupt, the processor 110 controls the second display screen to display a preset interface.
  • the preset interval may be (0°, 180°).
  • The preset interval may also be (0°, 90°), (0°, 60°), (0°, 45°), (0°, 30°), (0°, 15°), (15°, 30°), (15°, 45°), (15°, 60°), (15°, 90°), etc.
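  • A minimal Kotlin sketch of this trigger logic, assuming the bending angle and the touch detection result are already available; the interval boundaries and the callback are illustrative only.

```kotlin
// Illustrative sketch: show the preset interface on the second display screen
// only when the bending angle falls within a preset interval and a touch
// operation on the first display screen is detected. The interval boundaries
// and the callback are assumptions for illustration.
data class OpenInterval(val minDeg: Float, val maxDeg: Float) {
    fun contains(angleDeg: Float) = angleDeg > minDeg && angleDeg < maxDeg
}

class PresetInterfaceTrigger(
    private val presetInterval: OpenInterval,    // e.g. OpenInterval(0f, 90f)
    private val showPresetInterface: () -> Unit  // drives the second display screen
) {
    fun onUpdate(bendingAngleDeg: Float, touchOnFirstScreen: Boolean) {
        if (presetInterval.contains(bendingAngleDeg) && touchOnFirstScreen) {
            showPresetInterface()
        }
    }
}
```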
  • the above-mentioned preset interface may include at least one electronic card.
  • The electronic card may be a QR code such as a payment code, a travel code, or a personal business card; a barcode such as a payment code, an order code, or an express tracking number; a personal document such as an ID card or a social security card; a transportation ticket such as an air ticket or a train ticket; a movie ticket; an attraction ticket; a bank card; a membership card; or other preset information such as a note recorded in the memo.
  • the content displayed on the preset interface may also be an icon of at least one application program, and the embodiment of the present application does not limit the specific content displayed on the preset interface.
  • the above-mentioned first display screen may include a first display area and a second display area.
  • For the first display area and the second display area, refer to the first display area 201 and the second display area 202 shown in FIGS. 3-5 below.
  • the bending angle of the display screen 194 and the bending angle of the first display screen may specifically be the angle between the plane where the first display area is located and the plane where the second display area is located.
  • the above touch operation acts on the first touch area and the second touch area of the first display screen, the first touch area is located in the first display area, and the second touch area is located in the second display area.
  • the application processor may sleep, and optionally, some sensors may also sleep.
  • the application processor is woken up only when preset conditions are met.
  • the application processor determines whether the electronic device 100 is in a bent state according to the angle detected by the angle sensor 180M, and determines whether the above-mentioned touch operation acting on the first display screen is received according to the detection signals of the pressure sensor 180A and/or the touch sensor 180K. Therefore, unnecessary power consumption overhead is avoided, and the battery life of the electronic device 100 is improved.
  • The preset conditions are, for example, but not limited to: the bending angle of the display screen 194 detected by the angle sensor 180M changes, the gyro sensor 180B and/or the acceleration sensor 180E detects a hovering operation or a lifting operation, or the pressure sensor 180A and/or the touch sensor 180K detects a touch operation, and the like.
  • the above-mentioned preset interface is an interface that is displayed only when the user's biometric information is verified.
  • the biometric information may be, but is not limited to, face information, fingerprint information, voiceprint information, iris information, pulse information, heart rate information, gait information, and the like.
  • For example, when the electronic device 100 is in the bent state and the above-mentioned touch operation acting on the first display screen is detected, a corresponding hardware interrupt may be reported to the processor 110.
  • the processor 110 may instruct the camera 193 to acquire the user's face information and verify the face information.
  • the processor 110 controls the second display screen to display a preset interface only when the face information verification is passed.
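  • The gating of the preset interface behind biometric verification could be sketched in Kotlin as follows; verifyFace and showPresetInterface are hypothetical callbacks, not APIs of any particular platform.

```kotlin
// Illustrative sketch: display the preset interface only after the user's
// biometric information is verified. verifyFace and showPresetInterface are
// hypothetical callbacks (e.g. backed by the camera 193 and the second
// display screen), not APIs of any particular platform.
fun onTouchInBentState(verifyFace: () -> Boolean, showPresetInterface: () -> Unit) {
    if (verifyFace()) {
        showPresetInterface() // shown only when face verification passes
    }
}
```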
  • the gyro sensor 180B and/or the acceleration sensor 180E provided in the display screen 194 may also be used to detect the bending angle of the display screen 194 .
  • the gyro sensor 180B can detect the magnitude of the angular velocity of the display screen 194 in various directions (generally three axes).
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the display screen 194 in various directions (generally three axes).
  • the processor 110 may determine the bending angle of the display screen 194 according to the angular velocity information detected by the gyro sensor 180B and/or the acceleration information detected by the acceleration sensor 180E, thereby determining the physical state of the electronic device 100 .
  • the embodiment of the present application does not limit the specific type of the sensor for detecting the physical state.
  • the embodiments of the present application do not limit the specific type of the sensor for detecting the above-mentioned touch operation acting on the first display screen.
  • In the following embodiments, the case where the angle sensor 180M is used for detecting the bending angle of the display screen 194 and the pressure sensor 180A is used for detecting the above-mentioned touch operation on the first display screen is taken as an example for description.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of the electronic device 100 .
  • FIG. 2 is a block diagram of a software structure of an electronic device 100 according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include payment application, banking application, card package application, ticket purchasing application, camera application, map application and other applications.
  • The payment application is, for example, PayPal, Huawei Smart Assistant, Huawei Wallet, and the like.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the above data can include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the preset interface including the payment code may include a view for displaying text and a view for displaying a QR code picture.
  • the phone manager is used to provide the communication function of the electronic device 100 .
  • For example, the management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications that appear on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • the sensor driver can be used to drive multiple sensors in the control hardware, such as pressure sensor 180A, gyro sensor 180B, acceleration sensor 180E, touch sensor 180K, angle sensor 180M and other sensors shown in FIG. 1 .
  • the workflow of the software and hardware of the electronic device 100 will be exemplarily described below in conjunction with the scenario of triggering the display of the preset interface.
  • the processor 110 may determine that the electronic device 100 is in a bending state according to the angle detected by the angle sensor 180M.
  • When the electronic device 100 is in the bent state, if the pressure sensor 180A detects the above-mentioned touch operation on the first display screen, a corresponding hardware interrupt can be reported to the kernel layer.
  • The kernel layer can process the user operation corresponding to the hardware interrupt into an original input event (including information such as the time stamp of the above touch operation and the number, position, area, and shape of the touch areas on the first display screen on which the above touch operation acts).
  • Raw input events are stored at the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer, and determines the corresponding preset interface according to the input event.
  • For example, the preset interface corresponding to the input event may include the payment code of a payment application. In this case, the interface of the application framework layer is called to start the payment application, the display driver is then started by calling the kernel layer, and the preset interface including the payment code of the payment application is displayed through the second display screen.
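  • A Kotlin sketch of the data carried by such an original input event and of a framework-level lookup that maps it to a preset interface; all field, type, and interface names are assumptions for illustration.

```kotlin
// Illustrative sketch: a raw input event as described above (time stamp plus
// the number, position, area and shape of the touch areas) and a framework-
// level lookup that maps it to a preset interface. All names are assumptions.
data class TouchAreaInfo(val x: Int, val y: Int, val areaCells: Int, val shape: String)

data class RawInputEvent(val timestampMs: Long, val touchAreas: List<TouchAreaInfo>)

fun presetInterfaceFor(event: RawInputEvent): String =
    // e.g. a bent-state grip that produces touch areas on both display areas
    // of the first display screen maps to the payment-code interface
    if (event.touchAreas.size >= 2) "PAYMENT_CODE_INTERFACE" else "NONE"
```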
  • FIG. 3 is a schematic diagram of an unfolded state of an electronic device provided by an embodiment of the present application.
  • (A) of FIG. 3 shows the first display screen 200 of the electronic device 100 in an unfolded state.
  • (B) of FIG. 3 shows the second display screen 300 of the electronic device 100 in the unfolded state.
  • the first display screen 200 of the electronic device 100 may include three areas, that is, a first display area 201 , a second display area 202 , and a third area 203 .
  • the third region 203 can be bent. Both ends of the bent portion of the electronic device 100 are respectively connected to the first display area 201 and the second display area 202 .
  • When the angle α between the two ends of the bending portion is equal to 180 degrees, the electronic device 100 can be in the unfolded state.
  • the plane where the first display area 201 is located intersects with the plane where the second display area 202 is located, and the intersection line may be referred to as a central axis.
  • the angle between the two ends of the above-mentioned bending part is also the angle between the plane where the first display area 201 is located and the plane where the second display area 202 is located.
  • When the angle α between the plane where the first display area 201 is located and the plane where the second display area 202 is located is equal to 180 degrees, the electronic device 100 can be in the unfolded state.
  • the first display area 201 and the second display area 202 are located on the same plane, that is, the plane where the first display screen 200 shown in (A) of FIG. 3 is located.
  • the electronic device 100 shown in FIG. 3(B) can be obtained by flipping the electronic device 100 shown in FIG. 3(A) around the central axis by 180 degrees. That is to say, when the electronic device is in the unfolded state, the light emitting surface of the first display screen 200 and the light emitting surface of the second display screen 300 shown in (A) of FIG. 3 are opposite to each other.
  • the second display screen 300 of the electronic device 100 may include three areas, namely, a fourth area 301 , a fifth area 302 , and a sixth area 303 .
  • the fifth region 302 can be bent. Two ends of the bent portion of the electronic device 100 are respectively connected to the fourth area 301 and the sixth area 303 .
  • the electronic device 100 may include at least one camera.
  • the electronic device 100 may include a camera 3031 (including the camera 3031A and the camera 3031B) disposed in the upper part of the sixth area 303 , and a camera 3032 (including the camera 3032A and the camera 3032B) in the lower part of the sixth area 303 .
  • the electronic device 100 may obtain the user's face information through the above-mentioned camera, and perform face verification according to the obtained face information.
  • the electronic device 100 may also only include the camera 3031 disposed on the upper part of the sixth area 303 .
  • the electronic device 100 may only include the camera 3032 disposed at the lower part of the sixth area 303 .
  • the electronic device 100 may also include a camera disposed in the fourth area 301 and/or the fifth area 302 .
  • the electronic device 100 may also include a camera disposed on the first display screen 200 , which is not limited in this embodiment of the present application.
  • The electronic device 100 in the unfolded state shown in FIG. 3 may be partially folded (for example, the angle between the two ends of the bent portion may be changed from α to β) to obtain the electronic device 100 in a bent state, such as the electronic device 100 shown in FIG. 4.
  • FIG. 4 is a schematic diagram of a bent state of an electronic device provided by an embodiment of the present application.
  • (A) of FIG. 4 shows the first display screen 200 of the electronic device 100 in a bent state.
  • (B) of FIG. 4 shows the second display screen 300 of the electronic device 100 in a bent state.
  • When the angle β between the two ends of the bending portion is greater than 0 degrees and less than 180 degrees, the electronic device 100 may be in a bent state.
  • For example, the value range of β is (0°, 90°), (0°, 60°), (0°, 45°), (0°, 30°), (0°, 15°), (15°, 30°), (15°, 45°), (15°, 60°) or (15°, 90°), etc.
  • the electronic device 100 shown in FIG. 4(B) may be obtained by turning the electronic device 100 shown in FIG. 4(A) 180 degrees around the central axis.
  • The electronic device 100 in the unfolded state shown in FIG. 3 can be folded (for example, the angle between the two ends of the bent portion can be changed from α to γ) to obtain the electronic device 100 in the folded state.
  • The electronic device 100 in the bent state shown in FIG. 4 can also be folded (for example, the angle between the two ends of the bent portion can be changed from β to γ) to obtain the electronic device 100 in the folded state, for example, as shown in FIG. 5.
  • FIG. 5 is a schematic diagram of a folded state of an electronic device provided by an embodiment of the present application.
  • (A) of FIG. 5 shows the fifth area 302 and the sixth area 303 of the electronic device 100 in the folded state.
  • (B) of FIG. 5 shows the fourth area 301 and the fifth area 302 of the electronic device 100 in the folded state.
  • When the included angle γ between the two ends of the bending portion is 0 degrees, the electronic device 100 may be in a folded state.
  • the electronic device 100 shown in FIG. 5(B) may be obtained by turning the electronic device 100 shown in FIG. 5(A) 90 degrees around the central axis.
  • the first display screen 200 and the second display screen 300 may be different display areas of the same flexible folding screen.
  • The first display area 201, the second display area 202, the third area 203, the fourth area 301, the fifth area 302, and the sixth area 303 may be different areas on the flexible folding screen, all of which are used to display a graphical user interface (GUI), which is subsequently collectively referred to as a user interface.
  • the first display screen 200 may be one flexible folding screen of the electronic device 100
  • the second display screen 300 may be another flexible folding screen of the electronic device 100
  • the first display area 201, the second display area 202, and the third area 203 may be different areas on the above-mentioned one flexible folding screen, all of which are used to display the user interface.
  • the fourth area 301, the fifth area 302, and the sixth area 303 may be different areas on the above-mentioned another flexible folding screen, all of which are used for displaying a user interface.
  • the first display screen 200 and the second display screen 300 may be formed by splicing multiple screens.
  • the first display screen 200 may be formed by splicing two rigid screens and one flexible folding screen.
  • the first display area 201 and the second display area 202 may be the display areas on the above two rigid screens, respectively, and the third area 203 may be the display area on the above one flexible folding screen.
  • the second display screen 300 is similar to the first display screen 200 and will not be repeated here.
  • the first display screen 200 may be formed by splicing two rigid screens and a chain connecting the two rigid screens.
  • The first display area 201 and the second display area 202 are the display areas on the two rigid screens, respectively, and the third area 203 is the above-mentioned component, such as the chain connecting the two rigid screens, and is not a display area.
  • the second display screen 300 is similar to the first display screen 200 and will not be repeated here.
  • the second display screen 300 is a rigid screen.
  • the sixth area 303 is the display area on the rigid screen, and the fourth area 301 and the fifth area 302 are not the display areas on the rigid screen.
  • the first display screen 200 is similar to the second display screen 300 and will not be described again.
  • In the following embodiments, the case where the first display screen 200 is a flexible folding screen and the second display screen 300 is a rigid screen is taken as an example for description.
  • the first display screen 200 displays a user interface through a first display area 201 , a second display area 202 and a third area 203 .
  • the second display screen 300 displays the user interface through the sixth area 303, and the fourth area 301 and the fifth area 302 are not display areas.
  • When the electronic device 100 is in a bent state, if a touch operation acting on the first display screen is detected, the electronic device 100 may display a preset interface on the second display screen.
  • the above-mentioned bent state may be a physical state obtained by the electronic device 100 in the unfolded state after receiving the folding operation, and specific examples are shown in FIGS. 6-7 .
  • the above-mentioned bent state may also be a physical state obtained by the electronic device 100 in the folded state receiving the unfolding operation, and specific examples are shown in FIGS. 8-9 .
  • the above-mentioned bent state may also be the original physical state of the electronic device 100 , which is not limited in this embodiment of the present application.
  • For the detection of the above-mentioned bent state, reference may be made to the description of FIG. 1 above.
  • FIG. 6 exemplarily shows a comparison diagram before and after the electronic device 100 displays a preset interface.
  • (A) of FIG. 6 shows the electronic device 100 in an unfolded state before displaying the preset interface
  • FIG. 6(B) shows the electronic device 100 in a bent state after displaying the preset interface.
  • The electronic device 100 displays a user interface 610 on the first display screen 200, and the user can watch the video in the video application through the user interface 610. At this time, the electronic device 100 may be in the unfolded state.
  • The user can fold the electronic device 100 in the unfolded state shown in (A) of FIG. 6 and (A) of FIG. 7 (that is, the angle between the two ends of the bent portion can be changed from α to β) to obtain the electronic device 100 in a bent state.
  • When folding the electronic device 100, the user may sandwich a finger (e.g., the thumb) between the first display area 201 and the second display area 202, that is, perform the above-mentioned touch operation on the first display screen.
  • In response to the above operations, the electronic device 100 may display the user interface 620 shown in (B) of FIG. 6.
  • the electronic device 100 may display the user interface 620 on the sixth area 303 of the second display screen 300 . At this time, the electronic device 100 is in a bent state.
  • User interface 620 may include an electronic card: payment code 621 of the first payment application. The user can perform a payment operation through the payment code 621 of the first payment application.
  • After the above touch operation ends, for example, when the user pulls out the finger sandwiched between the first display area 201 and the second display area 202 and unfolds the electronic device to the unfolded state, the electronic device 100 can cancel the display of the user interface 620.
  • the electronic device 100 may redisplay the user interface 610 or other user interfaces of the above video application on the first display screen 200 . The user can quickly and conveniently return to the user interface 610 before payment to continue watching the video, and the user experience is better.
  • FIG. 8 exemplarily shows a comparison diagram before and after the electronic device 100 displays a preset interface.
  • (A) of FIG. 8 shows the electronic device 100 in a folded state before displaying the preset interface
  • FIG. 8(B) shows the electronic device 100 in a bent state after displaying the preset interface.
  • the electronic device 100 may be in a locked screen state, and the user interface 800 is displayed on the sixth area 303 . At this time, the electronic device 100 may be in a folded state, and for details, please refer to the side view of the electronic device 100 shown in (A) of FIG. 9 .
  • The user can partially unfold the electronic device 100 in the folded state shown in (A) of FIG. 8 and (A) of FIG. 9 (that is, the angle between the two ends of the bent portion can be changed from γ to β) to obtain the electronic device 100 in a bent state.
  • the user may sandwich a finger (eg, thumb) between the first display area 201 and the second display area 202, that is, to perform the above-mentioned touch operation on the first display screen.
  • the electronic device 100 may display the user interface 620 shown in (B) of FIG. 8 .
  • the physical state of the electronic device 100 is as shown in (B) of FIG. 9 .
  • The descriptions of (B) of FIG. 8 and (B) of FIG. 9 are the same as those of (B) of FIG. 6 and (B) of FIG. 7, and will not be repeated here.
  • When the user's finger (e.g., the thumb) is sandwiched between the first display area 201 and the second display area 202, the user's finger is in contact with the first display screen 200. The first touch area is the area where the pad of the user's finger contacts the first display area 201, and the second touch area is the area where the back of the user's finger contacts the second display area 202.
  • Schematic diagrams of the first touch area and the second touch area in the first display screen 200 are specifically shown in FIGS. 10A-10D .
  • FIG. 10A exemplarily shows a schematic diagram of a first display screen 200 .
  • the first display screen 200 may be provided with a capacitive pressure sensor and/or a touch sensor, and any small grid area on the first display screen 200 may be provided with electrodes.
  • When the electronic device 100 detects a touch operation acting on the first display screen 200, the capacitance between electrodes in at least one small grid area included in the touch area of the touch operation will change. That is to say, any small grid area on the first display screen 200 may correspond to a capacitance change value.
  • FIG. 10A is described by taking an example that the plane where the first display screen 200 is located is a two-dimensional plane.
  • Assume that the small grid area in the lower left corner of the first display screen 200 is the coordinate origin (0,0), the axis where the central axis is located is the y-axis, and the axis perpendicular to the axis where the central axis is located is the x-axis.
  • Any small grid area on the first display screen 200 may correspond to a coordinate point in a two-dimensional coordinate system.
  • The above-mentioned first touch area and second touch area are introduced below through FIGS. 10B to 10D.
  • FIG. 10B exemplarily shows a schematic diagram of a capacitance change value of the first display screen 200 .
  • Any coordinate point in the two-dimensional coordinate system shown in FIG. 10B may correspond to a small grid area, and the small grid area may correspond to a capacitance change value.
  • the capacitance change value of the small grid area corresponding to the coordinate origin (0,0) is -13.
  • The capacitance change value of the small grid area corresponding to point A (x1, 0) is -1.
  • The capacitance change value of the small grid area corresponding to point B (x2, 0) is -5.
  • The capacitance change value of the small grid area corresponding to point C (x3, 0) is 7.
  • The capacitance change value of the small grid area corresponding to point D (0, y1) is 4.
  • The coordinate value of any coordinate point in the two-dimensional coordinate system shown in FIG. 10B in the x-axis direction is greater than or equal to 0 and less than or equal to x3, and the coordinate value in the y-axis direction is greater than or equal to 0 and less than or equal to y1.
  • Depending on its coordinate values, the small grid area corresponding to a coordinate point may be within the first display area 201, within the second display area 202, or within the third area 203.
  • the electronic device 100 can obtain information such as the number, position, area, and shape of the touch area for the touch operation according to the capacitance change value of the first display screen 200 .
  • the electronic device 100 can obtain the first touch area and the second touch area circled in FIG. 10B .
  • the first touch area is located in the first display area 201
  • the second touch area is located in the second display area 202 .
  • the area of the first touch area is larger than that of the second touch area.
  • the shape of the first touch area is an ellipse
  • the shape of the second touch area is a circle.
  • the projection of the first touch area on the central axis and the projection of the second touch area on the central axis partially overlap, as shown in FIG. 10C .
  • The electronic device 100 may identify the first touch area, which has an elliptical shape and a larger area, as the area touched by the pad of the user's finger, and may identify the second touch area, which has a circular shape and/or a smaller area, as the area touched by the back of the user's finger.
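  • A minimal Kotlin sketch of this identification rule, assuming the area (in grid cells) and shape of each touch region have already been extracted from the capacitance change values; the TouchRegion type is an assumption.

```kotlin
// Illustrative sketch: identify which touch region is touched by the finger
// pad and which by the finger back, using the area (and shape) comparison
// described above. The TouchRegion type is an assumption.
data class TouchRegion(val areaCells: Int, val elliptical: Boolean)

// Returns (fingerPadRegion, fingerBackRegion): the larger, typically
// elliptical region is taken as the finger pad; the smaller, typically
// circular region as the finger back.
fun identifyPadAndBack(first: TouchRegion, second: TouchRegion): Pair<TouchRegion, TouchRegion> =
    if (first.areaCells >= second.areaCells) first to second else second to first
```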
  • the shape of the first touch area is an ellipse, and the long axis of the ellipse is more inclined to the x-axis, so it can be obtained that the direction corresponding to the first touch area is the x-axis direction.
  • the long axis of the ellipse may also be more inclined to the y-axis, and it can be obtained that the direction corresponding to the first touch area is the y-axis direction.
  • FIG. 10D only shows a schematic diagram of capacitance change values of the first touch area and surrounding small grid areas, and the capacitance change values of the unshown small grid areas may be consistent with FIG. 10B .
  • The axis to which the above-mentioned long axis is more inclined can be obtained according to the magnitude relationship between the angle between the long axis and the x-axis (referred to as the first angle) and the angle between the long axis and the y-axis (referred to as the second angle).
  • When the first angle is greater than the second angle, the long axis is more inclined to the y-axis; when the first angle is smaller than the second angle, the long axis is more inclined to the x-axis.
  • When the first angle is equal to the second angle, the direction of the long axis may be either the x-axis direction or the y-axis direction.
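  • The comparison of the first angle and the second angle could be sketched in Kotlin as follows, assuming the long axis is available as a direction vector (dx, dy); this representation is an assumption.

```kotlin
// Illustrative sketch: decide whether the long axis of the touch area is more
// inclined to the x-axis or to the y-axis by comparing the first angle (long
// axis vs. x-axis) with the second angle (long axis vs. y-axis).
import kotlin.math.PI
import kotlin.math.abs
import kotlin.math.atan2

fun longAxisDirection(dx: Double, dy: Double): String {
    var firstAngle = abs(atan2(dy, dx)) * 180.0 / PI        // angle to the x-axis, 0..180
    if (firstAngle > 90.0) firstAngle = 180.0 - firstAngle  // fold into 0..90
    val secondAngle = 90.0 - firstAngle                     // angle to the y-axis
    return when {
        firstAngle < secondAngle -> "x-axis direction"
        firstAngle > secondAngle -> "y-axis direction"
        else -> "either direction" // first angle equals second angle
    }
}
```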
  • In some embodiments, the shapes of the first touch area and the second touch area are preset shapes having a long axis, such as a quasi-circle, a quasi-ellipse, an ellipse, a circle, and the like.
  • The preset interface may be determined according to the direction corresponding to the first touch area or the direction corresponding to the second touch area.
  • The direction corresponding to the first touch area is the direction of the long axis of the preset shape of the first touch area, and the direction corresponding to the second touch area is the direction of the long axis of the preset shape of the second touch area.
  • For example, when the first touch area is the first touch area shown in the foregoing figure, the preset interface displayed by the electronic device 100 may be the user interface 620 shown in (B) of FIG. 6.
  • In other embodiments, the preset interface displayed by the electronic device 100 may be as shown in FIG. 11.
  • the user interface 1100 can display the electronic card of the train ticket, and the user can obtain the information of the train ticket through the user interface 1100 .
  • The preset interface displayed by the electronic device 100 may be determined based on the position of the first touch area in the first display area 201 and/or the position of the second touch area in the second display area 202. For specific examples, see FIG. 12 to FIG. 13.
  • FIG. 12 exemplarily shows a schematic diagram of a first display area 201 .
  • the first display area 201 may include area one 2011 , area two 2012 , and area three 2013 .
  • When the first touch area is located in different areas of the first display area 201, the preset interface displayed by the electronic device 100 may be different.
  • For example, when the first touch area is located in area one 2011, the preset interface displayed by the electronic device 100 may be the user interface 620 shown in (B) of FIG. 6.
  • When the first touch area is located in area two 2012, the preset interface displayed by the electronic device 100 may be the user interface 1100 shown in FIG. 11.
  • When the first touch area is located in area three 2013, the preset interface displayed by the electronic device 100 may include an electronic card with a travel code.
  • the number of areas included in the first display area 201 may be more or less, which is not limited in this embodiment of the present application.
  • the first display area 201 includes the three areas shown in FIG. 12 as an example for description.
  • When the electronic device 100 displays the preset interface, if the position of the first touch area in the first display area 201 changes, the preset interface displayed by the electronic device 100 can also change. For example, as shown in FIG. 13, the user's finger may slide from area one 2011 to area two 2012. In response to the sliding operation, the electronic device 100 may switch the displayed preset interface from the user interface 620 shown in (B) of FIG. 6 to the user interface 1100 shown in FIG. 11.
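  • A Kotlin sketch of the mapping from the area in which the first touch area is located to the displayed preset interface, and of recomputing that mapping when the finger slides into another area; the area and interface names are illustrative assumptions.

```kotlin
// Illustrative sketch: map the position of the first touch area within the
// first display area 201 to a preset interface, and recompute the interface
// when the finger slides into another area. Names are assumptions based on
// the examples above.
enum class InnerArea { AREA_ONE_2011, AREA_TWO_2012, AREA_THREE_2013 }

fun presetInterfaceFor(area: InnerArea): String = when (area) {
    InnerArea.AREA_ONE_2011 -> "PAYMENT_CODE_INTERFACE"   // e.g. user interface 620
    InnerArea.AREA_TWO_2012 -> "TRAIN_TICKET_INTERFACE"   // e.g. user interface 1100
    InnerArea.AREA_THREE_2013 -> "TRAVEL_CODE_INTERFACE"
}

// When the finger slides from area one to area two, the displayed preset
// interface switches from the payment-code interface to the train-ticket one.
fun onFingerMoved(newArea: InnerArea): String = presetInterfaceFor(newArea)
```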
  • the electronic device 100 may display a plurality of labels on the display screen. Different labels can be used to identify different preset interfaces. The user can confirm the corresponding preset interface according to the label, and select the preset interface displayed according to his own needs. For a specific example, see FIG. 14 .
  • FIG. 14 exemplarily shows a schematic diagram of a preset interface.
  • the user interface 140 may include a payment code 141 , a payment code label 142 , a transportation ticket label 143 , and a personal certificate label 144 of the first payment application.
  • the payment code label 142 may be used to identify a preset interface displayed when the first touch area is located in the area one 2011, such as the user interface 140.
  • The transportation ticket label 143 may be used to identify a preset interface displayed when the first touch area is located in area two 2012, such as the user interface 1100 shown in FIG. 11.
  • the personal ID label 144 can be used to identify a preset interface displayed when the first touch area is located in the area three 2013, for example, a preset interface displaying an electronic card of an ID card.
  • a label identifying the preset interface may also be displayed on the first display area 201, which is not limited in this embodiment of the present application.
  • In the above examples, the preset interface includes only one electronic card.
  • the preset interface may also include multiple electronic cards, and the electronic device may display the multiple electronic cards on the preset interface in a preset display manner.
  • For example, the entire content of one card among the multiple electronic cards (this card may be referred to as the preferred card below) is displayed on the preset interface, and only part of the content of the electronic cards other than the preferred card is displayed.
  • For specific examples, see the user interfaces shown in FIG. 15 and FIG. 17.
  • the entire contents of multiple electronic cards are displayed on the preset interface, and the multiple electronic cards are located at different positions on the preset interface. For specific examples, see the user interfaces shown in FIGS. 19-20 .
  • FIG. 15 exemplarily shows a comparison diagram before and after a user slides a finger.
  • a schematic diagram of the user operation of sliding a finger is shown in FIG. 16 .
  • (A) of FIG. 15 shows the preset interface before the user slides the finger
  • FIG. 15(B) shows the preset interface after the user slides the finger.
  • the user interface 150 displayed on the sixth area 303 of the electronic device 100 includes four electronic cards: membership card 151 , access card 152 , transportation card 153 , and bank card 154 .
  • the user interface 150 displays the entire contents of the membership card 151 , and displays part of the contents of the access card 152 , the transportation card 153 , and the bank card 154 . That is, the preferred card in the user interface 150 shown in FIG. 15(A) is the membership card 151 .
  • When the electronic device 100 displays the user interface 150 shown in (A) of FIG. 15, the user's finger can slide down once in the first display area 201 (specifically, as shown in FIG. 16). In response to the sliding operation, the electronic device 100 can switch the preferred card displayed on the user interface 150 from the membership card 151 to the access card 152, that is, the user interface 150 shown in (B) of FIG. 15 is displayed.
  • the electronic device 100 may also switch the positions of the four electronic cards displayed on the user interface 150 .
  • For example, the positions of the four electronic cards in the user interface 150 shown in (A) of FIG. 15 are, from bottom to top: membership card 151, access card 152, transportation card 153, and bank card 154. After the switching, the positions of the four electronic cards in the user interface 150 shown in (B) of FIG. 15 change accordingly.
  • When the sliding direction of the user's finger is different, the preferred card switched to by the electronic device 100 may also be different. For example, when the user's finger slides upward, the electronic device 100 can switch the preferred card displayed on the user interface 150 from the membership card 151 to the bank card 154, as shown in FIGS. 17-18.
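  • One way to model this preferred-card switching is a cyclic rotation through the card stack, sketched below in Kotlin; the wrap-around behaviour is an assumption consistent with the membership card to access card (slide down) and membership card to bank card (slide up) examples.

```kotlin
// Illustrative sketch: switch the preferred card cyclically in response to a
// finger slide on the first display area. Sliding down moves to the next card
// in the stack, sliding up to the previous one (wrapping around).
fun nextPreferredCardIndex(cardCount: Int, currentIndex: Int, slideDown: Boolean): Int =
    if (slideDown) (currentIndex + 1) % cardCount
    else (currentIndex - 1 + cardCount) % cardCount

fun main() {
    val cards = listOf("membership card 151", "access card 152", "transportation card 153", "bank card 154")
    println(cards[nextPreferredCardIndex(cards.size, 0, slideDown = true)])  // access card 152
    println(cards[nextPreferredCardIndex(cards.size, 0, slideDown = false)]) // bank card 154
}
```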
  • FIG. 17 exemplarily shows another comparison diagram before and after a user slides a finger.
  • a schematic diagram of the user operation of sliding a finger is shown in FIG. 18 .
  • (A) of FIG. 17 shows the preset interface before the user slides the finger
  • FIG. 17(B) shows the preset interface after the user slides the finger.
  • The description of (A) of FIG. 17 is the same as that of (A) of FIG. 15, and will not be repeated here.
  • When the electronic device 100 displays the user interface 150 shown in (A) of FIG. 17, the user's finger can slide up once in the first display area 201 (specifically, as shown in FIG. 18). In response to the sliding operation, the electronic device 100 can switch the preferred card displayed on the user interface 150 from the membership card 151 to the bank card 154, that is, the user interface 150 shown in (B) of FIG. 17 is displayed.
  • When the sliding direction of the user's finger is different, the manner in which the electronic device 100 switches the positions of the plurality of electronic cards displayed on the preset interface may also be different. For example, the sliding direction of the user's finger in FIGS. 15-16 is downward, and the sliding direction of the user's finger in FIGS. 17-18 is upward.
  • the positions of the four electronic cards in the user interface 150 shown in (B) of FIG. 15 are different from the positions of the four electronic cards in the user interface 150 shown in (B) of FIG. 17 .
  • For example, the positions of the four electronic cards in the user interface 150 shown in (B) of FIG. 17 are, from bottom to top, the bank card 154, the membership card 151, the access card 152, and the transportation card 153.
  • FIG. 19 exemplarily shows a comparison diagram before and after the user clicks on the screen.
  • (A) of FIG. 19 shows the preset interface before the user clicks the screen
  • FIG. 19(B) shows the preset interface after the user clicks the screen.
  • the user interface 190 displayed by the electronic device 100 on the sixth area 303 includes two electronic cards: an access card 152 and a traffic card 153 .
  • the user interface 190 displays the entire contents of the access card 152 and the traffic card 153 , and the access card 152 is located on the right side of the traffic card 153 .
  • When the electronic device 100 displays the user interface 190 shown in (A) of FIG. 19, the user's finger may click on the first display area 201 once. In response to the click operation, the electronic device 100 can switch the positions of the access card 152 and the traffic card 153 displayed on the user interface 190, that is, display the user interface 190 shown in (B) of FIG. 19. In the user interface 190 shown in (B) of FIG. 19, the access card 152 is located on the left side of the traffic card 153.
  • FIG. 20 exemplarily shows a comparison diagram before and after a user slides a finger.
  • (A) of FIG. 20 shows the preset interface before the user slides the finger
  • (B) of FIG. 20 shows the preset interface after the user slides the finger.
  • the user interface 2000 displayed by the electronic device 100 on the sixth area 303 includes two electronic cards: a payment code 2001 of the first payment application and a payment code 2002 of the second payment application.
  • the user interface 2000 displays the entire contents of the payment code 2001 of the first payment application and the payment code 2002 of the second payment application, and the payment code 2001 of the first payment application is located on the upper side of the payment code 2002 of the second payment application.
  • When the electronic device 100 displays the user interface 2000 shown in (A) of FIG. 20, the user's finger may slide on the first display area 201 once. In response to the sliding operation, the electronic device 100 can switch the positions of the payment code 2001 of the first payment application and the payment code 2002 of the second payment application displayed on the user interface 2000, that is, the user interface 2000 shown in (B) of FIG. 20 is displayed. In the user interface 2000 shown in (B) of FIG. 20, the payment code 2001 of the first payment application is located on the lower side of the payment code 2002 of the second payment application.
  • the preferred card may also include multiple electronic cards.
  • the outline of the access card 152 and the outline of the transportation card 153 in the user interface 190 may also partially overlap without affecting the display of the card content. This embodiment of the present application does not limit this.
  • The operation of clicking the screen in FIG. 19 may also be clicking the access card 152, the traffic card 153, or any area displayed on the second display screen 300. This is not limited in this embodiment of the present application.
  • the area of the second display screen 300 on which the electronic device 100 displays the preset interface may be preset by the system, may be customized in response to user operations, or may be confirmed according to preset rules, This embodiment of the present application does not limit this.
  • An example of confirmation by the electronic device 100 according to the preset rules is as follows:
  • The electronic device 100 first identifies the first touch area, which has an elliptical shape and a larger area, as the area touched by the pad of the user's finger; then determines the area of the first display screen 200 in which the first touch area is located; and finally determines that the area of the second display screen 300 opposite to the above-mentioned area of the first display screen 200 is used for displaying the preset interface.
  • For example, when the first touch area is located in the first display area 201, the fourth area 301 opposite to the first display area 201 is used for displaying the preset interface.
  • Alternatively, the electronic device 100 first identifies the second touch area, which has a circular shape and/or a smaller area, as the area touched by the back of the user's finger; then determines the area of the first display screen 200 in which the second touch area is located; and finally determines that the area of the second display screen 300 opposite to the above-mentioned area of the first display screen 200 is used for displaying the preset interface.
  • For example, when the second touch area is located in the second display area 202, the sixth area 303 opposite to the second display area 202 is used to display the preset interface.
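  • A minimal Kotlin sketch of this preset rule, mapping the touched area of the first display screen to the opposite area of the second display screen; the enum names are illustrative only.

```kotlin
// Illustrative sketch: the preset rule above chooses the area of the second
// display screen that is opposite the touched area of the first display
// screen. The enums are assumptions used only for illustration.
enum class FirstScreenArea { FIRST_DISPLAY_AREA_201, SECOND_DISPLAY_AREA_202 }
enum class SecondScreenArea { FOURTH_AREA_301, SIXTH_AREA_303 }

fun displayAreaForPresetInterface(touchedArea: FirstScreenArea): SecondScreenArea =
    when (touchedArea) {
        FirstScreenArea.FIRST_DISPLAY_AREA_201 -> SecondScreenArea.FOURTH_AREA_301
        FirstScreenArea.SECOND_DISPLAY_AREA_202 -> SecondScreenArea.SIXTH_AREA_303
    }
```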
  • the electronic card included in the preset interface may be preset by the system, may also be customized in response to user operations, or may be confirmed according to preset rules.
  • Examples of the user-defined settings are shown in FIG. 21 to FIG. 25 below.
  • An example of confirmation by the electronic device 100 according to the preset rules is as follows:
  • The above-mentioned preset rule may be: an electronic card that the user has used more than a preset threshold number of times within a preset time is an electronic card included in the preset interface.
  • the above-mentioned preset rule may be: electronic cards such as membership cards and bank cards that have preferential activities are electronic cards included in the preset interface.
  • The above preset rule may be: an electronic card that matches scene information, such as the location and time of the user and the application used before the electronic device 100 displays the preset interface, is an electronic card included in the preset interface.
  • For example, the preset interface displayed at this time may include the electronic card of an air ticket, the electronic card of an ID card, and the like.
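  • The first preset rule (selection by usage frequency) could be sketched in Kotlin as follows; the threshold value and the ElectronicCard type are assumptions.

```kotlin
// Illustrative sketch of the first preset rule above: include only the
// electronic cards used more than a preset threshold number of times within a
// preset time. The threshold and the ElectronicCard type are assumptions.
data class ElectronicCard(val name: String, val useCountInPresetTime: Int)

fun selectCardsForPresetInterface(
    cards: List<ElectronicCard>,
    presetThreshold: Int = 3 // hypothetical value
): List<ElectronicCard> =
    cards.filter { it.useCountInPresetTime > presetThreshold }
        .sortedByDescending { it.useCountInPresetTime }
```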
  • FIG. 21 exemplarily shows a schematic diagram of an embodiment of a user interface.
  • The user interface 2100 may include a first setting interface 2110 and a picture guide 2120.
  • The first setting interface 2110 may include a master control function option 2111, an area association function option 2112, a direction association function option 2113, a direction setting option 2114, a switch function option 2115, and a refresh function option 2116.
  • The master control function option 2111 can be used by the user to enable or disable the function of displaying the shortcut interface.
  • the electronic device 100 may detect a user operation (eg, a click or slide operation) acting on the master control function option 2111, and in response to the operation, the electronic device 100 may turn on or off the function of displaying the shortcut interface.
  • After the function of displaying the shortcut interface is enabled, the user interface 2100 may display the above-mentioned area association function option 2112, direction association function option 2113, direction setting option 2114, switch function option 2115, and refresh function option 2116. Otherwise, the first setting interface 2110 may only display the master control function option 2111.
  • After the function of displaying the shortcut interface is enabled, when the electronic device 100 is in a bent state, if a touch operation acting on the first display screen 200 is detected, the electronic device 100 can display a preset interface on the second display screen 300. Specific examples are shown in FIGS. 6-9, 10A-10D, and 11-20 above.
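  • A Kotlin sketch of how the master control option could gate the sub-options shown on the first setting interface; the data class and option identifiers are assumptions for illustration only.

```kotlin
// Illustrative sketch: the master control option 2111 gates both the shortcut
// interface behaviour and which sub-options appear on the first setting
// interface 2110. The data class and option identifiers are assumptions.
data class ShortcutInterfaceSettings(
    var shortcutInterfaceEnabled: Boolean = false,   // master control option 2111
    var areaAssociationEnabled: Boolean = false,     // option 2112
    var directionAssociationEnabled: Boolean = false // option 2113
)

fun visibleOptionIds(settings: ShortcutInterfaceSettings): List<String> =
    if (settings.shortcutInterfaceEnabled)
        listOf("2111", "2112", "2113", "2114", "2115", "2116")
    else
        listOf("2111") // only the master control option is displayed
```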
  • the area associated function option 2112 can be used by the user to enable or disable the function of the shortcut interface associated with the inner screen area, where the inner screen is the first display screen 200 and the inner screen area is the first display area 201 and/or the second display area 202 .
  • the electronic device 100 may detect a user operation (eg, a click or a swipe operation) acting on the area associated function option 2112, and in response to the operation, the electronic device 100 may enable or disable the function of the area associated shortcut interface on the internal screen.
  • After the function of the shortcut interface associated with the inner screen area is enabled, the electronic device 100 may establish a mapping relationship between the preset interface and the position of the first touch area in the first display area 201 and/or the position of the second touch area in the second display area 202, and the user interface 2100 can display area options (the area options 2117 shown in FIG. 23 below).
  • the user interface 2100 may display the above-mentioned direction associated function option 2113 and direction setting option 2114 .
  • The electronic device 100 can display the corresponding preset interface according to the position of the first touch area in the first display area 201 and/or the position of the second touch area in the second display area 202; specific examples are shown in FIGS. 12-13 above.
  • In this embodiment, the case where the electronic device 100 establishes a mapping relationship between the position of the first touch area in the first display area 201 and the preset interface when the user enables the function of the shortcut interface associated with the inner screen area is taken as an example for description.
  • the direction association function option 2113 can be used by the user to enable or disable the function of the finger direction association shortcut interface.
  • the electronic device 100 may detect a user operation (such as a click or slide operation) acting on the direction association function option 2113, and in response to the operation, the electronic device 100 may enable or disable the function of the finger direction association shortcut interface.
  • After the function of the finger direction associated shortcut interface is enabled, the electronic device 100 can establish a mapping relationship between the preset interface and the direction corresponding to the first touch area and/or the direction corresponding to the second touch area, and the user interface 2100 can display the above direction setting option 2114.
  • the electronic device 100 can display the corresponding preset interface according to the direction corresponding to the first touch area and/or the direction corresponding to the second touch area.
  • The preset interface may be determined according to the direction corresponding to the first touch area or the direction corresponding to the second touch area, as illustrated above.
  • In this embodiment, the case where the user enables the function of associating the shortcut interface with the finger direction and the electronic device 100 establishes a mapping relationship between the direction corresponding to the first touch area and the preset interface is taken as an example.
  • The direction setting option 2114 may include a portrait option 2114A and a landscape option 2114B.
  • The portrait option 2114A can be used by the user to select the preset interface displayed when the direction corresponding to the first touch area is the axial direction of the central axis (i.e., the y-axis direction shown in FIGS. 10A-10D ).
  • The landscape option 2114B can be used by the user to select the preset interface displayed when the direction corresponding to the first touch area is the axial direction of a straight line perpendicular to the central axis (i.e., the x-axis direction shown in FIGS. 10A-10D ).
  • the switch function option 2115 can be used by the user to enable or disable the function of sliding a finger to switch electronic cards.
  • the electronic device 100 may detect a user operation (eg, a tap or swipe operation) acting on the switching function option 2115, and in response to the operation, the electronic device 100 may turn on or off the function of sliding a finger to switch electronic cards.
  • When the preset interface includes an electronic card and the function of the shortcut interface associated with the inner screen area is enabled, after the function of sliding a finger to switch electronic cards is enabled, the electronic device 100 can switch the displayed preset interface according to the user's sliding operation. A specific example is shown in Figure 13.
  • When the preset interface displays multiple electronic cards, the electronic device 100 can switch the positions of the electronic cards displayed on the preset interface according to the sliding operation, as shown in Figure 20. It can be understood that the switching function option 2115 can also be used by the user to enable or disable switching electronic cards through other operations (e.g., click operations).
  • the electronic device 100 can switch the position of the electronic card displayed on the preset interface according to the other operation, and a specific example is shown in FIG. 19 .
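  • A minimal sketch of the card-switching behaviour described above, assuming the preset interface keeps an ordered list of electronic cards and a slide operation simply advances the current index; the card names and the wrap-around behaviour are illustrative assumptions, not the patent's.

```python
# Hypothetical model of switching which electronic card the preset interface
# shows in response to a slide operation. Names and behaviour are assumptions.

from dataclasses import dataclass
from typing import List

@dataclass
class PresetInterface:
    cards: List[str]          # electronic cards mapped to this preset interface
    current: int = 0          # index of the card currently shown

    def on_slide(self, step: int = 1) -> str:
        """Slide operation: advance to the next (or previous) electronic card."""
        self.current = (self.current + step) % len(self.cards)
        return self.cards[self.current]

ui = PresetInterface(cards=["payment_code_app_1", "payment_code_app_2"])
print(ui.on_slide(+1))   # -> payment_code_app_2
print(ui.on_slide(-1))   # -> payment_code_app_1
```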
  • the refresh function option 2116 can be used by the user to enable or disable the function of finger tap to refresh the electronic card.
  • the electronic device 100 may detect a user operation (eg, a tap or swipe operation) acting on the refresh function option 2116, and in response to the operation, the electronic device 100 may turn on or off the function of finger tap to refresh the electronic card. After the function of refreshing the electronic card by clicking with the finger is enabled, the electronic device 100 can refresh the electronic card displayed on the preset interface according to the clicking operation.
  • Picture guide 2120 may include picture examples 2121 and learn more options 2122.
  • the picture example 2121 may include multiple pictures, and the user may obtain a usage example of the function of displaying the shortcut interface through the above-mentioned multiple pictures.
  • Learn more option 2122 may be used by the user to select to view more image examples.
  • the electronic device 100 may detect a user operation (eg, a click operation) acting on the learn more option 2122, and in response to the operation, the electronic device 100 may display a detailed picture example.
  • the above usage example and the above detailed picture example may include any one of the embodiments shown in FIGS. 6-9, 10A-10D, and 11-20 above.
  • The user may click the portrait option 2114A or the landscape option 2114B in the user interface 2100 to select the type of electronic card displayed in the preset interface. A specific example is shown in FIG. 22 .
  • the user interface 2200 may include a first settings interface 2110 and a first selection application interface 2210 .
  • Compared with the first setting interface 2110 shown in FIG. 21, in the first setting interface 2110 of FIG. 22 the portrait option 2114A is in a selected state.
  • The first selection application interface 2210 may include a first payment application option 2211, a second payment application option 2212, a card package application option 2213, and function options of multiple other applications, wherein:
  • The first payment application option 2211 may be used by the user to select or deselect the electronic card in the first payment application as an electronic card included in the preset interface.
  • the electronic device 100 may detect a user operation (eg, a click operation) acting on the first payment application option 2211, and in response to the operation, the electronic device 100 may establish a corresponding relationship between the electronic card in the first payment application and the preset interface.
  • the preset interface displayed by the electronic device 100 may include the electronic card in the first payment application.
  • The electronic device 100 may detect a user operation (such as a click operation) acting on any one of the payment code option 2211A, the payment code option 2211B, and the two-dimensional code business card option 2211C under the first payment application option 2211, and in response to this operation, the electronic device 100 may establish a corresponding relationship between the electronic card of that option and the preset interface.
  • The descriptions of the second payment application option 2212 and the payment code option 2212A, the payment code option 2212B, and the QR code business card option 2212C under it are similar to the descriptions of the first payment application option 2211 and the payment code option 2211A, the payment code option 2211B, and the two-dimensional code business card option 2211C under it.
  • the description of the card package application option 2213 is similar to the description of the first payment application option 2211, and will not be repeated here.
  • the preset interface displayed by the electronic device 100 may include two electronic cards: the payment code of the first payment application and the payment code of the second payment application.
  • the preset interface displayed by the electronic device 100 may be the user interface 2000 shown in FIG. 20 above.
  • The electronic device 100 may display the area options 2117, so as to allow the user to select the type of electronic card displayed on the preset interface when the first touch area is at different positions of the first display area 201. A specific example is shown in Figure 23.
  • User interface 2300 may include a first settings interface 2110 and an area example interface 2310.
  • The first setting interface 2110 may be a user interface displayed by the electronic device 100 after the user enables the area association function option 2112 in the first setting interface 2110 shown in FIG. 21 .
  • The first setting interface 2110 shown in FIG. 23 may not include the direction association function option 2113 and the direction setting option 2114 , and may further include the area options 2117 .
  • the area options 2117 may include area one option 21171 , area two option 21172 , and area three option 21173 .
  • the area example interface 2310 may include interface examples of area one 2011 , area two 2012 , and area three 2013 .
  • The area one option 21171 can be used by the user to select the type of electronic card displayed on the preset interface when the first touch area is located in area one 2011.
  • The area two option 21172 can be used by the user to select the type of electronic card displayed on the preset interface when the first touch area is located in area two 2012.
  • The area three option 21173 can be used by the user to select the type of electronic card displayed on the preset interface when the first touch area is located in area three 2013.
  • The electronic device 100 may detect a user operation (e.g., a click operation) acting on any one of the area options 2117, and in response to the operation, the electronic device 100 may display a user interface for selecting an application. For example, when a click operation acting on the area two option 21172 is detected, in response to the click operation, the user interface displayed by the electronic device 100 can be used by the user to select the type of electronic card displayed on the preset interface when the first touch area is located in area two 2012.
  • Any of the area options 2117 may also include a direction setting option.
  • the area one option 21171 may also include a portrait option 21171A and a landscape option 21171B.
  • The portrait option 21171A can be used by the user to select the type of electronic card displayed on the preset interface when the first touch area is located in area one 2011 and the corresponding direction is the axial direction of the central axis.
  • The landscape option 21171B can be used by the user to select the type of electronic card displayed on the preset interface when the first touch area is located in area one 2011 and the corresponding direction is the axial direction of a straight line perpendicular to the central axis. Specific examples are shown in Figures 24-25.
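  • One possible way to represent these per-area, per-direction selections is a lookup table keyed by the (area, direction) pair; this is a hedged sketch, and the concrete card identifiers are illustrative assumptions rather than the patent's own choices.

```python
# Hypothetical settings model behind the area options 2117: for each area of
# the first display area and each direction of the first touch area, the user
# picks which electronic card the preset interface shows.

CARD_BY_AREA_AND_DIRECTION = {
    ("area_one", "portrait"):   "payment_code",
    ("area_one", "landscape"):  "qr_code_business_card",
    ("area_two", "portrait"):   "bank_card",
    ("area_two", "landscape"):  "train_ticket",
    ("area_three", "portrait"): "membership_card",
    ("area_three", "landscape"): "express_tracking_number",
}

def select_card(area: str, direction: str) -> str:
    """Return the card type the user configured for this area and direction."""
    return CARD_BY_AREA_AND_DIRECTION[(area, direction)]

print(select_card("area_one", "portrait"))   # -> payment_code
```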
  • User interface 2400 may include a first settings interface 2110 and a second selection application interface 2410 .
  • Compared with the first setting interface 2110 shown in FIG. 23, in the first setting interface 2110 of FIG. 24 the portrait option 21171A in the area one option 21171 is selected.
  • the second selection application interface 2410 may include a first payment application option 2411, a second payment application option 2412, a card package application option 2413, a bank application option 2414, a ticket purchase application option 2415 and other functional options of multiple applications.
  • The electronic device 100 may detect a user operation (such as a click operation) acting on any option in the second selection application interface 2410, and in response to the operation, the electronic device 100 may display the electronic card options under that option, as shown in FIG. 25 below.
  • the electronic device 100 may detect a click operation acting on the first payment application option 2411 , and in response to the click operation, the electronic device 100 may display a third selection application interface 2510 .
  • the third selection application interface 2510 may be used by the user to select an electronic card included in the first payment application.
  • the third selection application interface 2510 may include a payment code option 2511 , a payment code option 2512 , and a two-dimensional code business card 2513 .
  • The electronic device 100 can detect a user operation (such as a click operation) acting on any one option in the third selection application interface 2510, and in response to the operation, the electronic device 100 can establish a mapping relationship between the corresponding electronic card in the first payment application and the preset interface.
  • FIG. 26 shows a display method provided by an embodiment of the present application.
  • the method can be applied to the electronic device 100 shown in FIGS. 1 and 2 .
  • the method can also be applied to the electronic device 100 shown in FIGS. 3-5 .
  • the method includes but is not limited to the following steps:
  • S110: The electronic device detects a touch operation acting on the first display screen.
  • When the electronic device detects the touch operation acting on the first display screen, the electronic device is in a bent state.
  • the electronic device is a foldable electronic device.
  • the electronic device includes a first display screen and a second display screen, and the first display screen includes a first display area and a second display area.
  • For the first display screen and the second display screen, please refer to the descriptions of the first display screen 200 and the second display screen 300 in FIGS. 3-5; for the first display area and the second display area, please refer to the descriptions of the first display area 201 and the second display area 202 in FIGS. 3-5.
  • The preset interval can be (0°, 180°); that is, when the electronic device is in the bent state, the angle between the plane where the first display area is located and the plane where the second display area is located is less than 180°.
  • the preset interval may be (0°, 90°), (0°, 60°), (0°, 45°), (0°, 30°), (0°, 15°), ( 15°, 30°), (15°, 45°), (15°, 60°), (15°, 90°), etc.
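  • As a simple illustration (an assumption about one possible implementation, not the disclosure itself), the bent-state decision can be modelled as checking whether the reported bending angle lies inside one such open interval.

```python
# Hypothetical sketch: the device is considered bent when the angle between
# the first display area and the second display area falls inside a preset
# open interval. The chosen interval is just one of the options listed above.

PRESET_INTERVAL = (0.0, 90.0)   # degrees, open interval (0°, 90°)

def is_bent(angle_deg: float, interval=PRESET_INTERVAL) -> bool:
    low, high = interval
    return low < angle_deg < high

print(is_bent(45.0))    # True: within the preset interval
print(is_bent(180.0))   # False: device fully unfolded
```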
  • The above-mentioned bent state may be a physical state obtained when the electronic device in the unfolded state receives a folding operation; specific examples are shown in FIGS. 6-7 above.
  • The above-mentioned bent state may also be a physical state obtained when the electronic device in the folded state receives an unfolding operation; specific examples are shown in FIGS. 8-9 above.
  • the above-mentioned bending state may also be a physical state that the electronic device is in originally, which is not limited in this embodiment of the present application.
  • the above-mentioned touch operation acting on the first display screen acts on the first touch area of the first display area, and also acts on the second touch area of the second display area.
  • the touch operation is an operation in which the user sandwiches a finger (eg, a thumb) between the first display area 201 and the second display area 202 as shown in FIGS. 6-9 above.
  • the area where the pulp of the user's finger contacts the first display area 201 is the first touch area
  • the area where the back of the user's finger contacts the second display area 202 is the second touch area.
  • In some embodiments, the first touch area and the second touch area satisfy at least one of the following: the projection of the first touch area on the central axis and the projection of the second touch area on the central axis at least partially overlap; the area of the first touch area is larger than the area of the second touch area; the area of the first touch area or of the second touch area is within a preset range; and the shape of the first touch area or of the second touch area is a preset shape.
  • the preset shape is a shape with a long axis, such as a circle-like shape, an ellipse-like shape, a circle, and an ellipse.
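  • The sketch below illustrates, under simplifying assumptions (axis-aligned bounding boxes, the central axis taken as the y axis, illustrative thresholds), how the listed conditions on the first and second touch areas might be checked; none of these concrete values come from the patent.

```python
# Hypothetical checks on the two touch areas. Each touch area is modelled as
# an axis-aligned bounding box (x0, y0, x1, y1); the central axis is the y axis.

def y_projection_overlaps(a, b) -> bool:
    """Projections of the two touch areas onto the central axis (y) overlap."""
    return max(a[1], b[1]) < min(a[3], b[3])

def box_area(box) -> float:
    return (box[2] - box[0]) * (box[3] - box[1])

def touch_areas_valid(first_box, second_box,
                      min_area=50.0, max_area=10000.0) -> bool:
    return (
        y_projection_overlaps(first_box, second_box)     # projections overlap
        and box_area(first_box) > box_area(second_box)   # finger pulp > finger back
        and min_area <= box_area(first_box) <= max_area  # area within preset range
    )

# Finger pulp on the first display area, finger back on the second display area.
print(touch_areas_valid((100, 400, 160, 520), (110, 430, 150, 500)))   # True
```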
  • S120: In response to the touch operation, the electronic device displays a preset interface on the second display screen.
  • In some embodiments, there is a mapping relationship between the direction corresponding to the first touch area and/or the direction corresponding to the second touch area and the preset interface.
  • The direction corresponding to the first touch area is the direction of the long axis of the preset shape of the first touch area, and the direction corresponding to the second touch area is the direction of the long axis of the preset shape of the second touch area.
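  • One possible way to estimate the long-axis direction of a roughly elliptical touch area is from the covariance of its sampled contact points; the patent does not prescribe an algorithm, so the eigenvector approach below is an assumption for illustration only.

```python
# Hypothetical sketch: estimate the long-axis angle of a touch area from its
# contact points via the principal axis of the point covariance matrix, then
# classify the direction as portrait (near the central axis / y axis) or not.

import math

def long_axis_angle(points) -> float:
    """Return the long-axis angle in degrees, measured from the x axis."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Orientation of the dominant eigenvector of [[sxx, sxy], [sxy, syy]].
    return math.degrees(0.5 * math.atan2(2 * sxy, sxx - syy))

def is_portrait(points, tolerance_deg: float = 45.0) -> bool:
    """Treat directions closer to the y axis (central axis) as portrait."""
    return abs(abs(long_axis_angle(points)) - 90.0) < tolerance_deg

# A touch area elongated along the y axis -> portrait direction.
print(is_portrait([(0, 0), (1, 10), (-1, -10), (0.5, 5), (-0.5, -5)]))   # True
```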
  • The preset interface displayed by the electronic device 100 may be determined according to the direction corresponding to the first touch area or the direction corresponding to the second touch area, as illustrated above.
  • In some embodiments, there is a mapping relationship between the position of the first touch area in the first display area 201 and/or the position of the second touch area in the second display area 202 and the preset interface.
  • When the first touch area is at the first position and/or the second touch area is at the second position, the preset interface displayed on the second display screen is the first interface; when the first touch area is at the third position and/or the second touch area is at the fourth position, the preset interface displayed on the second display screen is the second interface.
  • For example, the first position is in area one 2011 of the first display area 201 shown in FIGS. 12-13, and the third position is in area two 2012 of the first display area 201 shown in FIGS. 12-13.
  • the projection of the second touch area at the second position on the central axis partially overlaps the projection of the first touch area at the first position on the central axis.
  • the projection of the second touch area located at the fourth position on the central axis partially overlaps the projection of the first touch area located at the third position on the central axis.
  • the first interface may be the user interface 620 shown in FIG. 6 and FIG. 8
  • the second interface may be the user interface 1100 shown in FIG. 11 .
  • The above mapping relationships can be preset by the system, customized in response to user operations, or determined according to preset rules. For the preset rules, please refer to the embodiments shown in FIGS. 21-25.
  • the electronic device 100 may display multiple labels on the display screen. Different labels can be used to identify different preset interfaces. The user can confirm the corresponding preset interface according to the label, and select the displayed preset interface according to his own needs. A specific example is shown in FIG. 14 .
  • the preset interface may display one or more electronic cards.
  • Electronic cards can be, but are not limited to, QR codes such as payment codes, travel codes, and personal business cards; barcodes such as payment codes, order codes, and express tracking numbers; personal documents such as ID cards and social security cards; transportation tickets such as air tickets and train tickets; and other preset information such as movie tickets, attraction tickets, bank cards, membership cards, or a note recorded in the memo.
  • When the preset interface displays one electronic card, the preset interface may be determined based on the position of the first touch area in the first display area 201 and/or the position of the second touch area in the second display area 202.
  • the electronic device may also switch the displayed preset interface according to a user operation (eg, a sliding operation). Specific examples are shown in Figures 12-13.
  • the electronic device may display the multiple electronic cards on the preset interface in a preset display manner.
  • the plurality of electronic cards may include a first card and a second card, each of which may be one or more electronic cards.
  • the preset display mode can include:
  • the entire content of the first card is displayed on the preset interface, and part of the content of the second card is displayed.
  • For a specific example, please refer to the user interface 150 shown in FIG. 15 and FIG. 17 .
  • the first card displayed on the preset interface is located at the fifth position, and the second card is located at the sixth position.
  • the fifth position is different from the sixth position.
  • The method may further include: when the entire content of the first card is displayed on the preset interface and part of the content of the second card is displayed, detecting a third operation; in response to the third operation, displaying the entire content of the second card on the preset interface and displaying part of the content of the first card.
  • the third operation can act on any area on the first display screen and the second display screen. Specific examples are shown in Figures 15-18.
  • the method may further include: displaying the first card in the fifth position of the preset interface, and when displaying the second card in the sixth position of the preset interface, detecting a fourth operation; in response to the fourth operation, The first card is displayed in the sixth position of the preset interface, and the second card is displayed in the fifth position of the preset interface.
  • the fourth operation can act on any area on the first display screen and the second display screen. Specific examples are shown in Figures 15-20.
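  • A minimal sketch of the third/fourth operations described above, assuming the preset interface tracks which card occupies the full-content (fifth) position and which occupies the partial-content (sixth) position; the field and card names are illustrative, not from the patent.

```python
# Hypothetical model: one card shown in full at the fifth position, the other
# shown partially at the sixth position; an operation on either display screen
# exchanges the two cards' positions.

from dataclasses import dataclass

@dataclass
class TwoCardLayout:
    full_view: str      # card whose entire content is displayed (fifth position)
    partial_view: str   # card whose content is partially displayed (sixth position)

    def on_switch_operation(self) -> None:
        """Third/fourth operation: exchange the two cards' positions."""
        self.full_view, self.partial_view = self.partial_view, self.full_view

layout = TwoCardLayout(full_view="payment_code_app_1", partial_view="payment_code_app_2")
layout.on_switch_operation()
print(layout.full_view)      # -> payment_code_app_2
print(layout.partial_view)   # -> payment_code_app_1
```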
  • the method may further include: displaying an interface of the first application on the first display screen and/or the second display screen.
  • the method may further include: receiving a fifth operation; in response to the fifth operation, displaying the interface of the first application on the first display screen and/or the second display screen. Wherein, after the user performs the fifth operation, the electronic device is not in a bent state and/or the electronic device does not detect the above-mentioned touch operation acting on the first display screen.
  • the electronic device displays a video playing interface of the video application on the first display screen (the user interface 610 shown in FIG. 6).
  • the electronic device receives a fifth operation (as shown in FIGS. 6-7 , a user operation of pulling out the finger sandwiched between the first display area 201 and the second display area 202 and unfolding the electronic device to the unfolded state).
  • the electronic device resumes displaying the interface of the video application (the user interface 610 shown in FIG. 6 , or other interfaces such as the home page of the video application) on the first display screen.
  • the method may further include: collecting the biometric information of the user by the electronic device, and verifying the biometric information.
  • the electronic device executes S120 only when the biometric information verification is passed.
  • the biometric information may be, but is not limited to, face information, fingerprint information, voiceprint information, iris information, pulse information, heart rate information, gait information, and the like.
  • the electronic device is in a black screen state, a screen-off display state, or a screen-locked state.
  • the above-mentioned preset interface is a user interface that can only be accessed after the user passes the authentication.
  • the electronic device may acquire the user's face information through the cameras (cameras 3031 and 3032 shown in FIGS. 3-5 ) disposed on the second display screen 300 and perform face verification. If the face verification is passed, the electronic device executes S120. If the face verification fails, the electronic device can verify again. For example, obtain other biometric information of the user for verification, or prompt the user to perform verification operations such as password verification and graphic verification. If the re-verification is passed, the electronic device may execute S120.
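  • The verification flow above can be sketched as trying a sequence of checks and displaying the preset interface (S120) only when one of them succeeds; the verifier callables below are stand-ins for face, fingerprint, or password checks, not real device APIs.

```python
# Hypothetical sketch of the verification flow: face verification first, then
# a fallback check; the preset interface is displayed only on success.

from typing import Callable, Iterable

def verify_then_display(verifiers: Iterable[Callable[[], bool]],
                        display_preset_interface: Callable[[], None]) -> bool:
    for verify in verifiers:              # e.g. face, then fingerprint, then password
        if verify():
            display_preset_interface()    # corresponds to S120
            return True
    return False                          # all checks failed: keep the screen locked

# Illustrative usage with dummy checks: face fails, the fallback passes.
ok = verify_then_display(
    verifiers=[lambda: False, lambda: True],
    display_preset_interface=lambda: print("show preset interface"),
)
print(ok)   # True
```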
  • In this way, the user can quickly open different preset interfaces through different touch operations in the first touch area and the second touch area, and can also switch the displayed preset interface and the positions of multiple electronic cards displayed in the preset interface according to user operations. Not only is the user's operation simple, but the user's choices are also wider, and the practicability of the electronic device is higher.
  • The following description takes as an example that the angle sensor 180M detects the bending angle of the electronic device 100, the pressure sensor 180A detects the touch operation, the biometric information used for identity verification is face information, and the touch area used for determining the preset interface is the first touch area.
  • 1. The display screen 194 is in a bent state. The bent state may be transformed from the unfolded state or the folded state, or may be the original state.
  • 2. The angle sensor 180M detects the bending angle of the display screen 194.
  • 3. The angle sensor 180M reports the bending angle of the display screen 194 to the processor 110.
  • 4. The processor 110 determines that the bending angle of the display screen 194 is within the preset interval, that is, determines that the electronic device 100 is in the bent state.
  • 5. The pressure sensor 180A detects the touch operation acting on the first display screen. The touch operation acts on the first touch area of the first display area and also acts on the second touch area of the second display area.
  • 6. The pressure sensor 180A reports the event of receiving the above-mentioned touch operation acting on the first display screen to the processor 110. The event of the touch operation acting on the first display screen includes information such as the positions, areas, and shapes of the first touch area and the second touch area.
  • 7. The processor 110 instructs the camera 193 to collect face information.
  • 8. The camera 193 collects the face information of the user.
  • 9. The camera 193 sends the collected face information to the processor 110.
  • 10. The processor 110 verifies that the face information collected by the camera 193 passes the verification.
  • 11. The processor 110 determines the direction and/or position corresponding to the first touch area, and determines the preset interface to be displayed according to the above-mentioned direction and/or position corresponding to the first touch area.
  • 12. The processor 110 instructs the display screen 194 to display the above-determined preset interface.
  • 13. The display screen 194 displays the above-determined preset interface.
  • the above 2-6 may correspond to S110 in FIG. 26 .
  • the above 11-13 may correspond to S120 in FIG. 26 .
  • the above 7-10 may correspond to the steps of collecting the biometric information of the user and verifying the biometric information in FIG. 26 .
  • the execution order of the above 7-10 and the above 11 is not limited, and can also be executed simultaneously.
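  • Tying the numbered steps together, a hypothetical end-to-end sketch (with hard-coded stand-ins for the sensor, camera, and display interactions, and an assumed region-to-interface table) might look like the following.

```python
# Hypothetical end-to-end flow: bent-state check (steps 1-4), touch-area check
# (steps 5-6), face verification (steps 7-10), interface selection and display
# (steps 11-13). All concrete values are illustrative assumptions.

from typing import Optional

PRESET_INTERFACE_BY_REGION = {
    "area_one": "payment_code",
    "area_two": "boarding_pass",
    "area_three": "qr_code_business_card",
}

def handle_touch(angle_deg: float, touch_region: str, face_ok: bool) -> Optional[str]:
    """Return the preset interface to display on the second display screen, or None."""
    if not (0.0 < angle_deg < 180.0):                    # steps 1-4: device in bent state?
        return None
    if touch_region not in PRESET_INTERFACE_BY_REGION:   # steps 5-6: valid touch area?
        return None
    if not face_ok:                                      # steps 7-10: identity verified?
        return None
    return PRESET_INTERFACE_BY_REGION[touch_region]      # steps 11-13: pick and display

print(handle_touch(45.0, "area_one", face_ok=True))    # -> payment_code
print(handle_touch(180.0, "area_one", face_ok=True))   # -> None (not bent)
```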
  • The above-mentioned embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • When implemented by software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product described above includes one or more computer instructions.
  • When the computer program instructions described above are loaded and executed on a computer, the procedures or functions described in the present application are produced in whole or in part.
  • the aforementioned computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • The above-mentioned computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the above-mentioned computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) manner.
  • The above-mentioned computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media.
  • The above-mentioned available media may be magnetic media (e.g., floppy disk, hard disk, magnetic tape), optical media (e.g., digital versatile disc (DVD)), or semiconductor media (e.g., solid state disk (SSD)), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present application relates to a display method applied to a foldable electronic device, the electronic device comprising a first display screen and a second display screen; when the electronic device is in an unfolded state, a light-emitting face of the first display screen and a light-emitting face of the second display screen face opposite directions; the first display screen comprises a first display area and a second display area; and when the electronic device is in a bent state, the angle between the plane in which the first display area is located and the plane in which the second display area is located is less than 180 degrees. The method comprises: detecting a touch operation acting on the first display screen, the electronic device being in said bent state, the touch operation acting on a first touch area of the first display area and also acting on a second touch area of the second display area; and, in response to the touch operation, displaying a preset interface on the second display screen. By using the embodiment of the present application, the user operation for checking a preset interface can be greatly simplified, use is more convenient for the user, and a better experience is provided.
PCT/CN2021/110431 2020-08-19 2021-08-04 Procédé d'affichage et dispositif électronique WO2022037408A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010847676.XA CN114173165B (zh) 2020-08-19 2020-08-19 一种显示方法及电子设备
CN202010847676.X 2020-08-19

Publications (1)

Publication Number Publication Date
WO2022037408A1 true WO2022037408A1 (fr) 2022-02-24

Family

ID=80323362

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/110431 WO2022037408A1 (fr) 2020-08-19 2021-08-04 Procédé d'affichage et dispositif électronique

Country Status (2)

Country Link
CN (1) CN114173165B (fr)
WO (1) WO2022037408A1 (fr)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9354742B2 (en) * 2013-04-10 2016-05-31 Samsung Electronics Co., Ltd Foldable electronic device and method of managing visible regions thereof
CN109918165B (zh) * 2019-03-12 2022-07-01 北京小米移动软件有限公司 界面显示方法、装置和存储介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130127918A1 (en) * 2011-11-22 2013-05-23 Samsung Electronics Co., Ltd Flexible display apparatus and method of providing user interface by using the same
CN109889630A (zh) * 2019-01-11 2019-06-14 华为技术有限公司 显示方法及相关装置
CN110286972A (zh) * 2019-05-14 2019-09-27 华为技术有限公司 一种折叠屏显示应用的方法及电子设备
CN110543287A (zh) * 2019-08-01 2019-12-06 华为技术有限公司 一种屏幕显示方法及电子设备
CN110620833A (zh) * 2019-08-16 2019-12-27 华为技术有限公司 一种可折叠的电子设备的显示方法及电子设备
CN110839096A (zh) * 2019-08-30 2020-02-25 华为技术有限公司 一种具有折叠屏的设备的触控方法与折叠屏设备
CN111124561A (zh) * 2019-11-08 2020-05-08 华为技术有限公司 应用于具有折叠屏的电子设备的显示方法及电子设备

Also Published As

Publication number Publication date
CN114173165B (zh) 2023-02-14
CN114173165A (zh) 2022-03-11

Similar Documents

Publication Publication Date Title
WO2021013158A1 (fr) Procédé d'affichage et appareil associé
CN109889630B (zh) 显示方法及相关装置
WO2021213164A1 (fr) Procédé d'interaction entre des interfaces d'application, dispositif électronique et support de stockage lisible par ordinateur
US11687235B2 (en) Split-screen method and electronic device
US10402625B2 (en) Intelligent electronic device and method of operating the same
EP3846427B1 (fr) Procédé de commande et dispositif électronique
WO2020253758A1 (fr) Procédé de disposition d'interface utilisateur et dispositif électronique
WO2021017901A1 (fr) Procédé d'affichage d'écran et dispositif électronique
US20210342044A1 (en) System navigation bar display method, system navigation bar control method, graphical user interface, and electronic device
WO2021063098A1 (fr) Procédé de réponse d'écran tactile, et dispositif électronique
CN111443836B (zh) 一种暂存应用界面的方法及电子设备
WO2021082564A1 (fr) Procédé d'invite d'opération et dispositif électronique
WO2021078032A1 (fr) Procédé d'affichage d'interface utilisateur et dispositif électronique
WO2022052662A1 (fr) Procédé d'affichage et dispositif électronique
WO2022057512A1 (fr) Procédé et appareil à écran divisé et dispositif électronique
WO2022161119A1 (fr) Procédé d'affichage et dispositif électronique
WO2020107463A1 (fr) Procédé de commande de dispositif électronique et dispositif électronique
JP2022501739A (ja) スタイラスペン検出方法、システムおよび関連装置
WO2022160991A1 (fr) Procédé de commande d'autorisation et dispositif électronique
WO2022222752A1 (fr) Procédé d'affichage et appareil associé
WO2022037726A1 (fr) Procédé d'affichage à écran partagé et dispositif électronique
CN113641271A (zh) 应用窗口的管理方法、终端设备及计算机可读存储介质
WO2022022674A1 (fr) Procédé de disposition d'icône d'application et appareil associé
WO2022001279A1 (fr) Procédé de gestion de bureau inter-dispositifs, premier dispositif électronique et second dispositif électronique
WO2021190524A1 (fr) Procédé de traitement de capture d'écran, interface utilisateur graphique et terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21857503

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21857503

Country of ref document: EP

Kind code of ref document: A1