WO2021155770A1 - A display method and electronic device - Google Patents

A display method and electronic device

Info

Publication number
WO2021155770A1
Authority
WO
WIPO (PCT)
Prior art keywords
interface
screen
image
area
display
Prior art date
Application number
PCT/CN2021/074694
Other languages
English (en)
French (fr)
Inventor
梁兵
李�真
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Priority to EP21750469.5A (EP4086740A4)
Publication of WO2021155770A1
Priority to US17/817,087 (US20220374118A1)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1615Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G06F1/1616Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • This application relates to the field of terminal technology, and in particular to a display method and an electronic device.
  • The screens of electronic devices are getting larger and larger.
  • The screen of an electronic device can therefore often be divided into two areas, which meets the user's need to compare the contents of the two areas.
  • For example, through the split-screen mode provided by the electronic device, the user can display the windows of two applications at the same time in order to compare their content. Suppose the user is reading the summary of a document and takes a screenshot of the summary on the current page through a gesture operation. If the user then wants to continue reading the document while referring to the summary, the user has to trigger split-screen mode with another gesture, open the gallery in split-screen mode, and find the screenshot of the summary in the gallery before the document can be browsed against the summary.
  • The operation process of the above method is relatively cumbersome.
  • In view of this, the present application provides a display method and an electronic device to simplify the operation steps required for comparative browsing, and to improve operation efficiency and the user experience.
  • In a first aspect, the present application provides a display method that can be applied to an electronic device with a display screen.
  • The method includes: displaying a first interface of a first application; detecting a first operation of a user; in response to the first operation, displaying the first interface in a first area of the display screen and displaying a second interface in a second area of the display screen; detecting a second operation of the user; and, in response to the second operation, saving a first image corresponding to the first interface and displaying a thumbnail of the first image on the second interface.
  • For example, an interface of application A is displayed on the display screen of the electronic device, and the display screen is then divided into two areas through the first operation of the user; that is, the first operation can start a working mode in which the display screen is divided into two display areas.
  • After the mode is started, the display screen of the electronic device can display interface 1 and interface 2.
  • Interface 2 is the interface of application A that was displayed before the mode was turned on. The electronic device can then respond to the user's operation on interface 2 by saving the image corresponding to interface 2 and displaying that image, in the form of a thumbnail, in interface 1. In this way, the user can directly compare interface 2 with the thumbnail in interface 1 without having to open the gallery and search for the screenshot, which improves operation efficiency and thus the user experience.
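  • The claimed flow can be pictured with the following Kotlin sketch. It is an illustration only, with hypothetical names (DualAreaController, contentArea, thumbnailStrip) not taken from the patent: on the "save" operation it captures the content area into a bitmap and adds a scaled-down thumbnail to the companion area.

```kotlin
// Hypothetical sketch of the first-aspect flow: on the save gesture, capture the
// area displaying the application interface as a bitmap and show it as a
// thumbnail in the companion area. View names are illustrative assumptions.
import android.graphics.Bitmap
import android.graphics.Canvas
import android.view.View
import android.widget.ImageView
import android.widget.LinearLayout

class DualAreaController(
    private val contentArea: View,            // area displaying the application interface
    private val thumbnailStrip: LinearLayout  // area displaying instructions and thumbnails
) {
    private val snapshots = mutableListOf<Bitmap>()

    // Called when the gesture recogniser reports the "save image" operation.
    fun onSaveGesture() {
        val bitmap = Bitmap.createBitmap(
            contentArea.width, contentArea.height, Bitmap.Config.ARGB_8888
        )
        contentArea.draw(Canvas(bitmap))      // software capture of the current interface
        snapshots += bitmap

        val thumbnail = ImageView(contentArea.context).apply {
            setImageBitmap(
                Bitmap.createScaledBitmap(bitmap, bitmap.width / 4, bitmap.height / 4, true)
            )
        }
        thumbnailStrip.addView(thumbnail)     // the two areas can now be compared directly
    }
}
```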
  • In a possible implementation, a text description corresponding to the thumbnail of the first image is also displayed on the second interface.
  • Displaying such a text description on the second interface helps the user clearly understand the content corresponding to the thumbnail, facilitates subsequent operations, and improves the user experience.
  • In a possible implementation, the method further includes: detecting a third operation acting on the thumbnail of the first image, and displaying the first image in the second area.
  • That is, when the user's third operation on the thumbnail of the first image is detected, the first image can be displayed in the second area, which facilitates comparative browsing.
  • For example, the user can tap the thumbnail of the first image, or use a gesture expansion operation, to display the first image in full screen on the second interface, so that while continuing to operate on the first interface the user can compare against the full-screen image on the second interface.
  • In a possible implementation, the method further includes: in response to the second operation, saving a first progress corresponding to the first image.
  • The first image is the image saved when the user's operation on the first interface reaches the first progress.
  • That is, the electronic device can respond to the user's operation by saving the progress corresponding to the first image on the first interface, so that the user can operate on the first interface while checking the saved progress on the second interface and adjust in time when needed.
  • The key information on the first interface can be understood as the pattern formed by each step of the user's operation on the first interface, that is, the progress of what the user has drawn on the first interface.
  • In a possible implementation, the method further includes:
  • detecting a fourth operation acting on the first interface; in response to the fourth operation, displaying a third interface in the first area of the display screen, where the image corresponding to the third interface is a second image, the second image is the image corresponding to a second progress of the user's operation on the first interface, and the second progress is a progress after the first progress; detecting a fifth operation acting on the first image; and, in response to the fifth operation, displaying the first image in the first area of the display screen.
  • That is, the user can continue to perform operations on the first interface, for example performing the next operation on the first image, thereby forming an updated image.
  • Because the updated image results from operations performed after the first image, its progress can be understood as a progress after the progress of the first image. If the user is dissatisfied with the updated image and wants to start again from the first image, the electronic device can respond to the user's fifth operation on the first image by restoring the current progress to the progress of the first image, so that the user can continue operating from that progress, thereby improving the user experience.
  • For example, the user can continue to draw on the first interface, drawing two more steps after the progress of the first image to form the second image; the progress of the second image can be recorded as progress 2.
  • If the user wants to redraw the picture from progress 1, the user can operate on the thumbnail of progress 1 displayed on the second interface and restore the first image corresponding to that thumbnail to the first interface. Being able to save and restore progress at any time improves the user experience; a minimal sketch of such a progress store follows.
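  • The following Kotlin sketch illustrates one way such a progress store might behave. It is an assumption-laden illustration, not the patent's implementation: each saved progress is modelled as a bitmap snapshot, and restoring an earlier progress here simply discards later snapshots so the user can continue from the restored point.

```kotlin
// Illustrative progress store for the save/restore behaviour described above.
// Each "progress" is assumed to be a bitmap snapshot of the drawing interface.
import android.graphics.Bitmap

class ProgressStore {
    private val saved = mutableListOf<Bitmap>()    // progress 1, progress 2, ...

    // Second operation: save the current state as the next progress.
    fun save(snapshot: Bitmap): Int {
        saved += snapshot
        return saved.size                          // 1-based progress number
    }

    // Fifth operation: restore the image corresponding to an earlier progress.
    fun restore(progress: Int): Bitmap {
        require(progress in 1..saved.size) { "unknown progress $progress" }
        // Policy assumed here: drop later snapshots so drawing resumes from this point.
        while (saved.size > progress) saved.removeAt(saved.size - 1)
        return saved[progress - 1]
    }
}
```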
  • In a possible implementation, the first operation includes: an operation of starting a preset working mode from the notification bar of the display screen. It should be noted that the preset working mode is referred to as the "dual-screen linkage mode" in this application.
  • That is, the user can activate the preset working mode in a variety of ways, so that the display screen is divided into two display areas.
  • For example, the dual-screen linkage mode can be enabled through the status bar, through a voice command, or in the "Settings" interface; the display screen is then divided into two display areas, the first area displays the first interface, and the second area displays the second interface. It should be noted that after the mode is turned on, one area of the display screen displays the operation instructions of the mode, and the other area can display the main interface of the electronic device or the interface that was displayed before the mode was turned on.
  • For example, the user can open the status bar by sliding down from the top of the display screen and use the switch in the status bar to activate the dual-screen linkage mode.
  • The user can also wake up the voice assistant by voice, for example with "Xiaoyi Xiaoyi", and then say "please help me turn on the dual-screen linkage mode"; the electronic device can then respond to the voice command by turning on the dual-screen linkage mode.
  • voice such as "Xiaoyixiaoyi” wake up the voice assistant
  • the electronic device can respond to the voice command To turn on the dual-screen linkage mode.
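  • As one hedged illustration of the status-bar switch, the mode could be exposed as an Android Quick Settings tile. The sketch below uses the real TileService API, but LinkageMode is a hypothetical placeholder standing in for whatever actually starts and stops the mode.

```kotlin
// Sketch: a Quick Settings tile that toggles the "dual-screen linkage mode".
import android.content.Context
import android.service.quicksettings.Tile
import android.service.quicksettings.TileService

// Hypothetical helper that would actually start/stop the mode (placeholder logic only).
object LinkageMode {
    private var enabled = false
    fun isEnabled(context: Context) = enabled
    fun setEnabled(context: Context, value: Boolean) { enabled = value }
}

class LinkageModeTileService : TileService() {

    override fun onStartListening() {
        qsTile.state =
            if (LinkageMode.isEnabled(applicationContext)) Tile.STATE_ACTIVE
            else Tile.STATE_INACTIVE
        qsTile.updateTile()
    }

    override fun onClick() {
        val enable = qsTile.state != Tile.STATE_ACTIVE
        LinkageMode.setEnabled(applicationContext, enable)
        qsTile.state = if (enable) Tile.STATE_ACTIVE else Tile.STATE_INACTIVE
        qsTile.updateTile()
    }
}
```

  • The tile service would additionally need to be declared in the application manifest with the BIND_QUICK_SETTINGS_TILE permission; that plumbing is omitted here.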
  • In a possible implementation, the second operation includes: a sliding operation from top to bottom in the first area, or a sliding operation from bottom to top in the first area.
  • The first area may be located on the left side of the second area, or on the right side of the second area.
  • For example, the user can slide from top to bottom, or from bottom to top, along the right edge of the display screen to save the content of the first interface.
  • When the content of the first interface is saved, all of the content on the first interface may be saved, or only part of it.
  • the third operation includes: a tap operation and a gesture expansion operation.
  • That is, the user can display the thumbnail shown on the second interface in full screen on the second interface through a tap operation or a gesture expansion operation.
  • The gesture operation for displaying the thumbnail in full screen on the second interface is not limited to the above examples; any gesture operation that achieves the full-screen display may be used, and this is not limited in this application.
  • the fifth operation includes: a sliding operation from the second area to the first area.
  • For example, the user can slide the thumbnail on the second interface from left to right.
  • The electronic device can then restore the image corresponding to the thumbnail to the first interface.
  • In a possible implementation, the second interface is also used to display the operation instructions of the preset working mode, and the operation instructions include at least one of the function descriptions corresponding to the first operation, the second operation, the third operation, the fourth operation, and the fifth operation.
  • the second interface on the display screen may display the operation instructions of the preset working mode.
  • the first operation can be understood as turning on the preset working mode.
  • the operation instructions of the preset working mode may include: how to turn on the mode, how to turn off the mode, how to take a screenshot in this mode, how to save the progress, how to delete thumbnails, and so on. The user can perform the corresponding operation according to the mode operation instructions.
  • the second operation may be used to save the first image corresponding to the first interface or to save the first progress corresponding to the first image.
  • For example, the function descriptions corresponding to the second operation can be: slide down to take a screenshot, and slide up to save the progress.
  • the screenshot can be understood as saving the first image corresponding to the first interface.
  • In a possible implementation, when the display screen is a folding screen, the first interface of the first application is displayed on the main screen of the folding screen, and the second interface is displayed on the secondary screen of the folding screen.
  • That is, when the display screen is a folding screen, it can be divided into a main screen and a secondary screen; the first interface of the first application can then be displayed on the main screen, and the second interface on the secondary screen.
  • The first type: the first interface is displayed on the main screen of the folding screen, and the second interface and the operation instructions of the preset working mode are displayed on the secondary screen of the folding screen.
  • The second type: the first interface and the operation instructions of the preset working mode are displayed on the main screen of the folding screen, and the second interface is displayed on the secondary screen of the folding screen.
  • The third type: the first interface is displayed on the main screen of the folding screen, and the operation instructions of the preset working mode are displayed on the secondary screen of the folding screen.
  • The fourth type: the operation instructions of the preset working mode are displayed on the main screen of the folding screen, and the second interface is displayed on the secondary screen of the folding screen.
  • the display mode is not limited to the above four cases, and this mode can also be applied on the primary and secondary screens at the same time, which is not limited in this application.
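  • Purely as an illustration, the four display options above can be written down as a small Kotlin enumeration that maps each type to what the main screen and the secondary screen host. The names are hypothetical; the patent does not prescribe any API.

```kotlin
// Illustrative enumeration of the four folding-screen display options listed above.
enum class Panel { FIRST_INTERFACE, SECOND_INTERFACE, MODE_INSTRUCTIONS }

enum class FoldLayout(val mainScreen: List<Panel>, val secondaryScreen: List<Panel>) {
    TYPE_1(listOf(Panel.FIRST_INTERFACE), listOf(Panel.SECOND_INTERFACE, Panel.MODE_INSTRUCTIONS)),
    TYPE_2(listOf(Panel.FIRST_INTERFACE, Panel.MODE_INSTRUCTIONS), listOf(Panel.SECOND_INTERFACE)),
    TYPE_3(listOf(Panel.FIRST_INTERFACE), listOf(Panel.MODE_INSTRUCTIONS)),
    TYPE_4(listOf(Panel.MODE_INSTRUCTIONS), listOf(Panel.SECOND_INTERFACE)),
}
```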
  • In a second aspect, the present application also provides an electronic device, including a display screen; one or more processors; one or more memories; one or more sensors; multiple applications; and one or more computer programs, where the one or more computer programs are stored in the one or more memories and include instructions which, when called and executed by the one or more processors,
  • enable the electronic device to implement the technical solution of the first aspect and any possible design of the first aspect.
  • In a third aspect, this application also provides an electronic device that includes modules/units for executing the method of the first aspect or any possible design of the first aspect; these modules/units can be implemented by hardware, or by hardware executing corresponding software.
  • In a fourth aspect, an embodiment of the present application also provides a chip that is coupled with a memory in an electronic device and executes the technical solution of the first aspect and any possible design of the first aspect; in the embodiments of the present application, "coupled" means that two components are directly or indirectly combined with each other.
  • In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium that includes a computer program.
  • When the computer program runs on an electronic device, the electronic device executes the technical solution of the first aspect and any possible design of the first aspect.
  • In a sixth aspect, an embodiment of the present application provides a computer program product that, when run on an electronic device, causes the electronic device to perform the technical solution of the first aspect and any possible design of the first aspect.
  • FIG. 1 is a hardware structure diagram of a mobile phone 100 provided by an embodiment of this application;
  • FIG. 2 is a schematic diagram of a folding screen provided by an embodiment of this application;
  • FIG. 3A is a schematic diagram of a software architecture provided by an embodiment of this application;
  • FIG. 3B is a schematic diagram of the data flow of an Android operating system provided by an embodiment of this application;
  • FIG. 4 is a schematic flowchart of a method provided by an embodiment of this application;
  • FIG. 5A is a schematic diagram of a dual-screen linkage mode setting interface provided by an embodiment of this application;
  • FIG. 5B is a schematic diagram of a group of interfaces of the mobile phone 100 provided by an embodiment of this application;
  • FIG. 5C to FIG. 5G are schematic diagrams of interfaces of the mobile phone 100 provided by an embodiment of this application;
  • FIG. 6A to FIG. 6B are schematic diagrams of interfaces of the mobile phone 100 provided by an embodiment of this application;
  • FIG. 7A to FIG. 7C are schematic diagrams of interfaces of the mobile phone 100 provided by an embodiment of this application;
  • FIG. 8 is a schematic diagram of another electronic device provided by an embodiment of this application.
  • the application (application, App) involved in the embodiments of the present application is a software program that can implement one or more specific functions.
  • multiple applications can be installed in an electronic device.
  • the applications mentioned in the following may be applications that have been installed when the electronic device is shipped from the factory, or applications that the user downloads from the network or obtained from other electronic devices during the use of the electronic device.
  • The method provided in the embodiments of the present application can be applied to electronic devices that have a display screen and are capable of dividing the display screen into two areas for separate display, such as mobile phones, tablet computers, wearable devices (for example, watches, bracelets, smart helmets, etc.), in-vehicle devices, smart home devices, augmented reality (AR)/virtual reality (VR) devices, laptops, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), etc., which is not limited in the embodiments of this application.
  • the electronic device involved in the embodiment of the application may also be a foldable electronic device, such as a foldable mobile phone, a foldable tablet computer, etc., which is not limited in this application.
  • The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the processor 110 may include one or more processing units.
  • For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the mobile phone 100. The controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching instructions and executing instructions.
  • a memory may also be provided in the processor 110 to store instructions and data. In some embodiments, the memory in the processor 110 is a cache memory.
  • the memory can store instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be directly called from the memory. Repeated accesses are avoided, the waiting time of the processor 110 is reduced, and the efficiency of the system is improved.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the mobile phone 100, and can also be used to transfer data between the mobile phone 100 and peripheral devices.
  • the charging management module 140 is used to receive charging input from the charger.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
  • the wireless communication function of the mobile phone 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the mobile phone 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied on the mobile phone 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the wireless communication module 160 can provide applications on the mobile phone 100 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), bluetooth (BT), and global navigation satellite systems. (global navigation satellite system, GNSS), frequency modulation (FM), near field communication (NFC), infrared technology (infrared, IR) and other wireless communication solutions.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 may also receive a signal to be sent from the processor 110, perform frequency modulation, amplify, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the mobile phone 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile phone 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), Beidou navigation satellite system (BDS), quasi-zenith satellite system (quasi -zenith satellite system, QZSS) and/or satellite-based augmentation systems (SBAS).
  • the display screen 194 is used to display the display interface of the application and the like.
  • the display screen 194 includes a display panel.
  • The display panel can adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the mobile phone 100 may include one or N display screens 194, and N is a positive integer greater than one.
  • the display screen 194 When the display screen 194 is a flexible screen, the user can fold the display screen 194.
  • the display screen 194 can be divided into two display areas, namely, display area 1 and display area 2.
  • a certain angle ⁇ exists between the two display areas divided by the display screen 194.
  • the display area 1 may be referred to as the main screen of the mobile phone 100
  • the display area 2 may be referred to as the secondary screen of the mobile phone 100.
  • the display area of the main screen and the secondary screen can be the same or different.
  • the user can fold the screen along one or more fold lines in the display screen 194.
  • the position of the folding line may be preset, or may be arbitrarily selected by the user on the display screen 194.
  • the mobile phone 100 when the included angle ⁇ between the main screen and the secondary screen is greater than a set threshold (for example, 170°), the mobile phone 100 can determine that the display screen 194 is in an unfolded state. When the angle ⁇ between the main screen and the secondary screen is less than the set threshold, the mobile phone 100 can determine that the display screen 194 is in a folded state.
  • the physical form of the display screen 194 can be divided into an expanded state and a non-expanded state.
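  • As a small illustration of the fold-state decision described above, the classification can be expressed as a comparison of the hinge angle against the threshold. The sketch below is an assumption-level Kotlin example; how the angle itself is obtained from the sensors is omitted.

```kotlin
// Classify the folding screen's physical form from the angle between the main
// screen and the secondary screen, using the example 170-degree threshold above.
enum class ScreenForm { UNFOLDED, FOLDED }

const val UNFOLD_THRESHOLD_DEG = 170f

fun classifyScreenForm(hingeAngleDeg: Float): ScreenForm =
    if (hingeAngleDeg > UNFOLD_THRESHOLD_DEG) ScreenForm.UNFOLDED else ScreenForm.FOLDED
```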
  • the camera 193 is used to capture still images or videos.
  • the camera 193 may include a front camera and a rear camera.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone 100 by running instructions stored in the internal memory 121.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, and software codes of at least one application program (for example, an iQiyi application, a WeChat application, etc.).
  • the data storage area can store data (such as images, videos, etc.) generated during the use of the mobile phone 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the mobile phone 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save pictures, videos and other files in an external memory card.
  • the mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the touch sensor 180A is also called “touch panel”.
  • The touch sensor 180A may be disposed on the display screen 194; the touch sensor 180A and the display screen 194 together form what is also called a "touch screen".
  • the touch sensor 180A is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180A may also be disposed on the surface of the mobile phone 100, which is different from the position of the display screen 194.
  • The sensors in the mobile phone 100 may also include a pressure sensor 180B, a gyroscope sensor 180C, an air pressure sensor 180D, a magnetic sensor 180E, an acceleration sensor 180F, a distance sensor 180G, a proximity light sensor 180H, a temperature sensor 180J, a bone conduction sensor 180L, and so on.
  • the mobile phone 100 may also include a button 190 (such as a power-on button, a volume button, etc.), a motor 191, an indicator 192, a SIM card interface 195, and the like.
  • the components shown in Figure 1 do not constitute a specific limitation on the mobile phone.
  • The mobile phone may include more or fewer components than those shown in the figure, may combine some components, may split some components, or may have a different arrangement of components.
  • the mobile phone 100 shown in FIG. 1 is taken as an example for introduction.
  • the above-mentioned software system of the mobile phone 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture.
  • the embodiment of the present application takes an Android system with a layered architecture as an example to illustrate the software structure of the mobile phone 100 by way of example.
  • FIG. 3A is a block diagram of the software structure of the mobile phone 100 according to an embodiment of the present application.
  • The layered architecture divides the software into several layers, each with a clear role and division of labor; the layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of application packages. As shown in Figure 3A, applications such as gallery, calendar, call, map, navigation, System UI, activity in the first area, and activity in the second area can be installed in the application layer.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a display policy service, a screenshot service, and a display management service (display manager service, DMS).
  • the application framework layer can also include activity manager, window management service (WMS), content provider, view system, phone manager, resource manager, notification manager, etc.
  • the window manager provides a window manager service for the window to control and manage the interface displayed on the display screen.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include videos, images, audios, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, and so on.
  • the view system can be used to build the interface of the application.
  • the phone manager is used to provide the communication function of the electronic device. For example, the management of the call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • Android runtime includes core libraries and virtual machines.
  • Android runtime is the runtime environment of the Android operating system, responsible for the scheduling and management of the Android operating system.
  • The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of the Android operating system.
  • the application layer and application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: state detection module, gesture recognition module, screenshot service module, media library, surface manager, image processing library, etc.
  • the state detection module is used to identify the physical form of the display screen of the electronic device.
  • the state detection module can be used to determine the physical form of the display screen according to the sensor data uploaded by various sensors in the hardware layer.
  • the gesture recognition module is used to recognize the user's gesture operation on the display screen of the electronic device.
  • gesture operations include user touch operations on the display screen, such as sliding operations, clicking operations, and pressing operations.
  • the media library supports the playback and recording of audio and video in multiple formats, and supports the opening of static images in multiple formats.
  • the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
  • the graphics processing library is used to implement three-dimensional graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between the hardware and software of an electronic device.
  • the kernel layer contains at least display drivers, sensor drivers, camera drivers, audio drivers, etc., which are used to drive the hardware in the hardware layer.
  • the hardware layer may include various types of sensors (such as touch sensors, etc.), display screens, cameras, and so on.
  • System UI can monitor the gesture recognition module. After a sensor detects data, it reports the detected data to the sensor driver, and the sensor driver reports the data to the gesture recognition module. After the gesture recognition module recognizes the gesture operation, System UI can call the corresponding service according to the recognition result. For example, when the gesture recognition module recognizes that the gesture operation is a screenshot operation on the second area Activity, System UI can call the screenshot service to take a screenshot of the second area Activity and then send the resulting screenshot to the first area Activity.
  • It should be noted that, in this application, the first area Activity represents the display interface of the first area on the display screen 194, the second area Activity represents the display interface of the second area on the display screen 194, and System UI represents a system-level application whose functions include status bar information display, the task bar display panel, the notification panel, and so on.
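  • The data flow just described can be sketched as follows. This is a schematic Kotlin illustration only: the interfaces below are stand-ins for the modules named in FIG. 3A and FIG. 3B, not actual Android framework classes.

```kotlin
// Sketch of the reported data flow: touch events reach a gesture-recognition
// module, System UI maps the recognised gesture to a service, and a screenshot
// of the second area is delivered to the first area's Activity.
import android.graphics.Bitmap
import android.view.MotionEvent

enum class Gesture { SCREENSHOT_SECOND_AREA, SAVE_PROGRESS, NONE }

interface GestureRecognitionModule { fun recognise(event: MotionEvent): Gesture }
interface ScreenshotService { fun captureSecondArea(): Bitmap }
interface FirstAreaActivity { fun showThumbnail(image: Bitmap) }

class SystemUiDispatcher(
    private val recogniser: GestureRecognitionModule,
    private val screenshotService: ScreenshotService,
    private val firstArea: FirstAreaActivity
) {
    // Called with touch data reported up from the sensor driver.
    fun onTouchReported(event: MotionEvent) {
        when (recogniser.recognise(event)) {
            Gesture.SCREENSHOT_SECOND_AREA ->
                firstArea.showThumbnail(screenshotService.captureSecondArea())
            Gesture.SAVE_PROGRESS -> { /* save key information of the second area */ }
            Gesture.NONE -> Unit
        }
    }
}
```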
  • "At least one" as used in the following embodiments means one or more, and "multiple" means two or more.
  • Words such as "first" and "second" are used only for the purpose of distinguishing the description.
  • the method may include the following steps:
  • Step 401 Set the dual-screen linkage mode.
  • the mobile phone 100 sets a new working mode.
  • the display screen of the mobile phone 100 can be divided into two areas, that is, two windows can be displayed on the display screen at the same time.
  • the working mode is denoted as the "dual-screen linkage mode" below, and the working mode may also have other names, which is not limited in this application.
  • After the dual-screen linkage mode is activated, the display screen of the mobile phone 100 is divided into two areas. In the initial state after the mode is activated, one area displays the operating instructions of the dual-screen linkage mode (denoted as the first interface), and the other area displays the second interface of the mobile phone 100.
  • The second interface may be an operation interface of an application program or the main interface of the mobile phone 100.
  • Refer to FIG. 5A, which is a schematic diagram of a dual-screen linkage mode setting interface provided by this embodiment of the application.
  • The user can open the "Settings" application on the mobile phone, find the "System and Update" interface 500, and then tap "Dual-screen linkage mode" 501 on this interface to enter the "Dual-screen linkage mode" interface 510, which can display a schematic diagram of the interface in dual-screen linkage mode; after the user taps "Apply" 502, the mobile phone 100 can enter the dual-screen linkage mode.
  • The "Apply" control 502 is used to enable the dual-screen linkage mode.
  • the sizes of the two regions divided by the display screen may be the same or different, which is not limited in the present application.
  • In another possible implementation, the switch for the dual-screen linkage mode can also be placed in the notification bar on the display screen.
  • The user can open the notification bar through a gesture operation (such as sliding down from the top of the screen) and then directly tap the switch in the notification bar.
  • The switch is used to start the dual-screen linkage mode; alternatively, the user can start the dual-screen linkage mode through a certain gesture operation, etc., which is not limited in this application.
  • Step 402 The first area of the display screen displays the mode operation instructions, and the second area displays the mobile phone interface.
  • For ease of description, the two areas of the display screen in the dual-screen linkage mode are denoted as the "first area" and the "second area" respectively. It should be understood that the terms first area and second area are only used to distinguish two areas on the display screen; the positional relationship between them is not limited in this application.
  • For example, the first area and the second area may be arranged side by side left and right (as shown in the lower left figure of FIG. 5B), or arranged one above the other, etc., which is not limited in this application.
  • the display screen may be divided into a first area and a second area.
  • the first area can display the operation instructions of this mode, that is, the first interface
  • the second interface of the mobile phone 100 can be displayed on the second area.
  • the operating instructions of the dual-screen linkage mode include but are not limited to the following: how to take a screenshot in this mode, how to delete the screenshot, how to save the task progress, how to exit, etc.
  • For example, the lower left figure of FIG. 5B shows: 1. Swipe down to take a screenshot; 2. Tap the "x" displayed in the full screen to exit the full-screen display; 3. Tap the "x" of a thumbnail to delete the thumbnail; 4. Slide up to save the task progress.
  • the first area may also display a thumbnail, and the content of the thumbnail is a screenshot of the second interface or key information in the second interface.
  • the key information in the second interface may be the user's operation step information in the second interface, for example, a pattern formed by each step of the user's operation when drawing a picture.
  • the screenshot of the second interface when the screenshot of the second interface is displayed in the first area, the screenshot may include information in the status bar of the mobile phone 100, such as time information, power information, and so on.
  • the key information in the second interface is displayed in the first area, the information in the status bar of the mobile phone 100 is not displayed in the thumbnail.
  • the second interface may display the main interface of the mobile phone 100 by default, for example, as shown in FIG. 5B. That is, no matter whether the interface of the mobile phone 100 displays the main interface or the interface of an application program before the mode is turned on, when the mode is turned on, the second area can directly display the main interface of the mobile phone 100.
  • the second interface may be the interface that the user was displaying before turning on the mode.
  • the interface of the mobile phone 100 is as shown in the interface 520 of the gallery application in FIG.
  • The initial interface displayed by the mobile phone 100 (the interface before the switch is turned on) can be displayed in the layer below the notification bar.
  • The drawings in this application are only an illustration; the user can also open the notification bar through voice control, or through a mid-air gesture operation (for example, a gesture operation at the top of the screen).
  • Similarly, the dual-screen linkage mode can also be activated through voice control or through a mid-air gesture operation, which is not limited in this application.
  • When the mobile phone 100 is a folding-screen phone, taking as an example a folding-screen phone whose display screen includes a main screen and a secondary screen, with the main screen located on the right side of the secondary screen, the following situations may be included:
  • The first type: after the dual-screen linkage mode is turned on, the mode can be applied to the secondary screen by default, and the main screen continues to display its original interface (that is, the interface that the main screen was displaying before the dual-screen linkage mode was turned on). In other words, after the dual-screen linkage mode is turned on, the secondary screen can be divided into a first area and a second area, for example, as shown in FIG. 5D.
  • The second type: after the dual-screen linkage mode is turned on, the mode can be applied to the main screen by default, and the secondary screen continues to display its original interface (that is, the interface that the secondary screen was displaying before the dual-screen linkage mode was turned on). In other words, after the dual-screen linkage mode is turned on, the main screen can be divided into a first area and a second area, for example, as shown in FIG. 5E.
  • The third type: the secondary screen can instead display the operation instructions of the mode, and the main screen continues to display its original interface (that is, the interface that the main screen was displaying before the dual-screen linkage mode was turned on), for example, as shown in FIG. 5F.
  • The fourth type: the main screen can instead display the operation instructions of the mode, and the secondary screen continues to display its original interface (that is, the interface that the secondary screen was displaying before the dual-screen linkage mode was turned on), for example, as shown in FIG. 5G.
  • this mode can also be applied on the primary and secondary screens by default, that is, the primary screen and the secondary screen are divided into two areas at the same time.
  • This is not limited in this application.
  • Step 403 Detect the user's gesture operation in the second area.
  • the sensor in the hardware layer of the mobile phone 100 can report the detected data to the hardware driver of the driver layer, and then the hardware driver of the driver layer reports the data detected by the sensor to the gesture recognition module.
  • the touch sensor can detect the event corresponding to the user's sliding operation in the second area, and report the event corresponding to the sliding operation to the sensor driver of the driver layer, and the sensor driver reports the sliding operation detected by the touch sensor to the gesture Identification module.
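  • A minimal sketch of how the gesture-classification step might look is given below, distinguishing the two right-edge swipes used in the following steps 404 and 405. The edge width and minimum swipe distance are assumptions chosen for illustration; the real module sits in the system library as shown in FIG. 3A.

```kotlin
// Classify right-edge swipes: a downward slide (take a screenshot, step 404)
// versus an upward slide (save the task progress, step 405).
import android.view.MotionEvent
import android.view.View

enum class EdgeSwipe { DOWN, UP, NONE }

class RightEdgeSwipeDetector(
    private val edgeWidthPx: Float = 48f,      // how close to the right edge counts as "edge"
    private val minDistancePx: Float = 200f    // minimum travel to count as a swipe
) {
    private var downX = 0f
    private var downY = 0f

    fun onTouch(area: View, event: MotionEvent): EdgeSwipe {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> { downX = event.x; downY = event.y }
            MotionEvent.ACTION_UP -> {
                val nearRightEdge = downX > area.width - edgeWidthPx
                val dy = event.y - downY
                if (nearRightEdge && dy > minDistancePx) return EdgeSwipe.DOWN
                if (nearRightEdge && dy < -minDistancePx) return EdgeSwipe.UP
            }
        }
        return EdgeSwipe.NONE
    }
}
```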
  • Step 404 If the user's gesture operation in the second area is a sliding operation from top to bottom, a screenshot of the content of the second area is taken, and the screenshot is displayed in the first area.
  • Refer to FIG. 6A, which shows a set of schematic interface diagrams provided in this embodiment of the application.
  • The display screen of the mobile phone 100 can display the first interface and the second interface, for example, as shown in (a) in FIG. 6A.
  • the operation instructions in the dual-screen linkage mode are displayed on the first interface, and the main interface of the mobile phone 100 is displayed on the second interface.
  • the operation description for the dual-screen linkage mode shown in FIG. 6A is not limited to the description listed in the above schematic diagram, and this application does not limit this.
  • the photo 1 can be displayed in full screen on the second interface, for example, as shown in (c) of FIG. 6A. If the user swipes from top to bottom on the right edge of the second interface, as shown in (d) in FIG. 6A, the touch sensor of the mobile phone 100 can detect the gesture operation from top to bottom along the right edge of the display screen, The gesture recognition module recognizes that the gesture operation is a sliding operation on the second interface, and the sliding operation is a screenshot operation.
  • When the System UI, which listens to the gesture recognition module, detects that the recognized gesture is a screenshot operation on the second interface, it can obtain the interface currently displayed in the second area and call the screenshot service of the application framework layer to take a screenshot of that interface (that is, the second interface), obtaining thumbnail 1 of the second interface (hereinafter "screenshot 1"). The System UI can then send screenshot 1 to the Activity of the first area; see, for example, the first interface shown in (d) in FIG. 6A.
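  • As a rough illustration of the capture-and-hand-off step described above, the following Java sketch renders the second area's view into an off-screen bitmap and delivers a scaled thumbnail to whatever hosts the first area. The ThumbnailSink interface and the method names are assumptions of this sketch; a real System UI implementation would go through the framework's screenshot service rather than drawing a view hierarchy by hand.

import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.view.View;

public final class AreaScreenshotHelper {

    /** Callback implemented by whatever hosts the first area (for example, its Activity). */
    public interface ThumbnailSink {
        void onThumbnailReady(Bitmap thumbnail);
    }

    /** Render the second area's root view into an off-screen bitmap. */
    public static Bitmap captureArea(View secondAreaRoot) {
        Bitmap bitmap = Bitmap.createBitmap(
                secondAreaRoot.getWidth(), secondAreaRoot.getHeight(), Bitmap.Config.ARGB_8888);
        secondAreaRoot.draw(new Canvas(bitmap));   // draw the view hierarchy into the bitmap
        return bitmap;
    }

    /** Capture the second area, downscale it to a thumbnail, and deliver it to the first area. */
    public static void captureAndDeliver(View secondAreaRoot, ThumbnailSink firstArea, int thumbWidth) {
        Bitmap full = captureArea(secondAreaRoot);
        int thumbHeight = Math.round(full.getHeight() * (thumbWidth / (float) full.getWidth()));
        Bitmap thumbnail = Bitmap.createScaledBitmap(full, thumbWidth, thumbHeight, true);
        firstArea.onThumbnailReady(thumbnail);
    }
}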
  • Next, the user can continue to swipe through photos on the second interface. For example, as shown in (e) in FIG. 6B, the user slides the page from right to left on the second interface, for example to the next picture (hereinafter "photo 2"), as shown in (f) in FIG. 6B.
  • If the user wants to compare photo 2 with photo 1, the user can tap screenshot 1 on the first interface, as shown in (g) in FIG. 6B; screenshot 1 is then displayed in full screen on the first interface while photo 2 is displayed on the second interface, as shown in (h) in FIG. 6B.
  • In the first interfaces of (d) in FIG. 6A and (e) to (h) in FIG. 6B, a button "x" is shown in the upper right corner of the displayed screenshot.
  • The user can tap the button "x" to exit full-screen display or to delete the screenshot.
  • For example, in the schematic diagram shown in (d) in FIG. 6A, tapping "x" deletes the screenshot. If the user taps screenshot 1 displayed in full screen on the first interface in (h) in FIG. 6B, screenshot 1 exits full-screen display.
  • It can be understood that during full-screen display, the screenshot can also be zoomed in or out through gesture operations.
  • Tags can be added to the screenshots displayed on the first interface to distinguish the individual screenshots, for example a number tag such as 1, 2, and so on; a time tag, or a tag in another form and at another position, is also possible.
  • The tag added to a screenshot may or may not be displayed, which is not limited in this application.
  • In this embodiment of the application, the thumbnail can be displayed directly in the first area, so that the content of the first area and the content of the second area can be compared, and the user does not need to open two applications, which simplifies the operation steps.
  • Step 405: If the user's gesture operation in the second area is a bottom-up sliding operation, the key information of the second area is saved and displayed in the first area. It should be noted that the key information of the second area can be understood as the patterns drawn by the user in the following embodiments.
  • FIG. 7A shows a set of interface schematic diagrams provided in this embodiment of the application.
  • the first interface and the second interface can be displayed on the display screen of the mobile phone 100, as shown in (a) of FIG. 7A, for example.
  • The first interface displays the operating instructions of the dual-screen linkage mode, and the second interface displays the interface that was shown before the mode was started, such as the interface of a drawing application.
  • Suppose the user draws the pattern shown in (b) in FIG. 7A (hereinafter "pattern 1"). If the user then wants to save the current pattern 1, the user can perform the gesture described in the operating instructions of the dual-screen linkage mode, for example the sliding operation from bottom to top along the right edge of the display screen shown in (b) of FIG. 7A.
  • After the sensor in the hardware layer of the mobile phone 100 detects this sliding operation, the gesture recognition module recognizes that the bottom-to-top sliding operation along the right edge of the display screen is a gesture operation for saving the task progress.
  • After the System UI, which listens to the gesture recognition module, detects the gesture operation, it can obtain the current task progress in the second area and call the task management service of the framework layer; through the interface for saving the Activity state in the task management service, the obtained current task progress in the second area (that is, the progress of pattern 1 on the second interface) is saved (for example, the saved progress of pattern 1 can be recorded as progress 1).
  • The System UI can then send progress 1 to the Activity of the first area, for example as shown in (c) in FIG. 7A.
  • progress saving can be understood as saving the user's operation steps on the interface.
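  • As an informal sketch of what "saving the user's operation steps" could look like in code, the following Java example keeps each saved progress as an immutable snapshot of the steps performed so far. The class and field names (ProgressStore, Progress, steps) are illustrative assumptions; the patent itself saves the Activity state through the framework's task management service.

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class ProgressStore {

    /** One saved progress = an immutable snapshot of the operation steps performed so far. */
    public static final class Progress {
        public final int id;
        public final List<String> steps;   // e.g. stroke descriptions in a drawing application

        Progress(int id, List<String> steps) {
            this.id = id;
            this.steps = Collections.unmodifiableList(new ArrayList<>(steps));
        }
    }

    private final List<Progress> saved = new ArrayList<>();

    /** Snapshot the current steps (for example, on a bottom-to-top edge swipe). */
    public Progress save(List<String> currentSteps) {
        Progress progress = new Progress(saved.size() + 1, currentSteps);
        saved.add(progress);
        return progress;   // the caller can now show this snapshot as a thumbnail in the first area
    }

    public static void main(String[] args) {
        ProgressStore store = new ProgressStore();
        List<String> steps = new ArrayList<>(List.of("stroke-1", "stroke-2"));
        Progress progress1 = store.save(steps);    // pattern 1 -> progress 1
        steps.add("stroke-3");
        Progress progress2 = store.save(steps);    // pattern 2 -> progress 2
        System.out.println(progress1.steps + " / " + progress2.steps);
    }
}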
  • Next, the user can continue to perform the next operation on the second interface. Suppose the user draws the next stroke on the second interface to form an updated pattern, for example pattern 2 on the second interface shown in (d) in FIG. 7A.
  • If the user then wants to save the current progress of pattern 2 (for example, recorded as progress 2), the user can slide from bottom to top along the right edge of the display screen, as shown in (e) in FIG. 7B, and the gesture recognition module can then recognize that the bottom-to-top sliding operation along the right edge of the display screen is a gesture operation for saving the task progress.
  • After the System UI detects the gesture operation, it can obtain the current task progress in the second area (that is, progress 2) and call the task management service of the framework layer; through the interface for saving the Activity state in the task management service, the obtained current task progress in the second area (that is, progress 2) is saved. The System UI can then send progress 2 to the Activity of the first area; see, for example, the first interface shown in (f) in FIG. 7B. The user can continue to perform the next operation on the second interface; the interface after that operation is shown, for example, in (g) in FIG. 7B.
  • Step 4051: If the user's gesture operation in the first area is a left-to-right sliding operation, the content of the first area is restored and displayed in the second area.
  • Method A: referring to (h1) in FIG. 7B, the user can tap progress 1 on the first interface and slide progress 1 from left to right. After the sensor of the mobile phone 100 detects the user's left-to-right sliding operation on the first interface, the gesture recognition module recognizes the left-to-right sliding operation as an operation for restoring the task progress. When the System UI detects the left-to-right sliding operation, it can call the task management service of the framework layer and restore progress 1 to the second interface through the interface for restoring the Activity state, that is, the operation interface of progress 1 is displayed on the second interface, for example as shown in (k) in FIG. 7C below.
  • Method B: the user can tap progress 1 on the first interface, see (h2) in FIG. 7C, after which progress 1 is displayed in full screen on the first interface, see (i) in FIG. 7C. The user then slides progress 1 from left to right on the first interface shown in (i) in FIG. 7C. After the sensor of the mobile phone 100 detects the user's left-to-right sliding operation on the first interface, the gesture recognition module recognizes the left-to-right sliding operation as an operation for restoring the task progress. When the System UI detects the left-to-right sliding operation, it can call the task management service of the framework layer and restore progress 1 to the second interface through the interface for restoring the Activity state, that is, the operation interface of progress 1 is displayed again on the second interface, for example as shown in (k) in FIG. 7C.
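  • Continuing the informal sketch above, restoring a saved progress can be modeled as replacing the second area's current state with the saved snapshot while leaving the snapshot itself untouched, so that it can be restored again later (matching the copy behavior noted below). All names are again illustrative assumptions rather than the patent's implementation, which restores the Activity state through the task management service.

import java.util.ArrayList;
import java.util.List;

public class ProgressRestorer {

    /** Mutable state of the second area, for example the strokes currently drawn. */
    private final List<String> currentSteps = new ArrayList<>();

    /** Replace the second area's current state with a saved snapshot. */
    public void restore(List<String> savedSnapshot) {
        currentSteps.clear();
        currentSteps.addAll(savedSnapshot);    // the snapshot list itself is left untouched
        redrawSecondArea();
    }

    private void redrawSecondArea() {
        // In a real application this would invalidate the drawing view; here we just print the state.
        System.out.println("second area now shows: " + currentSteps);
    }

    public static void main(String[] args) {
        ProgressRestorer secondArea = new ProgressRestorer();
        List<String> progress1 = List.of("stroke-1", "stroke-2");   // saved earlier as progress 1
        secondArea.restore(progress1);    // a left-to-right swipe on progress 1 restores it
        // Drawing can continue from here; progress1 is still available and can be restored again.
    }
}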
  • It should be noted that in this embodiment, if the user restores a previous progress, the previous progress can still be kept and displayed on the first interface after the restoration (for example, it can be saved in the form of a copy).
  • This also facilitates the user's subsequent operations. For example, after the user restores a previous progress (such as progress 1) and continues operating, if at a certain step the user wants to restart from progress 1 again, progress 1 can be restored to the second interface once more.
  • With the above method, the progress at different points of the user's operation can be compared, and previous steps can be restored at any time, which improves the user experience.
  • The method provided in the embodiments of this application has been described above from the perspective of the electronic device acting as the execution subject. To implement the functions in the method provided in the embodiments of this application, the terminal device may include a hardware structure and/or a software module, and implement the above functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether a certain function among the above functions is executed by a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application and the design constraints of the technical solution.
  • The electronic device 800 includes: a display screen 801; one or more processors 802; one or more memories 803; one or more sensors 804; a plurality of applications 805 (not shown in the figure); and one or more computer programs 806 (not shown in the figure). The foregoing components may be connected through one or more communication buses 807.
  • The display screen 801 is used to display the main interface, the display interface of an application in the electronic device, or the operating instructions of the dual-screen linkage mode.
  • One or more computer programs are stored in the memory 803, and the one or more computer programs include instructions; the processor 802 calls the instructions stored in the memory 803, so that the electronic device 800 can perform the following steps:
  • The display screen 801 displays the first interface of the first application; a first operation of the user is detected; in response to the first operation, the first interface is displayed in a first area of the display screen 801, and a second interface is displayed in a second area of the display screen 801; a second operation of the user is detected; in response to the second operation, a first image corresponding to the first interface is saved, and a thumbnail of the first image is displayed on the second interface.
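  • Purely for illustration, the following Java sketch ties the steps listed above into a single flow. The Screen interface, the string placeholders, and the method names are assumptions of this sketch and are not the device's actual implementation.

public class DualScreenLinkageController {

    /** Abstraction of the display screen 801 for the purpose of this sketch. */
    public interface Screen {
        void showFullScreen(String content);
        void showSplit(String firstAreaContent, String secondAreaContent);
    }

    private final Screen screen;
    private String firstInterface;
    private String savedFirstImage;

    public DualScreenLinkageController(Screen screen) {
        this.screen = screen;
    }

    /** Initially, the first application's first interface fills the display. */
    public void showFirstInterface(String name) {
        firstInterface = name;
        screen.showFullScreen(name);
    }

    /** First operation (for example, a toggle in the notification bar): split into two areas. */
    public void onFirstOperation() {
        screen.showSplit(firstInterface, "second interface");
    }

    /** Second operation (an edge swipe): save the first interface's image and show its thumbnail. */
    public void onSecondOperation() {
        savedFirstImage = "image of " + firstInterface;
        screen.showSplit(firstInterface, "second interface + thumbnail of " + savedFirstImage);
    }

    public String savedFirstImage() {
        return savedFirstImage;
    }
}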
  • a text description corresponding to the thumbnail of the first image is also displayed on the second interface.
  • When the instructions are invoked and executed by the one or more processors 802, the electronic device is caused to further perform the following step:
  • A third operation acting on the thumbnail of the first image is detected, and the first image is displayed in the second area of the display screen 801.
  • The first image is an image saved when the user's operation on the first interface reaches a first progress. When the instructions are invoked and executed by the one or more processors 802, the electronic device further performs the following step:
  • In response to the second operation, the first progress corresponding to the first image is saved.
  • When the instructions are invoked and executed by the one or more processors 802, the electronic device further performs the following steps after the first image is displayed in the second area:
  • A fourth operation acting on the first interface is detected; in response to the fourth operation, a third interface is displayed in the first area of the display screen, where the image corresponding to the third interface is a second image, the second image is the image corresponding to a second progress reached when the user operates the first interface, and the second progress is a progress after the first progress; a fifth operation acting on the first image is detected; and in response to the fifth operation, the first image is displayed in the first area of the display screen.
  • The first operation includes: an operation of starting a preset working mode from the notification bar of the display screen.
  • the second operation includes: a sliding operation from top to bottom in the first area, or a sliding operation from bottom to top in the first area.
  • the third operation includes: a tap operation and a gesture expansion operation.
  • the fifth operation includes: a sliding operation from the second area to the first area.
  • The second interface of the display screen 801 is further used to display operating instructions of a preset working mode, where the operating instructions of the preset working mode include at least one of the function descriptions respectively corresponding to the first operation, the second operation, the third operation, and the fifth operation.
  • When the display screen 801 is a folding screen, the first interface of the first application is displayed on the main screen of the folding screen 801, and the second interface of the second application is displayed on the secondary screen of the folding screen.
  • When the instructions are invoked and executed by the one or more processors 802, the electronic device further performs the following steps after responding to the first operation:
  • The first interface is displayed on the main screen of the folding screen, and the second interface and the operating instructions of the preset working mode are displayed on the secondary screen of the folding screen; or the first interface and the operating instructions of the preset working mode are displayed on the main screen of the folding screen, and the second interface is displayed on the secondary screen of the folding screen; or the first interface is displayed on the main screen of the folding screen, and the operating instructions of the preset working mode are displayed on the secondary screen of the folding screen; or the operating instructions of the preset working mode are displayed on the main screen of the folding screen, and the second interface is displayed on the secondary screen of the folding screen.
  • The processor 802 may be a general-purpose processor, a digital signal processor, an application-specific integrated circuit, a field programmable gate array or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and can implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of this application.
  • the general-purpose processor may be a microprocessor or any conventional processor or the like.
  • the steps of the method disclosed in combination with the embodiments of the present application may be directly embodied as being executed and completed by a hardware processor, or executed and completed by a combination of hardware and software modules in the processor.
  • the software module may be located in the memory 803, and the processor 802 reads the program instructions in the memory 803, and completes the steps of the foregoing method in combination with its hardware.
  • The memory 803 may be a non-volatile memory, such as a hard disk drive (HDD) or a solid-state drive (SSD), or a volatile memory, such as a RAM.
  • The memory may also be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
  • the memory in the embodiments of the present application may also be a circuit or any other device capable of realizing a storage function for storing instructions and/or data.
  • the present application also provides a computer storage medium in which a computer program is stored, and when the computer program is executed by a computer, the computer executes the display method provided in the above embodiment.
  • the embodiments of the present application also provide a computer program product, including instructions, which when run on a computer, cause the computer to execute the display method provided in the above embodiments.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus, and the instruction apparatus implements the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions may also be loaded onto a computer or other programmable data processing equipment, so that a series of operation steps are executed on the computer or the other programmable equipment to produce computer-implemented processing, and the instructions executed on the computer or the other programmable equipment thus provide steps for implementing the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A display method and an electronic device, applied to an electronic device having a display screen. The method includes: displaying a first interface of a first application; detecting a first operation of a user; in response to the first operation, displaying the first interface in a first area of the display screen and displaying a second interface in a second area of the display screen; detecting a second operation of the user; and in response to the second operation, saving a first image corresponding to the first interface and displaying a thumbnail of the first image on the second interface. With the method of this application, contents can be browsed side by side for comparison and the user does not need to open multiple applications, which simplifies the operation steps and improves the user experience.

Description

一种显示方法及电子设备
相关申请的交叉引用
本申请要求在2020年02月04日提交中国专利局、申请号为202010079955.6、申请名称为“一种显示方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,尤其涉及一种显示方法及电子设备。
背景技术
为了提升视觉体验,各种电子设备的屏幕越来越大。针对大屏电子设备,往往可以将电子设备的屏幕分为两个区域,能满足用户可以对照两个区域内容的需求。
目前,用户可以通过电子设备提供的分屏模式,实现在电子设备上同时显示两个应用的窗口,从而对照两个应用的内容。例如,用户正在阅读文档的摘要,此时可以通过手势操作对当前页面的摘要内容进行截图,当截图完成后,如果用户想要对照着摘要继续阅读文档后面的内容,则需要通过手势触发,进入分屏模式,然后用户需要在分屏模式下点击图库,在图库中找到摘要的截图,以实现文档的对照浏览。上述这种方法的操作过程比较繁琐。
发明内容
本申请提供一种显示方法及电子设备,用以简化实现对照浏览的操作步骤,提升操作的效率和用户体验。
第一方面,本发明提供一种显示方法,该方法可应用于具有显示屏的电子设备,该方法包括:显示第一应用的第一界面;检测到用户的第一操作;响应于所述第一操作,在所述显示屏的第一区域显示所述第一界面,在所述显示屏的第二区域显示第二界面;检测到用户的第二操作;响应于所述第二操作,保存所述第一界面对应的第一图像,并将所述第一图像的缩略图显示在所述第二界面。
在本申请中,电子设备的显示屏上显示一个应用A的界面,然后电子设备的显示屏可通过用户的第一操作,被划分为两个区域,也就是说,第一操作可以启动将显示屏分为两个显示区域的工作模式,当启动该模式后,电子设备的显示屏上可显示界面1和界面2。例如,界面2为开启该模式之前显示的应用A的界面,然后电子设备可响应用户在界面2上的操作,将界面2对应的图像进行保存,并且将界面2的图像以缩略图的形式显示在界面1,这样用户可以直接将界面2和界面1中的缩略图进行对照,无需打开图库再找到缩略图,能够提高用户的操作效率,进而提升用户体验。
需要说明的是,用户通常在画图或者阅读文档等这些场景下,为了方便操作或者浏览,可能需要对照着两个界面,因此,本申请中可在使用一些与上述场景强相关的应用程序,例如一些办公软件、画图软件、游戏软件等时,需要开启该模式,以实现两个界面的对照, 方便用户操作。当然,可以理解的是,其它的应用程序也可以应用该模式,本申请对此不作限定。
在一种可能的实施方式中,所述第二界面上还显示所述第一图像的缩略图对应的文字描述。
在上述技术方案中,第二界面上还可以显示第一图像的缩略图对应的文字描述,这样可以便于用户清楚的了解缩略图对应的内容,能够方便用户后续的操作,提升用户体验。
在一种可能的实施方式中,所述方法还包括:检测到作用于所述第一图像的缩略图的第三操作,第二区域显示所述第一图像。
在上述技术方案中,当检测到用户对第一图像的缩略图的第三操作时,可在第二区域显示第一图像,这样能够方便用户对照浏览。
示例性的,用户可以点击第一图像的缩略图,将第一图像的缩略图全屏显示在第二界面,或者用户可以通过手势扩张的操作将第一图像的缩略图全屏显示在第二界面上,这样用户在第一界面上继续操作时,可以对照着第二界面上全屏显示的缩略图。
在一种可能的实施方式中,所述方法还包括:响应于第二操作,保存所述第一图像对应的第一进度。
需要说明的是,所述第一图像为用户操作所述第一界面到达第一进度时保存的图像。
在该技术方案中,电子设备可以响应用户的操作,保存第一界面上第一图像对应的进度,这样可以方便用户在第一界面上操作的同时对照第二界面上的进度,以便于在需要调整时能够及时调整。
当然,可以理解的是,用户在保存第一界面对应的第一图像时可以将第一界面上的所有内容进行保存,也可以仅保存第一界面上的关键信息。示例性的,当用户在第一界面上进行画图时,第一界面上的关键信息可以理解为用户在第一界面上每一步操作所形成的图案,也就是说用户在第一界面上画图的进度。
在一种可能的实施方式中,在所述第二区域显示所述第一图像之后,所述方法还包括:
检测到作用于所述第一界面的第四操作;响应于所述第四操作,在所述显示屏的第一区域显示第三界面,所述第三界面对应的图像为第二图像,所述第二图像为用户操作所述第一界面到达第二进度时对应的图像,且所述第二进度为所述第一进度之后的进度;检测到作用于所述第一图像的第五操作;响应于所述第五操作,在所述显示屏的第一区域显示第一图像。
也就是说,当第二区域显示第一图像之后,用户可以继续在第一界面上进行操作,例如可对第一图像进行下一步操作,已形成更新的图像,由于更新的图像是在第一图像之后进行的下一步操作,则可以理解为更新的图像的进度为在第一图像的进度之后的进度。若更新的图像用户觉得不满意,想要从第一图像重新开始操作,则电子设备可以响应用户的对第一图像的第五操作,将当前的图像进度恢复到第一图像的进度,以方便用户能够从第一图像的进度处进行操作,进而提升用户体验。
示例性的,假设用户在第一界面上利用某个画图应用进行画图,第一界面上显示的是第一图像,将第一图像的进度记为进度1,然后将进度1保存在第二界面上,用户可以继续在第一界面上进行画图,例如用户接着第一图像的进度向下继续画了两步,形成了第二图像,例如可将第二图像的进度记为进度2,此时如果用户想要从进度1重新画图,则可以对第二界面上显示的进度1的缩略图进行操作,将进度1的缩略图对应的第一图像恢复 到第一界面上,这样可以便于用户随时保存并且恢复进度,能够提升用户体验。
在一种可能的实施方式中,所述第一操作包括:从所述显示屏的通知栏,启动预设工作模式的操作。需要说明的是,该预设工作模式在本申请中记为“双屏联动模式”。
在本申请中,用户可以通过多种方式来启动预设工作模式,以使显示屏划分为两个显示区域。示例性的,可以通过状态栏、语音指令或者在“设置”界面中开启双屏联动模式,即将显示屏分为两个显示区域,第一区域显示第一界面,第二区域显示第二界面。需要说明的是,开启该模式之后,显示屏的其中一个区域显示该模式的操作说明,另一个区域可以显示电子设备的主界面或者电子设备在开启该模式之前的界面等。
作为一种示例,用户可以通过状态栏,例如从显示屏的顶部向下滑动,打开状态栏,通过状态栏中的开关,从而启动双屏联动模式。作为又一种示例,用户也可以通过语音唤醒,例如通过“小艺小艺”唤醒语音助手,然后在语音助手中输入“请帮我打开双屏联动模式”,然后电子设备可响应该语音指令,开启双屏联动模式。可以理解的是,上述示例仅是一种示意性说明,本申请对此不作限定。
在一种可能的实施方式中,所述第二操作包括:在所述第一区域从上向下的滑动操作,或者在所述第一区域从下向上的滑动操作。
需要说明的是,本申请中对于第一区域和第二区域的位置关系并不作限定,例如第一区域可以位于第二区域的左边,第一区域也可以位于第二区域的右边等。
示例性的,假设用户对第一界面上的内容进行保存,且第一界面位于第二界面的右边,则用户可以通过沿着显示屏的右边缘从上向下的滑动操作或者沿着显示屏的右边缘从下向上的滑动操作对第一界面上的内容进行保存。当然,可以理解的是,对第一界面的内容进行保存时可以保存第一界面上的所有内容,也可以保存第一界面上的部分内容。
在一种可能的实施方式中,所述第三操作包括:点击操作、手势扩张操作。
在本申请中,用户可以通过点击操作或者手势扩张操作将第二界面上显示的缩略图在第二界面上进行全屏显示。当然,将缩略图在第二界面上进行全屏显示的手势操作并不限于上述举例,只要能够进行全屏显示的手势操作都可包含在本申请中,对此不做限定。
在一种可能的实施方式中,所述第五操作包括:从第二区域向第一区域的滑动操作。
示例性的,用户可以将第二界面上的缩略图从左向右进行滑动,当缩略图滑动到第一界面的边缘或者距离第一界面的边缘有一定距离时,电子设备可将缩略图对应的图像恢复到第一界面上。
在一种可能的实施方式中,所述第二界面上还用于显示预设工作模式的操作说明,所述预设工作模式的操作说明包括第一操作、第二操作、第三操作、第五操作分别对应的功能描述中的至少一项。
在本申请中,当用户执行第一操作之后,显示屏上的第二界面上可显示预设工作模式的操作说明。该第一操作可以理解为开启预设的工作模式。其中,预设工作模式的操作说明可包括:如何开启该模式,如何关闭该模式,在该模式下如何截图,如何保存进度,如何删除缩略图等。用户可以对照着模式操作说明执行对应的操作。
示例性的,第二操作可用于保存第一界面对应的第一图像或者保存第一图像对应的第一进度。在预设工作模式的操作说明中,针对第二操作对应的功能描述可以为:由上向下滑动进行截图,由下向上滑动保存进度。当然,可以理解的是,截图可以理解为保存第一界面对应的第一图像。
在一种可能的实施方式中,当所述显示屏为折叠屏时,在所述折叠屏的主屏上显示所述第一应用的第一界面,在所述折叠屏的副屏上显示第二应用的第二界面。
在本申请中,当显示屏为折叠屏时,例如可将显示屏可分为主屏和副屏,然后主屏上可显示第一应用的第一界面,副屏上可显示第二应用的第二界面。
在一种可能的实施方式中,当显示屏为折叠屏时,响应于所述第一操作之后,至少可以包括以下几种情况:
第一种:在所述折叠屏的主屏上显示所述第一界面,在所述折叠屏的副屏上显示所述第二界面以及预设工作模式的操作说明。
第二种:在所述折叠屏的主屏上显示所述第一界面以及预设工作模式的操作说明,在所述折叠屏的副屏上显示所述第二界面。
第三种:在所述折叠屏的主屏上显示所述第一界面,在所述折叠屏的副屏上显示预设工作模式的操作说明。
第四种:在所述折叠屏的主屏上显示预设工作模式的操作说明,在所述折叠屏的副屏上显示所述第二界面。
需要说明的是,当显示屏为折叠屏时,显示方式并不限于上述四种情况,也可以同时在主副屏上应用该模式,本申请对此不作限定。
第二方面,本申请还提供一种电子设备,该电子设备包括显示屏;一个或多个处理器;一个或多个存储器;一个或多个传感器;多个应用;以及一个或多个计算机程序;其中所述一个或多个计算机程序被存储在所述一个或多个存储器中,所述一个或多个计算机程序包括指令,当所述指令被所述一个或多个处理器调用执行时,使得所述电子设备执行上述第一方面及其第一方面任一可能设计的技术方案。
第三方面,本申请还提供一种电子设备,该电子设备包括执行第一方面或者第一方面的任意一种可能的设计的方法的模块/单元;这些模块/单元可以通过硬件实现,也可以通过硬件执行相应的软件实现。
第四方面,本申请实施例还提供一种芯片,所述芯片与电子设备中的存储器耦合,执行本申请实施例第一方面及其第一方面任一可能设计的技术方案;本申请实施例中“耦合”是指两个部件彼此直接或间接地结合。
第五方面,本申请实施例的一种计算机可读存储介质,所述计算机可读存储介质包括计算机程序,当计算机程序在电子设备上运行时,使得所述电子设备执行本申请实施例第一方面及其第一方面任一可能设计的技术方案。
第六方面,本申请实施例的中一种计算机程序产品,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行本申请实施例第一方面及其第一方面任一可能设计的技术方案。
附图说明
图1为本申请实施例提供的一种手机100的硬件结构图;
图2为本申请实施例提供的一种折叠屏的示意图;
图3A为本申请实施例提供的一种软件架构示意图;
图3B为本申请实施例提供的一种安卓操作系统数据流向示意图;
图4为本申请实施例提供的方法流程示意图;
图5A为本申请实施例提供的一种双屏联动模式设置界面示意图;
图5B为本申请实施例提供的一组手机100的界面示意图;
图5C~图5G为本申请实施例提供的手机100的界面示意图;
图6A~图6B为本申请实施例提供的手机100的界面示意图;
图7A~图7C为本申请实施例提供的手机100的界面示意图;
图8为本申请实施例提供的另一种电子设备的示意图。
具体实施方式
下面将结合本申请以下实施例中的附图,对本申请实施例中的技术方案进行详尽描述。
以下,首先对本申请实施例中的部分用语进行解释说明,以便于本领域技术人员理解。
本申请实施例涉及的应用程序(application,App),简称应用,为能够实现某项或多项特定功能的软件程序。通常,电子设备中可以安装多个应用。比如,相机应用、短信应用、邮箱应用、微信(WeChat)、WhatsApp Messenger、连我(Line)、照片分享(instagram)、Kakao Talk、钉钉等。下文中提到的应用,可以是电子设备出厂时已安装的应用,也可以是用户在使用电子设备的过程中从网络下载或其他电子设备获取的应用。
需要说明的是,本申请实施例提供的方法,可以适用于具有显示屏、且具有能够将显示屏划分为两个区域分别显示的电子设备,诸如手机、平板电脑、可穿戴设备(例如,手表、手环、智能头盔等)、车载设备、智能家居、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等,本申请实施例不作限定。本申请实施例涉及到的电子设备也可以是可折叠式的电子设备,比如可折叠式手机,可折叠式平板电脑等,本申请对此不作限定。
下面以手机为例,介绍手机的结构。
如图1所示,手机100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。其中,控制器可以是手机100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为手机100充电,也可以用于手机100与外围设备之间传输数据。充电管理模块140用于从充电器接收充电输入。电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。
手机100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。天线1和天线2用于发射和接收电磁波信号。手机100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在手机100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
无线通信模块160可以提供应用在手机100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,手机100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得手机100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
显示屏194用于显示应用的显示界面等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode, OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,手机100可以包括1个或N个显示屏194,N为大于1的正整数。
当显示屏194为柔性屏幕时,用户可对显示屏194进行折叠。例如参阅图2所示,当用户折叠显示屏194之后,显示屏194可被划分为两个显示区域,即显示区域1和显示区域2。并且显示屏194被划分出的两个显示区域之间呈一定夹角β。为了便于描述,可以将显示区域1称为手机100的主屏,将显示区域2称为手机100的副屏。主屏和副屏的显示面积可以相同或不同。
可以理解的是,用户可以沿显示屏194中的一条或多条折叠线折叠屏幕。其中,折叠线的位置可以是预先设置的,也可以是用户在显示屏194中任意选择的。
在本申请中,当主屏和副屏之间的夹角β大于设定阈值(例如170°)时,手机100可确定显示屏194处于展开状态。当主屏和副屏之间的夹角β小于设定阈值时,手机100可确定显示屏194处于折叠状态。本申请实施例中可将显示屏194的物理形态划分为展开状态和非展开状态。
摄像头193用于捕获静态图像或视频。摄像头193可以包括前置摄像头和后置摄像头。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行手机100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,以及至少一个应用程序(例如爱奇艺应用,微信应用等)的软件代码等。存储数据区可存储手机100使用过程中所产生的数据(例如图像、视频等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将图片,视频等文件保存在外部存储卡中。
手机100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
触摸传感器180A,也称“触控面板”。触摸传感器180A可以设置于显示屏194,由触摸传感器180A与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180A用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180A也可以设置于手机100的表面,与显示屏194所处的位置不同。
需要说明的是,手机100中的传感器还可包括压力传感器180B、陀螺仪传感器180C、气压传感器180D、磁传感器180E、加速度传感器180F、距离传感器180G、接近光传感器180H、温度传感器180J、骨传导传感器180L等,当然,手机100还可包括按键190(例如开机键,音量键等)、马达191、指示器192、SIM卡接口195等。
可以理解的是,图1所示的部件并不构成对手机的具体限定,手机还可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。以下的实施例中,以图1所示的手机100为例进行介绍。
上述手机100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明手机100的软件结构。
图3A是本申请实施例的手机100的软件结构框图。分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。如图3A所示,应用程序层内可以安装图库,日历,通话,地图,导航,System UI,第一区域Activity、第二区域Activity等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图3A所示,应用程序框架层可以包括显示策略服务、截屏服务、显示管理服务(display manager service,DMS)。当然,应用程序框架层中还可以包括活动管理器、窗口管理服务(WMS),内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器,为窗口提供窗口管理服务(window manager service),以对显示屏显示的界面进行控制管理。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用的界面。
电话管理器用于提供电子设备的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
Android runtime包括核心库和虚拟机。Android runtime是Android操作系统的运行时环境,负责Android操作系统的调度和管理。
其中,核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是Android操作系统的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:状态检测模块、手势识别模块、截屏服务模块、媒体库(media libraries),表面管理器、图像处理库等。
状态检测模块,用于对电子设备的显示屏的物理形态进行识别。例如,状态检测模块可以用于根据硬件层中各类传感器上传的传感器数据确定该显示屏的物理形态。
手势识别模块,用于对用户在电子设备的显示屏上的手势操作进行识别。例如手势操作包括用户在显示屏上的触摸操作,例如滑动操作、点击操作、按压操作等。
媒体库支持多种格式的音频、视频的回放和录制,以及支持打开多种格式的静态图像等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的 融合。
图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。2D图形引擎是2D绘图的绘图引擎。
内核层是电子设备的硬件和软件之间的层。内核层至少包含显示驱动,传感器驱动、摄像头驱动,音频驱动等,用于驱动硬件层中的硬件。
硬件层可以包括各类传感器(例如触摸传感器等)、显示屏、摄像头等。
如图3B所示,为本申请实施例提供的一种安卓操作系统数据流向示意图。示例性的,System UI可对手势识别模块进行监听,当传感器检测到数据之后,可将检测到的数据上报给传感器驱动,然后传感器驱动再将检测到的数据上报给手势识别模块。当手势识别模块识别出手势操作之后,System UI可根据手势识别模块识别出的结果调用相应的服务,例如当手势识别模块识别出手势操作为对第二区域Activity的截屏操作时,可调用截屏服务,对第二区域Activity进行截屏,然后再将截屏之后的缩略图发送给第一区域Activity。
需要说明的是,本申请中的第一区域Activity表示显示屏194上的第一区域的显示界面,第二区域Activity表示显示屏194上的第二区域的显示界面,System UI表示系统级的应用,功能包括:状态栏信息显示、任务栏显示面板、通知面板等。
以下实施例以应用在图1所示的手机100所示的架构中为例进行描述。
此外,下述实施例涉及的至少一个,包括一个或者多个;其中,多个是指大于或者等于两个。另外,需要理解的是,在本申请的描述中,“第一”、“第二”等词汇,仅用于区分描述的目的。
以下,将结合附图对本申请实施例提供的一种显示方法进行具体介绍。
如图4所示,为本申请实施例提供的方法流程图,参阅图4所示,该方法可包括如下步骤:
步骤401:设置双屏联动模式。
本申请实施例中,手机100设置一种新的工作模式,在该工作模式下,手机100的显示屏可划分为两个区域,即显示屏上可以同时显示两个窗口。为了描述方便,以下将该工作模式记为“双屏联动模式”,该工作模式也可为其他名称,本申请对此不做限定。
需要说明的是,在双屏联动模式下,手机100的显示屏划分为两个区域,并且在该模式启动之后的初始状态下,一个区域显示该双屏联动模式的操作说明,例如记为第一界面,另一个区域显示手机100的第二界面,例如可以为某个应用程序的操作界面,或者手机100的主界面等。
如图5A所示,为本申请实施例提供的一种双屏联动模式设置界面示意图,例如用户可点击手机上的“设置”应用,在设置界面上找到“系统和更新”界面500,然后在该界面上点击“双屏联动模式”501,进入“双屏联动模式”界面510,该界面上可显示双屏联动模式下的界面示意图,点击“应用”502,此时手机100可进入双屏联动模式。可以理解的是,“应用”502用于开启双屏联动模式。
需要说明的是,本申请实施例中,当手机100处于双屏联动模式时,显示屏被划分的两个区域的大小可以相同,也可以不同,本申请对此不作限定。
当然,可以理解的是,双屏联动模式的开关也可以设置在显示屏上的通知栏,用户可以通过手势操作(例如从屏幕的顶部向下滑动等)打开通知栏,然后直接点击通知栏的开关以启动双屏联动模式,或者用户也可以通过某个手势操作来启动该双屏联动模式等,本 申请中对此不作限定。
步骤402:显示屏的第一区域显示模式操作说明,第二区域显示手机界面。
以下为了描述方便,可将双屏联动模式下显示屏被划分的两个区域分别记为“第一区域”和“第二区域”。可以理解的是,第一区域和第二区域仅是为了区分显示屏上的两个区域,本申请中并不限定第一区域和第二区域的位置关系,第一区域和第二区域可以左右并排设置(如图5B左下图),或者上下设置(即,其中一个位于另一个上方)等,本申请对此不作限定。
在一些实施例中,将手机100设置为双屏联动模式之后,显示屏可被划分为第一区域和第二区域。其中,第一区域可显示该模式的操作说明,即第一界面,第二区域上可显示手机100的第二界面。需要说明的是,双屏联动模式的操作说明包括但不限于如下内容:在该模式下如何截图,如何删除截图,如何保存任务进度,如何退出等。例如图5B左下图显示的:1、向下滑动进行截图;2、点击全屏显示的“x”,退出全屏显示;3、点击缩略图的“x”,删除缩略图;4、向上滑动保存任务进度。
在另一些实施例中,第一区域也可以显示缩略图,所述缩略图的内容为第二界面的截图或者第二界面中的关键信息。可以理解的是,第二界面中的关键信息可以为第二界面中用户的操作步骤信息,例如用户在画图时每一步操作所形成的图案。
需要说明的是,当第一区域显示的是第二界面的截图时,该截图上可包括手机100的状态栏中的信息,例如时间信息、电量信息等。当第一区域显示的是第二界面中的关键信息时,缩略图中不显示手机100的状态栏中的信息。
作为一种可能的实现方式,在开启双屏联动模式之后,第二界面可以默认显示手机100的主界面,例如可参阅图5B所示。即不管在开启该模式之前,手机100的界面显示的是主界面还是显示的是某个应用程序的界面等,当开启该模式后,第二区域可直接显示手机100的主界面。
作为另一种可能的实现方式,在开启双屏联动模式之后,第二界面可以为用户在开启该模式之前正在显示的界面。例如,在开启该模式之前,手机100的界面如图5C所示的图库应用的界面520,则在开启该模式之后,第二区域显示的内容为该界面520。
需要说明的是,本申请中当用户从屏幕的顶部向下滑动打开通知栏,然后直接点击通知栏的开关启动双屏联动模式时,手机100显示的初始界面(打开该开关之前的界面)可显示在该通知栏的下层。可以理解的是,本申请附图中仅是一种示意,用户也可以通过语音控制的方式打开通知栏,或者也可以通过隔空操作(例如在屏幕上方通过某个手势操作)打开通知栏,或者也可以通过语音控制的方式启动双屏联动模式,或者通过隔空操作启动双屏联动模式等,本申请对此不作限定。
作为又一种可能的实现方式,当手机100为折叠屏手机时,以折叠屏手机的显示屏包括主屏和副屏(主屏位于副屏右边)为例,可包括如下几种情况:
第一种:在开启双屏联动模式之后,可以默认在副屏上应用该模式,主屏继续显示原来的界面(即开启双屏联动模式之前主屏正在显示的界面)。即在开启双屏联动模式之后,可将副屏划分为第一区域和第二区域,也就是说第二区域可以显示在副屏上,例如可参阅图5D所示。
第二种:在开启双屏联动模式之后,可以默认在主屏上应用该模式,副屏继续显示原来的界面(即开启双屏联动模式之前主屏正在显示的界面)。即在开启双屏联动模式之后, 可将主屏划分为第一区域和第二区域,也就是说第二区域可以显示在主屏上,例如可参阅图5E所示。
第三种:在开启双屏联动模式之后,副屏可以替换显示该模式的操作说明,主屏继续显示原来的界面(即开启双屏联动模式之前主屏正在显示的界面),例如可参阅图5F所示。
第四种:在开启双屏联动模式之后,主屏可以替换显示该模式的操作说明,副屏继续显示原来的界面(即开启双屏联动模式之前副屏正在显示的界面),例如可参阅图5G所示。
当然,可以理解的是,当手机为折叠屏手机时,开启双屏联动模式之后,也可以默认同时在主副屏上应用该模式,即同时将主屏和副屏分别划分为两个区域,本申请对此不作限定。
需要说明的是,在本申请中如果要退出双屏联动模式,可通过通知栏上的开关关闭该模式,或者进入设置关闭该模式即可,本申请对此不作限定。
步骤403:检测用户在第二区域的手势操作。
在本申请一些实施例中,手机100的硬件层中的传感器可将检测到的数据上报给驱动层的硬件驱动,然后驱动层的硬件驱动再将传感器检测到的数据上报给手势识别模块。示例性的,触摸传感器可在检测用户在第二区域的滑动操作对应的事件,将该滑动操作对应的事件上报给驱动层的传感器驱动,由传感器驱动将触摸传感器检测到的滑动操作上报给手势识别模块。
步骤404:若用户在第二区域的手势操作为由上向下的滑动操作,则对第二区域的内容进行截图,并将截图显示在第一区域。
以第二区域显示手机100的主界面为例,参阅图6A所示为本申请实施例提供的一组界面示意图,当手机100进入双屏联动模式之后,手机100的显示屏上可显示第一界面和第二界面,例如图6A中的(a)所示。其中,第一界面上显示双屏联动模式下的操作说明,第二界面上显示手机100的主界面。需要说明的是,图6A中所示的针对双屏联动模式的操作说明并不限于上述示意图所列出的说明,本申请对此不作限定。
假设用户在第二界面上点击应用图库,则可在第二界面上显示图库中的照片,例如图6A中(b)所示,当用户点击其中的一张照片(以下称为“照片1”)后,可在第二界面上全屏显示该照片1,例如图6A中(c)所示。若用户在第二界面的右边缘从上向下滑动,例如图6A中(d)所示,手机100的触摸传感器可在检测到该沿着显示屏右边缘从上向下的手势操作时,由手势识别模块识别出该手势操作为在第二界面上的滑动操作,并且该滑动操作为截图操作。在本申请中,当System UI监听到手势识别模块识别出的手势操作为对第二界面的截图操作时,可获取第二区域当前显示的界面,并调用应用程序框架层的截屏服务,对获取到的第二区域当前显示的界面(即第二界面)进行截图,得到第二界面的缩略图1(以下称为“截图1”),然后System UI可将截图1发送给第一区域Activity,例如参阅图6A中(d)的第一界面所示。
需要说明的是,用户可以根据自身实际需求将想要截取的屏幕进行截图,手机依次将这些截图显示在第一界面上,即第一界面上可显示至少一个截图。
接着,用户可在第二界面上继续滑动照片,例如图6B中(e)所示,用户在第二界面上从右向左滑动页面,比如滑动到下一张图片(以下称为“照片2”)时,例如图6B中(f)所示,如果用户想要对照浏览照片2和照片1,则用户可点击第一界面上的截图1,例如图6B中的(g)所示,然后第一界面上可全屏显示用户点击的截图1,第二界面上显示照 片2,例如图6B中的(h)所示。
在本申请一些实施例中,图6A中的(d)、图6B中的(e)~(h)的第一界面中,显示的截图的右上角有按钮“x”,用户可通过点击该按钮“x”退出全屏或者删除截图。例如,在图6A中的(d)所示的示意图中,用户点击“x”,则可删除该截图。如果用户在图6B中的(h)所示的示意图中点击第一界面上全屏显示的截图1,则可将截图1退出全屏显示。当然,可以理解的是,全屏显示也可以通过手势操作将截图进行放大或者缩小。
在本申请另一些实施例中,可对第一界面上显示的截图添加标签,以用于区分各个截图,例如上述示意图中在缩略图的左上方添加数字标签,例如1、2等,用来区分各个截图。当然,可在截图的左上角添加时间标签等,并且添加的标签也可以为其它形式,标签的位置也可以为其它位置,例如也可以添加数字标签来区分各个截图,本申请中对此不作限定。
需要说明的是,截图上添加的标签可以显示,也可以不显示,本申请对此不作限定。示例性的,当截图上添加的标签显示时,可参阅图6A的(d)所示的第一界面上的数字标签1。
在本申请实施例中,可直接将缩略图显示在第一区域上,这样可实现第一区域和第二区域的内容对照,并且无需用户打开两个应用,从而简化了操作步骤。
步骤405:若用户在第二区域的手势操作为由下向上的滑动操作,则将第二区域的关键信息进行保存,并显示在第一区域。需要说明的是,第二区域的关键信息可以理解为下面实施例中所涉及到的用户绘制的多个图案。
以第二区域显示的是用户在开启该模式之前的界面为例,参阅图7A所示为本申请实施例提供的一组界面示意图。当手机100开启双屏联动模式之后,手机100的显示屏上可显示第一界面和第二界面,例如图7A中(a)所示。其中,第一界面显示双屏联动模式的操作说明,第二界面显示开启该模式之前的界面,例如某一个画图应用的界面。
当用户在图7A中(a)所示示意图中点击“添加”701按钮,用户可新建一个草稿,例如此时用户可以在新建的草稿中进行编辑,假设用户在新建的草稿中绘制了如图7A中(b)所示的图案(以下称为“图案1”),则此时如果用户想要将当前的图案1进行保存,则可根据双屏联动模式下的操作说明中的手势操作,例如,图7A中的(b)所示,沿着显示屏右边缘由下向上滑动的滑动操作。手机100的硬件层中的传感器当检测到该沿着显示屏右边缘由下向上滑动的滑动操作之后,由手势识别模块识别出该沿着显示屏右边缘由下向上滑动的滑动操作为保存任务进度的手势操作。当System UI监听到该手势操作之后,可获取第二区域上当前的任务进度,并调用框架层的任务管理服务,然后通过任务管理服务中的保存Activity状态的接口对获取到的第二区域上当前的任务进度(即第二界面上的图案1的进度)进行进度保存(例如可将保存的图案1的进度记为进度1)。然后System UI可将进度1发送给第一区域Activity,例如图7A中的(c)所示。
需要说明的是,进度保存可以理解为对用户在界面上的操作步骤进行保存。
接着,用户可以继续在第二界面上进行下一步操作,假设用户在第二界面上画了下一笔以形成更新的图案,例如图7A中(d)所示的第二界面上的图案2,此时如果用户想要保存当前的图案2的进度(例如记为进度2),则可沿着显示屏右边缘由下向上滑动,例如图7B中(e)所示,然后手势识别模块可识别出该沿着显示屏右边缘由下向上滑动的滑动操作为保存任务进度的手势操作。当System UI监听到该手势操作之后,可获取第二区域 上当前的任务进度(即进度2),并调用框架层的任务管理服务,然后通过任务管理服务中的保存Activity状态的接口对获取到的第二区域上当前的任务进度(即进度2)进行进度保存。然后System UI可将进度2发送给第一区域Activity,例如可参阅图7B中(f)所示的第一界面,用户可继续在第二界面上进行下一步操作,例如操作之后的界面如图7B中(g)所示。
步骤4051:若用户在第一区域上的手势操作为由左向右的滑动操作,则将第一区域的内容恢复显示在第二区域。
接着,用户在第二界面继续操作的时候,如果想要恢复到之前的步骤,比如想要恢复到进度1,则本申请实施例中可采用如下两种方式:
A方式:参阅图7B中的(h1)所示,用户可在第一界面上点击进度1,并将进度1从左向右滑动,手机100的传感器检测到用户在第一界面上从左向右的滑动操作之后,手势识别模块可识别出该从左向右的滑动操作为恢复任务进度的操作。当System UI监听到该从左向右的滑动操作之后,可调用框架层的任务管理服务,并通过恢复Activity状态的接口将进度1恢复到第二界面,即第二界面上显示进度1的操作界面,例如下面的图7C中的(k)所示。
B方式:用户可在第一界面上点击进度1,参阅图7C中的(h2)所示,然后第一界面上可全屏显示进度1,参阅图7C中的(i)所示,此时用户在图7C中的(i)所示的第一界面上,将进度1从左向右滑动,手机100的传感器检测到用户在第一界面上从左向右的滑动操作之后,手势识别模块可识别出该从左向右的滑动操作为恢复任务进度的操作。当System UI监听到该从左向右的滑动操作之后,可调用框架层的任务管理服务,并通过恢复Activity状态的接口将进度1恢复到第二界面,即第二界面上显示进度1的操作界面,即第二界面上恢复显示进度1的操作界面,例如图7C中的(k)所示。
需要说明的是,在本申请实施例中,如果用户想要恢复之前的进度,那么在将之前的进度恢复之后,之前的进度仍然可以保存显示在第一界面,(例如可以以副本的形式保存在第一界面),这样也可以方便用户后续的操作,比如,用户将之前的进度(比如进度1)恢复之后,继续操作,当进行到某一步时又想从进度1重新开始,则还可以将进度1再次恢复到第二界面。
在本申请中,通过上述方法,可以将用户操作过程中的进度进行对照,并且可以随时恢复到之前的步骤,提升了用户体验。
为了实现上述本申请实施例提供的方法中的各功能,移动终端设备可以包括硬件结构和/或软件模块,以硬件结构、软件模块、或硬件结构加软件模块的形式来实现上述各功能。上述各功能中的某个功能以硬件结构、软件模块、还是硬件结构加软件模块的方式来执行,取决于技术方案的特定应用和设计约束条件。
上述本申请提供的实施例中,从电子设备作为执行主体的角度对本申请实施例提供的方法进行了介绍。为了实现上述本申请实施例提供的方法中的各功能,终端设备可以包括硬件结构和/或软件模块,以硬件结构、软件模块、或硬件结构加软件模块的形式来实现上述各功能。上述各功能中的某个功能以硬件结构、软件模块、还是硬件结构加软件模块的方式来执行,取决于技术方案的特定应用和设计约束条件。
如图8所示,本申请另外一些实施例公开了一种电子设备,该电子设备可以是具有显示屏的电子设备。参阅图8所示,所述电子设备800包括:显示屏801;一个或多个处理 器802;一个或多个存储器803;一个或多个传感器804、多个应用805(图中未示出);以及一个或多个计算机程序806(图中未示出),上述各器件可以通过一个或多个通信总线807连接。
其中,显示屏801用于显示主界面,或者电子设备中的应用的显示界面,或者双屏联动模式的操作说明。存储器803中存储有一个或多个计算机程序,所述一个或多个计算机程序包括指令;处理器802调用存储器803中存储的所述指令,使得电子设备800可以执行以下步骤:
显示屏801显示第一应用的第一界面;检测到用户的第一操作;响应于所述第一操作,在所述显示屏801的第一区域显示所述第一界面,在所述显示屏801的第二区域显示第二界面;检测到用户的第二操作;响应于所述第二操作,保存所述第一界面对应的第一图像,并将所述第一图像的缩略图显示在所述第二界面。
一种可能的实现方式中,所述第二界面上还显示所述第一图像的缩略图对应的文字描述。
一种可能的实现方式中,当所述指令被所述一个或多个处理器802调用执行时,使得所述电子设备还执行以下步骤:
检测到作用于所述第一图像的缩略图的第三操作,显示屏801的第二区域显示所述第一图像。
一种可能的实现方式中,所述第一图像为用户操作所述第一界面到达第一进度时保存的图像;当所述指令被所述一个或多个处理器802调用执行时,使得所述电子设备还执行以下步骤:
响应于所述第二操作,保存所述第一图像对应的第一进度。
一种可能的实现方式中,当所述指令被所述一个或多个处理器802调用执行时,使得所述电子设备在所述第二区域显示所述第一图像之后还执行以下步骤:
检测到作用于所述第一界面的第四操作;响应于所述第四操作,在所述显示屏的第一区域显示第三界面,所述第三界面对应的图像为第二图像,所述第二图像为用户操作所述第一界面到达第二进度时对应的图像,且所述第二进度为所述第一进度之后的进度;检测到作用于所述第一图像的第五操作;响应于所述第五操作,在所述显示屏的第一区域显示第一图像。
一种可能的实现方式中,所述第一操作包括:从所述显示屏的通知栏,启动预设工作模式的操作。
一种可能的实现方式中,所述第二操作包括:在所述第一区域从上向下的滑动操作,或者在所述第一区域从下向上的滑动操作。
一种可能的实现方式中,所述第三操作包括:点击操作、手势扩张操作。
一种可能的实现方式中,所述第五操作包括:从第二区域向第一区域的滑动操作。
一种可能的实现方式中,所述显示屏801的第二界面上还用于显示预设工作模式的操作说明,所述预设工作模式的操作说明包括所述第一操作、所述第二操作、所述第三操作、所述第五操作分别对应的功能描述中的至少一项。
一种可能的实现方式中,当所述显示屏801为折叠屏时,在所述折叠屏801的主屏上显示所述第一应用的第一界面,在所述折叠屏的副屏上显示第二应用的第二界面。
一种可能的实现方式中,当所述指令被所述一个或多个处理器802调用执行时,使得 所述电子设备响应于所述第一操作之后还执行以下步骤:
在所述折叠屏的主屏上显示所述第一界面,在所述折叠屏的副屏上显示所述第二界面以及预设工作模式的操作说明;或者在所述折叠屏的主屏上显示所述第一界面以及预设工作模式的操作说明,在所述折叠屏的副屏上显示所述第二界面;或者在所述折叠屏的主屏上显示所述第一界面,在所述折叠屏的副屏上显示预设工作模式的操作说明;或者在所述折叠屏的主屏上显示预设工作模式的操作说明,在所述折叠屏的副屏上显示所述第二界面。
在本申请实施例中,处理器802可以是通用处理器、数字信号处理器、专用集成电路、现场可编程门阵列或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件,可以实现或者执行本申请实施例中的公开的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者任何常规的处理器等。结合本申请实施例所公开的方法的步骤可以直接体现为硬件处理器执行完成,或者用处理器中的硬件及软件模块组合执行完成。软件模块可以位于存储器803中,处理器802读取存储器803中的程序指令,结合其硬件完成上述方法的步骤。
在本申请实施例中,存储器803可以是非易失性存储器,比如硬盘(hard disk drive,HDD)或固态硬盘(solid-state drive,SSD)等,还可以是易失性存储器(volatile memory),例如RAM。存储器还可以是能够用于携带或存储具有指令或数据结构形式的期望的程序代码并能够由计算机存取的任何其他介质,但不限于此。本申请实施例中的存储器还可以是电路或者其它任意能够实现存储功能的装置,用于存储指令和/或数据。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
基于以上实施例,本申请还提供了一种计算机存储介质,所述计算机存储介质中存储有计算机程序,所述计算机程序被计算机执行时,使得所述计算机执行以上实施例提供的显示方法。
本申请实施例中还提供一种计算机程序产品,包括指令,当其在计算机上运行时,使得计算机执行以上实施例提供的显示方法。
本申请实施例是参照根据本申请实施例的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。

Claims (15)

  1. 一种显示方法,应用于具有显示屏的电子设备,其特征在于,包括:
    显示第一应用的第一界面;
    检测到用户的第一操作;
    响应于所述第一操作,在所述显示屏的第一区域显示所述第一界面,在所述显示屏的第二区域显示第二界面;
    检测到用户的第二操作;
    响应于所述第二操作,保存所述第一界面对应的第一图像,并将所述第一图像的缩略图显示在所述第二界面。
  2. 如权利要求1所述的方法,其特征在于,所述第二界面上还显示所述第一图像的缩略图对应的文字描述。
  3. 如权利要求1或2所述的方法,其特征在于,所述方法还包括:
    检测到作用于所述第一图像的缩略图的第三操作,在所述第二区域显示所述第一图像。
  4. 如权利要求3所述的方法,其特征在于,所述第一图像为用户操作所述第一界面到达第一进度时保存的图像;
    所述方法还包括:
    响应于所述第二操作,保存所述第一图像对应的第一进度。
  5. 如权利要求4所述的方法,其特征在于,在所述第二区域显示所述第一图像之后,所述方法还包括:
    检测到作用于所述第一界面的第四操作;
    响应于所述第四操作,在所述显示屏的第一区域显示第三界面,所述第三界面对应的图像为第二图像,所述第二图像为用户操作所述第一界面到达第二进度时对应的图像,且所述第二进度为所述第一进度之后的进度;
    检测到作用于所述第一图像的第五操作;
    响应于所述第五操作,在所述显示屏的第一区域显示第一图像。
  6. 如权利要求1所述的方法,其特征在于,所述第一操作包括:从所述显示屏的通知栏,启动预设工作模式的操作。
  7. 如权利要求1所述的方法,其特征在于,所述第二操作包括:在所述第一区域从上向下的滑动操作,或者在所述第一区域从下向上的滑动操作。
  8. 如权利要求3所述的方法,其特征在于,所述第三操作包括:点击操作、手势扩张操作。
  9. 如权利要求5所述的方法,其特征在于,所述第五操作包括:从第二区域向第一区域的滑动操作。
  10. 如权利要求5所述的方法,其特征在于,所述第二界面上还用于显示预设工作模式的操作说明,所述预设工作模式的操作说明包括所述第一操作、所述第二操作、所述第三操作、所述第五操作分别对应的功能描述中的至少一项。
  11. 如权利要求1所述的方法,其特征在于,当所述显示屏为折叠屏时,在所述折叠屏的主屏上显示所述第一应用的第一界面,在所述折叠屏的副屏上显示第二应用的第二界面。
  12. 如权利要求11所述的方法,其特征在于,响应于所述第一操作之后,所述方法还包括:
    在所述折叠屏的主屏上显示所述第一界面,在所述折叠屏的副屏上显示所述第二界面以及预设工作模式的操作说明;或者
    在所述折叠屏的主屏上显示所述第一界面以及预设工作模式的操作说明,在所述折叠屏的副屏上显示所述第二界面;或者
    在所述折叠屏的主屏上显示所述第一界面,在所述折叠屏的副屏上显示预设工作模式的操作说明;或者
    在所述折叠屏的主屏上显示预设工作模式的操作说明,在所述折叠屏的副屏上显示所述第二界面。
  13. 一种电子设备,其特征在于,所述电子设备包括显示屏;一个或多个处理器;一个或多个存储器;一个或多个传感器;多个应用;以及一个或多个计算机程序;
    其中所述一个或多个计算机程序被存储在所述一个或多个存储器中,所述一个或多个计算机程序包括指令,当所述指令被所述一个或多个处理器调用执行时,使得所述电子设备执行如权利要求1至12任一项所述的显示方法。
  14. 一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求1-12中任一项所述的显示方法。
  15. 一种包含指令的计算机程序产品,其特征在于,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行如权利要求1-12中任一项所述的显示方法。
PCT/CN2021/074694 2020-02-04 2021-02-01 一种显示方法及电子设备 WO2021155770A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21750469.5A EP4086740A4 (en) 2020-02-04 2021-02-01 DISPLAY METHOD AND ELECTRONIC DEVICE
US17/817,087 US20220374118A1 (en) 2020-02-04 2022-08-03 Display Method and Electronic Device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010079955.6A CN111338519B (zh) 2020-02-04 2020-02-04 一种显示方法及电子设备
CN202010079955.6 2020-02-04

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/817,087 Continuation US20220374118A1 (en) 2020-02-04 2022-08-03 Display Method and Electronic Device

Publications (1)

Publication Number Publication Date
WO2021155770A1 true WO2021155770A1 (zh) 2021-08-12

Family

ID=71181468

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/074694 WO2021155770A1 (zh) 2020-02-04 2021-02-01 一种显示方法及电子设备

Country Status (4)

Country Link
US (1) US20220374118A1 (zh)
EP (1) EP4086740A4 (zh)
CN (1) CN111338519B (zh)
WO (1) WO2021155770A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111338519B (zh) * 2020-02-04 2022-05-06 华为技术有限公司 一种显示方法及电子设备
CN114679511A (zh) * 2020-12-24 2022-06-28 荣耀终端有限公司 操作控制方法、装置和电子设备
CN115016694A (zh) * 2021-11-18 2022-09-06 荣耀终端有限公司 一种应用程序启动方法及电子设备
CN116450012A (zh) * 2022-01-10 2023-07-18 荣耀终端有限公司 一种电子设备的控制方法及电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120131479A1 (en) * 2004-06-24 2012-05-24 Apple Inc. Resolution Independent User Interface Design
CN107066171A (zh) * 2011-03-21 2017-08-18 广州市动景计算机科技有限公司 触屏终端的多窗口切换方法和系统
CN108259973A (zh) * 2017-12-20 2018-07-06 青岛海信电器股份有限公司 智能电视及电视画面截图的图形用户界面的显示方法
CN109710127A (zh) * 2018-12-19 2019-05-03 维沃移动通信有限公司 一种截屏方法及移动终端
CN111338519A (zh) * 2020-02-04 2020-06-26 华为技术有限公司 一种显示方法及电子设备

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101788051B1 (ko) * 2011-01-04 2017-10-19 엘지전자 주식회사 이동 단말기 및 그 제어방법
KR20180017746A (ko) * 2016-08-10 2018-02-21 엘지전자 주식회사 이동 단말기 및 그 제어방법
US10783320B2 (en) * 2017-05-16 2020-09-22 Apple Inc. Device, method, and graphical user interface for editing screenshot images
CN108055572A (zh) * 2017-11-29 2018-05-18 努比亚技术有限公司 移动终端的控制方法、移动终端及计算机可读存储介质
CN109271081B (zh) * 2018-07-28 2019-09-20 华为技术有限公司 滚动截屏的方法及电子设备
CN109189534A (zh) * 2018-08-29 2019-01-11 维沃移动通信有限公司 一种截屏方法及终端设备
CN109407936B (zh) * 2018-09-21 2021-07-16 Oppo(重庆)智能科技有限公司 截图方法及相关装置
CN109683777B (zh) * 2018-12-19 2020-11-17 维沃移动通信有限公司 一种图像处理方法及终端设备
CN110109593B (zh) * 2019-04-29 2021-04-02 维沃移动通信有限公司 一种截屏方法及终端设备
CN113407089A (zh) * 2019-08-26 2021-09-17 华为技术有限公司 一种语音控制的分屏显示方法及电子设备
CN110658971B (zh) * 2019-08-26 2021-04-23 维沃移动通信有限公司 一种截屏方法及终端设备
CN110597612B (zh) * 2019-09-26 2023-03-24 三星电子(中国)研发中心 智能设备任务管理的方法和装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120131479A1 (en) * 2004-06-24 2012-05-24 Apple Inc. Resolution Independent User Interface Design
CN107066171A (zh) * 2011-03-21 2017-08-18 广州市动景计算机科技有限公司 触屏终端的多窗口切换方法和系统
CN108259973A (zh) * 2017-12-20 2018-07-06 青岛海信电器股份有限公司 智能电视及电视画面截图的图形用户界面的显示方法
CN109710127A (zh) * 2018-12-19 2019-05-03 维沃移动通信有限公司 一种截屏方法及移动终端
CN111338519A (zh) * 2020-02-04 2020-06-26 华为技术有限公司 一种显示方法及电子设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4086740A4 *

Also Published As

Publication number Publication date
CN111338519A (zh) 2020-06-26
EP4086740A4 (en) 2023-07-12
EP4086740A1 (en) 2022-11-09
US20220374118A1 (en) 2022-11-24
CN111338519B (zh) 2022-05-06

Similar Documents

Publication Publication Date Title
WO2021155770A1 (zh) 一种显示方法及电子设备
WO2021057868A1 (zh) 一种界面切换方法及电子设备
WO2021036628A1 (zh) 一种具有折叠屏的设备的触控方法与折叠屏设备
US20200257411A1 (en) Method for providing user interface related to note and electronic device for the same
WO2022089208A1 (zh) 一种文件拖拽方法及电子设备
WO2021110133A1 (zh) 一种控件的操作方法及电子设备
WO2022222752A1 (zh) 一种显示方法及相关装置
WO2022135186A1 (zh) 设备控制方法和终端设备
WO2022179249A1 (zh) 功能页面显示方法及电子设备
WO2021218365A1 (zh) 标注方法及电子设备
WO2023221946A1 (zh) 一种信息的中转方法及电子设备
WO2023029985A1 (zh) 一种桌面中停靠栏的显示方法及电子设备
WO2021197242A1 (zh) 复制粘贴的方法、电子设备及系统
WO2023078088A1 (zh) 一种显示方法及电子设备
WO2023226994A1 (zh) 一种内容摘录方法及设备
WO2023160208A1 (zh) 图像删除操作的通知方法、设备和存储介质
WO2024027504A1 (zh) 一种应用显示方法及电子设备
CN114625303B (zh) 窗口显示方法、终端设备及计算机可读存储介质
WO2023051354A1 (zh) 一种分屏显示方法及电子设备
WO2023045774A1 (zh) 显示方法及电子设备
WO2023029993A1 (zh) 一种搜索方法和电子设备
WO2024007966A1 (zh) 一种多窗口显示方法及设备
WO2023071590A1 (zh) 输入控制方法及电子设备
WO2023226975A1 (zh) 一种显示方法与电子设备
WO2024037346A1 (zh) 页面管理方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21750469

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021750469

Country of ref document: EP

Effective date: 20220805

NENP Non-entry into the national phase

Ref country code: DE