CN117632323A - Display method and electronic equipment
- Publication number: CN117632323A
- Application number: CN202210990769.7A
- Authority: CN (China)
- Prior art keywords: interface, window, application, electronic device, display
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Abstract
The application provides a display method and an electronic device, applied to the field of terminal technology. According to the method, a user can trigger a multi-task interface from the current task interface, after which the multi-task interface and the current task interface are displayed simultaneously. This solves the prior-art problem of the current task being interrupted when the multi-task interface is triggered and displayed, and can improve the user experience.
Description
Technical Field
The application relates to the field of terminal technology, and in particular to a display method and an electronic device.
Background
Currently, the multi-task interface of an electronic device includes pages corresponding to a plurality of applications (APPs) running on the electronic device; a user can quickly switch between recently used applications on the multi-task interface and enter the application corresponding to a selected page. However, if the user triggers the multi-task interface, the task on the current interface is interrupted. For example, if the user is watching a video and triggers the multi-task interface, the video is paused, so the user experience is poor.
Disclosure of Invention
The application provides a display method and an electronic device, which avoid interrupting the current task when the multi-task interface is triggered, thereby improving the user experience.
In a first aspect, the present application provides a display method applicable to an electronic device. Specifically, the method comprises the following steps: the electronic device displays a first interface; then, when a first operation on the first interface for triggering display of the multi-task window is detected, the electronic device displays a second interface in response to the first operation, where the second interface may include a first portion of the first interface and the multi-task window. The multi-task window may replace the portion of the first interface other than the first portion, or the multi-task window may be displayed on an upper layer of the first interface, covering (obscuring) the portion of the first interface other than the first portion.
For example, the first operation may be an operation in which two fingers slide upward from the lower edge of the display screen by a set distance. A two-finger sliding operation avoids conflicting with the existing operation for invoking the multi-task window; naturally, a three-finger sliding operation or a single-finger sliding operation may also be used, and this is not limited here.
Through the above technical solution, the multi-task window can be triggered on the currently displayed interface; the multi-task window then replaces or obscures part of the content of the first interface, and together with the remainder of the first interface forms the second interface. In this way, the currently displayed interface is not interrupted when the multi-task window is triggered, and the user experience can be improved.
In one possible implementation, the first interface is an application interface or a desktop.
Case 1: the first interface is an application interface
In one possible implementation, when a video being played is displayed on the first interface, the video is not paused after the second interface is displayed; that is, the video continues to play.
Through the above technical solution, the user can trigger the multi-task window on the current task interface, such as a video playing interface, and the video continues to play after the second interface is displayed; the current task is not interrupted, and the user experience can be improved.
In one possible implementation, when the electronic device detects an operation of the user on the video, the electronic device may pause playing the video in response to the operation of the user on the video.
Through the above technical solution, after the multi-task window is triggered, the current task is not interrupted, and the user can continue to operate on the current task interface; for example, a click operation for pausing video playback can be performed on the video playing interface.
Case 2: the first interface is a desktop (Main interface)
In one possible implementation, when the first interface is a desktop, after displaying the second interface the electronic device may further, in response to an operation of the user on any application icon on the desktop, open the application corresponding to that icon.
Through the above technical solution, after the user triggers the multi-task window on the desktop, the user can continue to operate on the desktop; for example, the user can click an application icon to open the corresponding application.
In one possible implementation, the multi-task window includes windows of a plurality of applications currently running on the electronic device, or includes different windows of the application displayed in the first interface. That is, the multi-task window may display windows of different applications, or different windows of a single application.
In one possible implementation, when the multi-task window includes different windows of the application displayed in the first interface, the interfaces corresponding to those windows may be interfaces screened by the electronic device according to preset conditions.
In one possible implementation, when the electronic device detects a second operation on one window in the multi-task window, such as a first window, it may, in response to the second operation, place the first window in an editing state.
The second operation may be, for example, a long-press operation on the first window in the multi-task window.
Through the above technical solution, the user can perform a long-press operation on a window in the multi-task window to place it in an editing state, after which the user can drag or delete the window.
In one possible implementation, when the electronic device detects a third operation on one window in the multi-task window, such as the first window, it may, in response to the third operation, display the interface corresponding to the first window and the first interface in a split-screen manner.
The third operation may be, for example, a drag operation on the first window.
Through the above technical solution, the user can drag a window in the multi-task window, and the electronic device, in response to the drag operation, displays the interface of that window and the first interface in a split-screen manner, which improves split-screen efficiency.
In one possible implementation, when the electronic device detects a fourth operation on the first window, such as two fingers sliding onto and resting on the first window, the first window may be displayed enlarged in response to the fourth operation. Then, when the electronic device detects a fifth operation on the first window, such as the two fingers being released, the display may be switched from the first interface to the interface corresponding to the first window in response to the fifth operation.
Through the above technical solution, switching between the current task interface and the interface corresponding to a window in the multi-task window can further be achieved without interrupting the current task, which can improve the user experience.
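For illustration, the following is a minimal Android-style sketch of how such a fourth and fifth operation might be handled on a task card in the multi-task window. It is a sketch under assumptions, not the claimed implementation: the TaskCardTouchListener name is hypothetical, each card view is assumed to carry its task id as a tag, and ActivityManager.moveTaskToFront requires the REORDER_TASKS permission.

```java
// Hypothetical sketch: enlarge a task card while two fingers rest on it,
// and bring its task to the foreground when the fingers are released.
import android.app.ActivityManager;
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

public class TaskCardTouchListener implements View.OnTouchListener {
    private final ActivityManager activityManager;
    private boolean twoFingersResting;

    public TaskCardTouchListener(Context context) {
        activityManager =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
    }

    @Override
    public boolean onTouch(View card, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                return true; // keep receiving the rest of the gesture
            case MotionEvent.ACTION_POINTER_DOWN:
                if (event.getPointerCount() == 2) {
                    // Fourth operation: two fingers rest on the card -> enlarge it.
                    twoFingersResting = true;
                    card.animate().scaleX(1.2f).scaleY(1.2f).setDuration(150).start();
                }
                return true;
            case MotionEvent.ACTION_UP:
                if (twoFingersResting) {
                    // Fifth operation: fingers released -> switch to this task.
                    twoFingersResting = false;
                    card.animate().scaleX(1f).scaleY(1f).setDuration(100).start();
                    int taskId = (Integer) card.getTag(); // task id assumed stored on the card
                    activityManager.moveTaskToFront(taskId, 0);
                }
                return true;
            default:
                return false;
        }
    }
}
```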
In one possible implementation, when the electronic device detects a first operation on the first interface for triggering display of the multi-task window, it may display a second interface in response to the first operation, where the second interface may include the first interface and the multi-task window. That is, the second interface may consist of the entire content of the first interface plus the multi-task window, which is equivalent to shrinking the first interface as a whole so that it occupies less height on the display screen; the multi-task window can then be displayed in the freed space.
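As a rough illustration of this shrink variant only (the pixel values are assumptions), the uniform scale applied to the first interface follows directly from the screen height and the height reserved for the multi-task window:

```java
// Illustrative only: how much the first interface shrinks so that a
// multi-task window of height windowHeightPx fits below it on the screen.
public final class ShrinkLayout {
    private ShrinkLayout() {}

    /** Returns the uniform scale factor applied to the first interface. */
    public static float scaleFor(int screenHeightPx, int windowHeightPx) {
        // The first interface keeps its aspect ratio and gives up a bottom
        // strip of the screen to the multi-task window.
        return (screenHeightPx - windowHeightPx) / (float) screenHeightPx;
    }

    public static void main(String[] args) {
        // Assumed example: 2560 px tall screen, 640 px tall multi-task window.
        System.out.println(scaleFor(2560, 640)); // prints 0.75
    }
}
```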
In a second aspect, the present application provides an electronic device comprising a display screen; one or more processors; one or more memories; one or more sensors; a plurality of applications; and one or more computer programs; wherein the one or more computer programs are stored in the one or more memories, the one or more computer programs comprising instructions which, when executed by the one or more processors, cause the electronic device to perform the method of any of the above-described first aspects and any of the possible designs of the first aspect.
In a third aspect, the present application also provides an electronic device comprising modules/units performing the method of the first aspect or any one of the possible designs of the first aspect; these modules/units may be implemented by hardware, or may be implemented by hardware executing corresponding software.
In a fourth aspect, the present application also provides a computer readable storage medium having instructions stored therein which, when run on an electronic device, cause the electronic device to perform the method of the first aspect and any one of the possible designs of the first aspect thereof.
In a fifth aspect, the present application also provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method of the first aspect of the embodiments of the present application and any one of the possible designs of the first aspect thereof.
For the technical effects achievable by each of the second to fifth aspects, refer to the technical effects achievable by the first aspect and each of its possible designs; details are not repeated here.
Drawings
FIG. 1 is a schematic diagram of a user interface;
FIG. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a software architecture according to an embodiment of the present application;
FIG. 4 is a flowchart of a display method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a user interface according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a user interface according to an embodiment of the present application;
FIG. 7A is a schematic diagram of a user interface according to an embodiment of the present application;
FIG. 7B is a schematic diagram of a user interface according to an embodiment of the present application;
FIG. 8 is a flowchart of a display method according to an embodiment of the present application;
FIG. 9 is a diagram of a user interface according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a user interface according to an embodiment of the present application;
FIG. 11 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described in detail below with reference to the drawings in the following embodiments of the present application.
Taking a mobile phone as an example: currently, a user can trigger the phone to enter a multi-task interface by sliding up from the bottom edge of the screen and pausing, after which the user can switch applications on the multi-task interface. The user may also close at least one running application on the multi-task interface. For example, as shown in FIG. 1, assume the interface currently displayed by the phone is interface 10, a video playing interface of a video APP. If the user operates according to the gesture direction shown in the figure, the phone may enter a multi-task interface, such as interface 11, in response to the gesture operation. Interface 11 may include the pages (cards) of the APPs currently running on the phone; the user can slide left and right to browse them and then tap a page to switch to the corresponding application.
As can be seen from the above process and figure: when the user triggers entry into the multi-task interface from the video playing interface, the video pauses. That is, jumping from the current task to the multi-task interface interrupts the current task, resulting in a poor user experience.
In view of this, an embodiment of the present application provides a display method in which a user can trigger the multi-task interface on the current task interface, after which the multi-task interface and the current task interface are displayed simultaneously. This avoids interrupting the current task when the multi-task interface is displayed, thereby improving the user experience.
It should be understood that an application program (application for short) in the embodiments of the present application is a software program capable of implementing one or more specific functions. Typically, multiple applications may be installed on an electronic device, such as a camera application, a text messaging application, a mailbox application, a video application, or a music application. An application mentioned below may be one installed on the electronic device at the factory, or one downloaded from a network or acquired from another electronic device by the user during use.
It should be noted that the display method provided in the embodiments of the present application is applicable to any electronic device having a display screen, such as a mobile phone, a tablet computer, a wearable device (e.g., a watch, a band, a smart helmet, or smart glasses), a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the embodiments of the present application do not limit this. The electronic device may also be a foldable electronic device, such as a foldable mobile phone or a foldable tablet computer, which is likewise not limited in this application. Exemplary embodiments of the electronic device include, but are not limited to, devices running HarmonyOS® or other operating systems.
The structure of the electronic device is described below using a tablet computer as an example.
As shown in FIG. 2, the tablet computer 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors. The controller may be the nerve center and command center of the tablet computer 100; it can generate operation control signals according to instruction operation codes and timing signals, and control instruction fetching and execution. A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically; if the processor 110 needs the instructions or data again, it can call them directly from this memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves system efficiency.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the tablet pc 100, and may also be used to transfer data between the tablet pc 100 and peripheral devices. The charge management module 140 is configured to receive a charge input from a charger. The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the tablet computer 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in tablet 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied on the tablet computer 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. applied on the tablet computer 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of tablet computer 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that tablet computer 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include a global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), fifth generation (the fifth generation, 5G) mobile communication systems, future communication systems such as sixth generation (6th generation,6G) systems, etc., BT, GNSS, WLAN, NFC, FM and/or IR techniques, etc. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The display 194 is used to display the display interface of an application, and the like. The display 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the tablet computer 100 may include 1 or N display screens 194, where N is a positive integer greater than 1. In the embodiments of the present application, the display 194 may be used to display the main interface, application interfaces, the multi-task window, and the like.
The camera 193 is used to capture still images or video. The camera 193 may include a front camera and a rear camera.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes the various functional applications and data processing of the tablet computer 100 by executing the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system and the software code of at least one application program (e.g., an iQIYI application, a WeChat application, etc.). The data storage area may store data (e.g., images, videos, etc.) generated during use of the tablet computer 100. In addition, the internal memory 121 may include high-speed random access memory, and may further include nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the tablet 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as pictures and videos are stored in an external memory card.
Tablet 100 may implement audio functionality through audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone interface 170D, and an application processor, among others. Such as music playing, recording, etc.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194.
The gyro sensor 180B may be used to determine the motion posture of the tablet computer 100. In some embodiments, the angular velocity of the tablet computer 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the tablet computer 100, calculates the distance the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the tablet computer 100 through reverse motion, thereby realizing anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the tablet computer 100 calculates altitude from the barometric pressure value measured by the air pressure sensor 180C, to assist positioning and navigation. The magnetic sensor 180D includes a Hall sensor. The tablet computer 100 can detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the tablet computer 100 is a flip device, it may detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flip-open according to the detected open or closed state of the case or flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the tablet computer 100 in various directions (typically three axes), and may detect the magnitude and direction of gravity when the tablet computer 100 is stationary. It may also be used to recognize the posture of the electronic device, and is applied in landscape/portrait switching, pedometers, and similar applications.
A distance sensor 180F is used to measure distance. The tablet computer 100 may measure distance by infrared or laser. In some embodiments, in a shooting scene, the tablet computer 100 may use the distance sensor 180F to measure distance to achieve quick focusing. The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The tablet computer 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the tablet computer 100; when insufficient reflected light is detected, the tablet computer 100 may determine that there is no object nearby. The tablet computer 100 can use the proximity light sensor 180G to detect that the user is holding the tablet close to the ear for a call, and then automatically turn off the screen to save power. The proximity light sensor 180G may also be used in leather-case mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The tablet 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect if tablet 100 is in a pocket to prevent false touches. The fingerprint sensor 180H is used to collect a fingerprint. The tablet computer 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the tablet computer 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the tablet computer 100 reduces the performance of a processor located near the temperature sensor 180J, in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the tablet computer 100 heats the battery 142 to prevent an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below a further threshold, the tablet computer 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the tablet pc 100 at a different location than the display 194. For example, the touch sensor 180K may detect a touch operation of the user on the display screen, such as a sliding operation of the user on the display screen (e.g., a sliding operation from bottom to top from a bottom edge of the display screen), and then the tablet 100 may display the multi-tasking interface in response to the sliding operation. For another example, the touch sensor 180K may detect a long press operation by the user on the multitasking window, and then the tablet computer 100 may cause a plurality of windows on the multitasking window to enter an editing state in response to the long press operation.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone of the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulse signal.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The tablet computer 100 may receive key inputs and generate key signal inputs related to user settings and function control of the tablet computer 100. The motor 191 may generate a vibration prompt. The motor 191 may be used for incoming-call vibration alerts as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The indicator 192 may be an indicator light and may be used to indicate the charging state or a change in battery level, or to indicate a message, a missed call, a notification, etc. The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into or removed from the SIM card interface 195 to bring it into contact with or separate it from the tablet computer 100.
It will be appreciated that the components shown in FIG. 2 do not constitute a specific limitation; the tablet computer 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The following embodiments are described taking the tablet computer 100 shown in FIG. 2 as an example.
The software system of the tablet computer 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiments of the present application take an Android system with a layered architecture as an example to illustrate the software structure of the tablet computer 100. It should be understood that the system in the embodiments of the present application may also be a HarmonyOS system, which is not limited in this application.
The software architecture of the electronic device is described below in connection with different scenarios. FIG. 3 is a software block diagram of the tablet computer 100 according to an embodiment of the present application. The layered architecture divides the software into several layers, each with a clear role and division of labor; the layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, from top to bottom: an application layer, an application framework layer, Android Runtime (ART) and native C/C++ libraries, a hardware abstraction layer (HAL), and a kernel layer.
The application layer may include a series of application packages. As shown in fig. 3, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a view system, a resource manager, a notification manager, an activity manager, an input manager, and so forth.
The window manager provides, among other things, window management services (window manager service, WMS) that may be used for window management, window animation management, surface management, and as a transfer station to an input system.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction; for example, the notification manager is used to announce that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system status bar at the top of the screen, such as notifications of applications running in the background, or notifications in the form of a dialog window on the screen. For example, text may be prompted in the status bar, a prompt tone may sound, the electronic device may vibrate, or an indicator light may blink.
The activity manager may provide activity management services (Activity Manager Service, AMS) that may be used for system component (e.g., activity, service, content provider, broadcast receiver) start-up, handoff, scheduling, and application process management and scheduling tasks.
The input manager may provide input management services (Input Manager Service, IMS), which may be used to manage inputs to the system, such as touch screen inputs, key inputs, sensor inputs, and the like. The IMS retrieves events from the input device node and distributes the events to the appropriate windows through interactions with the WMS.
The Android runtime includes a core library and the Android runtime environment (ART). The Android runtime is responsible for converting source code into machine code, mainly employing ahead-of-time (AOT) compilation and just-in-time (JIT) compilation.
The core library mainly provides basic Java class library functions, such as basic data structures, mathematics, IO, tools, databases, and networking. The core library provides the API used to develop Android applications.
The native C/c++ library may include a plurality of functional modules. For example: surface manager (surface manager), media Framework (Media Framework), libc, openGL ES, SQLite, webkit, etc.
The surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications. The media framework supports playback and recording of many common audio and video formats, as well as still image files; the media library may support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. OpenGL ES provides drawing and manipulation of 2D and 3D graphics in applications. SQLite provides a lightweight relational database for the applications of the electronic device 100.
The hardware abstraction layer runs in a user space (user space), encapsulates the kernel layer driver, and provides a call interface to the upper layer.
The kernel layer is the layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The workflow of the software and hardware of the electronic device 100 is illustrated below with reference to FIG. 2 and FIG. 3.
In the embodiments of the present application, when the touch sensor 180K receives a sliding operation of the user on the display screen, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer may process the sliding operation into a raw input event and store the event; the event may include the start position coordinates and the end position coordinates of the sliding operation. The application framework layer acquires the raw input event from the kernel layer and identifies it; if the event is identified as a sliding operation that triggers the multi-task window, the electronic device, in response to the sliding operation, displays the multi-task window on the current interface simultaneously.
The following embodiments are described taking the architecture applied to the tablet computer 100 shown in FIG. 2 as an example.
In addition, "at least one" in the following embodiments means one or more, where "a plurality of" means two or more. It should also be understood that in the description of this application, words such as "first" and "second" are used merely to distinguish between descriptions.
Referring to FIG. 4, which shows a flowchart of a display method according to an embodiment of the present application, the method may include the following steps:
s401, the tablet computer 100 displays a first interface.
The first interface may be a main interface (also referred to as a desktop), or may be an application interface of an application, for example, an interface of a short message application or an interface of an instant messaging application, which is not limited in this application.
S402: the tablet 100 detects a first operation on the first interface.
In some embodiments, the first operation may be a sliding operation on the first interface, such as sliding from the lower edge of the display screen toward the upper side of the display screen. The sliding operation may be an operation that starts exactly at the lower edge of the display screen and slides upward a certain distance (for example, to position X of the display screen). Alternatively, the sliding operation may start from any position on the display screen, for example from a position near the lower edge but at some distance from it. Of course, the first operation may also be a right-to-left, left-to-right, or top-to-bottom sliding operation, which is not limited in this application.
In other embodiments, the first operation may be a sliding operation combined with pressure: when the electronic device detects a bottom-to-top sliding operation and a pressure sensor provided on the display screen detects that the pressure value generated by the sliding operation is greater than a threshold, the multi-task window is displayed in response to the sliding operation. In still other embodiments, the first operation may be a sliding operation (right-to-left, left-to-right, top-to-bottom, or bottom-to-top) that stays at its end position for at least a preset time period, or the like.
In some embodiments, to avoid conflict with the existing single-finger sliding operation, the first operation may be a two-finger sliding operation, a three-finger sliding operation, or the like. Of course, if there is no conflict, the first operation may also be a single-finger sliding operation.
In some embodiments, to avoid an overly long sliding path that is inconvenient for the user or affects the currently displayed first interface, the sliding path of the first operation may be short; for example, the user need only slide a short distance up from the lower edge of the display screen to bring up the multi-task window. The distance may be, for example, at most one third of the screen height, or roughly equal to the height of the multi-task window to be displayed.
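For illustration, a minimal sketch of recognizing such a first operation is given below; it assumes a full-screen view feeding it touch events, and the edge-zone and trigger-distance values, like the showMultiTaskWindow callback, are assumptions rather than values fixed by this application.

```java
// Hedged sketch: recognize two fingers that start near the bottom edge of the
// screen and slide up by a short distance, then bring up the multi-task window.
import android.view.MotionEvent;

public class MultiTaskGestureDetector {
    private final int screenHeightPx;
    private final int edgeZonePx;        // how close to the bottom edge the gesture must start
    private final int triggerDistancePx; // assumed: no more than one third of the screen height
    private final Runnable showMultiTaskWindow;
    private float startY;
    private boolean tracking;

    public MultiTaskGestureDetector(int screenHeightPx, Runnable showMultiTaskWindow) {
        this.screenHeightPx = screenHeightPx;
        this.edgeZonePx = screenHeightPx / 20;       // assumed edge zone
        this.triggerDistancePx = screenHeightPx / 4; // assumed trigger distance
        this.showMultiTaskWindow = showMultiTaskWindow;
    }

    /** Feed every touch event of a full-screen view here. */
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_POINTER_DOWN:
                // Track only two-finger gestures that start near the bottom edge.
                tracking = event.getPointerCount() == 2
                        && event.getY(0) > screenHeightPx - edgeZonePx;
                startY = event.getY(0);
                return tracking;
            case MotionEvent.ACTION_MOVE:
                if (tracking && startY - event.getY(0) >= triggerDistancePx) {
                    tracking = false;
                    showMultiTaskWindow.run(); // S403: display the second interface
                    return true;
                }
                return false;
            default:
                tracking = false;
                return false;
        }
    }
}
```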
S403: the tablet 100 displays a second interface in response to the first operation.
The second interface may include the content of the first interface and, on it, the multi-task window. The multi-task window displays some or all of the applications that have been opened on the tablet computer 100, or internal interfaces of a particular application. In this embodiment, after the tablet computer 100 detects the first operation of the user on the first interface, it may respond to the first operation and display a second interface that includes the multi-task window. For example, if the first operation is a sliding operation upward from the lower edge of the display screen, the tablet computer 100 may display the multi-task window at the lower edge of the display screen in response to the bottom-up sliding operation.
In some embodiments, the content of the original first interface may move up or shrink upward accordingly, giving the visual impression of being squeezed by the multi-task window. In other embodiments, the multi-task window may be displayed superimposed on the first interface, obscuring part of its content. In still other embodiments, the multi-task window may replace an original part of the first interface (e.g., the shortcut window area) and be displayed as part of the first interface.
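The following sketch illustrates two of these presentation strategies under stated assumptions: the multi-task window is an ordinary View, its height is supplied by the caller, and MultiTaskPresenter is an illustrative name, not part of this application.

```java
// Hypothetical sketch of the "overlay" and "squeeze" presentation strategies.
import android.app.Activity;
import android.view.Gravity;
import android.view.View;
import android.widget.FrameLayout;

public final class MultiTaskPresenter {
    private MultiTaskPresenter() {}

    /** Overlay strategy: the window covers the bottom part of the first interface. */
    public static void overlay(Activity activity, View multiTaskWindow, int heightPx) {
        FrameLayout root = (FrameLayout) activity.findViewById(android.R.id.content);
        root.addView(multiTaskWindow, new FrameLayout.LayoutParams(
                FrameLayout.LayoutParams.MATCH_PARENT, heightPx, Gravity.BOTTOM));
        // The first interface stays where it is, partly obscured.
    }

    /** Squeeze strategy: the first interface shrinks upward to make room below. */
    public static void squeeze(Activity activity, View multiTaskWindow, int heightPx) {
        FrameLayout root = (FrameLayout) activity.findViewById(android.R.id.content);
        View first = root.getChildAt(0); // assumes the first interface is the first child
        float scale = (root.getHeight() - heightPx) / (float) root.getHeight();
        first.setPivotX(root.getWidth() / 2f);
        first.setPivotY(0f); // shrink toward the top edge
        first.animate().scaleX(scale).scaleY(scale).start();
        overlay(activity, multiTaskWindow, heightPx);
    }
}
```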
The above process is described in detail below, taking the first interface being the main interface and being an application interface as examples.
Example 1: as shown in FIG. 5, assume the tablet computer 100 displays the main interface 500 shown in FIG. 5(a); the main interface 500 may include a plurality of application icons 501 and a shortcut window area 502 (also referred to as a DOCK area). The DOCK area is used to hold commonly used application icons; illustratively, the DOCK area is typically not switched when switching desktops. Next, the user may perform a gesture operation on the main interface 500, such as a two-finger sliding operation from bottom to top along the lower edge of the display screen, and the tablet computer 100 may, in response to the sliding operation, display a multi-task window on the main interface, such as in interface 510 shown in FIG. 5(b). Interface 510 may include the plurality of application icons 501 and a multi-task window 511. The multi-task window 511 may replace the shortcut window area 502, may be displayed floating on an upper layer of the shortcut window area 502, or may be displayed in other forms, which is not limited in this application.
Example 2: as shown in FIG. 6, assume the user is using a video APP; for example, the tablet computer 100 displays interface 600 shown in FIG. 6(a), which is an application interface of the video APP. The tablet computer 100 may detect a gesture operation performed by the user on interface 600, for example a two-finger sliding operation from bottom to top along the lower edge of the display screen, and may then, in response to the sliding operation, display a multi-task window on the application interface of the video APP; for example, the tablet computer 100 may display interface 610 shown in FIG. 6(b). Interface 610 may include the application interface 600 of the video APP and a multi-task window 611. The multi-task window 611 may be displayed at the lower part of the application interface 600, obscuring part of the content of the application interface 600. In some embodiments, the video is playing in FIG. 6(a), and the video is not paused when the interface shown in FIG. 6(b) is displayed. That is, after the user triggers the multi-task window on interface 600, the current video is not interrupted: the multi-task window is displayed while the current video continues to play.
It should be understood that the multi-task window 511 in the diagram shown in FIG. 5 and the multi-task window 611 in the diagram shown in FIG. 6 may include application windows corresponding to the plurality of APPs currently running on the tablet computer 100. In some embodiments, an application window is not an application icon, but a thumbnail of the application's running interface.
In other embodiments, the application interface 600 shown in FIG. 6 may be a full-screen interface of an application such as a video or game; that is, the video or game occupies the entire screen and the status bar is hidden.
Through the above embodiments, after the tablet computer 100 detects an operation that triggers the multi-task window, it can display the multi-task window on the currently displayed interface in response to that operation; that is, the current task interface and the multi-task interface are displayed simultaneously. This solves the prior-art problem of the current task being interrupted (blocked, suspended, etc.) by the triggering and display of the multi-task interface, and can improve the user experience.
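One way to see why the current task need not be interrupted: unlike the existing recents screen, which is a separate interface that pauses the foreground activity, the multi-task window can be added as a view inside the currently resumed activity. A minimal sketch, with illustrative names, follows.

```java
// Hedged sketch: adding the multi-task window to the playing activity's own
// view hierarchy keeps the activity resumed, so the video is not paused.
import android.app.Activity;
import android.os.Bundle;
import android.view.Gravity;
import android.view.View;
import android.widget.FrameLayout;
import android.widget.VideoView;

public class VideoActivity extends Activity {
    private VideoView videoView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        videoView = new VideoView(this);
        setContentView(videoView);
    }

    void showMultiTaskWindow(View multiTaskWindow, int heightPx) {
        FrameLayout root = (FrameLayout) findViewById(android.R.id.content);
        root.addView(multiTaskWindow, new FrameLayout.LayoutParams(
                FrameLayout.LayoutParams.MATCH_PARENT, heightPx, Gravity.BOTTOM));
        // No activity switch occurs here, so onPause() is never called and
        // videoView keeps playing underneath the multi-task window.
    }
}
```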
In some embodiments, when the first interface is an application interface, the multi-task window may include multiple windows of that application. That is, the multi-task window may display the multiple APPs currently running on the tablet computer 100, as well as different windows of the displayed application. The following describes, with specific examples, a multi-task window that includes multiple windows of one application.
Example 1: as shown in FIG. 7A, assume the tablet computer 100 displays an application interface of a shopping APP, such as interface 700 shown in FIG. 7A(a). When the tablet computer 100 detects a gesture operation of the user on interface 700, such as the sliding operation shown in the figure, it may respond to the sliding operation by displaying, for example, interface 710 shown in FIG. 7A(b). Interface 710 may include interface 700 and a multi-task window 711. The multi-task window 711 may include multiple interfaces, such as interface 1, interface 2, and interface 3, that were displayed while the user used the shopping APP after opening it.
In some embodiments, the multiple interfaces may be the interfaces displayed in sequence as the user used the shopping APP after opening it this time, or they may be interfaces screened by the tablet computer 100 according to certain set rules (for example, according to the dwell time of the user's gaze on an interface, or according to recommendations for the current scene), which is not limited in this application.
Example 2: as shown in FIG. 7B, assume the tablet computer 100 displays an application interface of a social APP, such as interface 740 shown in FIG. 7B(a). When the tablet computer 100 detects a gesture operation of the user on interface 740, such as the sliding operation shown in the figure, it may respond to the sliding operation by displaying, for example, interface 750 shown in FIG. 7B(b). Interface 750 includes interface 740 and a multi-task window 751. The multi-task window 751 may include chat windows in which the logged-in user of the social APP chats with friends (such as chat window 1 with Xiao Wang and chat window 2 with Xiao Li), a subscription-message window such as window 3, and a friends-circle window such as window 4.
Further, after the multi-task window is displayed on the first interface, the user may continue to operate on the multi-task window. Illustratively, after step S403, the method further includes the following steps:
s404: the tablet 100 detects a second operation on the first window on the second interface.
For convenience of description, in the embodiments of the present application, one window in the multi-task window may be named the "first window"; the first window may be the window on which the user performs the long-press and drag operations.
The second operation may be a gesture operation of the user on the multi-task window of the second interface; for example, the gesture operation may be a two-finger long-press operation. In some embodiments, the long-press operation may be an operation of resting on, or pressing for a preset time period on, a first window among the windows of the multi-task window. It should be understood that the multi-task window may include multiple windows, which may be application windows of different APPs or different windows of the same APP (see FIG. 7A or FIG. 7B in the foregoing examples).
In other embodiments, the second operation may be directed at a blank area of the multi-tasking window rather than at a specific window; for example, the second operation may be a long-press operation on the blank area of the multi-tasking window.
S405: the tablet 100 responds to the second operation such that the first window is in an editing state.
Example 1: taking fig. 5 as an example, the tablet computer 100 displays the second interface, i.e., the interface 510 shown in (b) of fig. 5, in response to the first operation. The user may then perform a gesture operation on the multi-tasking window 511, such as the two-finger long press shown in the figure on the window where the application interface of the video APP is located (i.e., the first window), and the tablet computer 100 may respond to the long-press operation so that this window enters an editing state. In some embodiments, the editing state means that the window may be dragged, moved, deleted, and so on; for example, the tablet computer 100 displays the interface 520 shown in (c) of fig. 5. In some embodiments, a window in the editing state may be displayed with an indication, such as the cross indication in (c) of fig. 5, or may be indicated in other ways, such as zooming in or out, shading, highlighting, a three-dimensional effect, and/or an audible prompt.
Example 2: taking fig. 6 as an example, the second operation may be a long-press operation performed by the user on the window where the application interface of the gallery APP is located in the interface 610, for example, the long-press operation in the schematic diagram shown in (b) of fig. 6. The tablet computer 100 may detect the long-press operation and respond to it so that the window where the application interface of the gallery APP is located enters the editing state, for example, by displaying the interface 620 shown in (c) of fig. 6.
Example 3: taking fig. 7A as an example, assuming that the user performs a long-press operation on the window corresponding to interface 1 of the shopping APP on the interface 710 shown in (b) of fig. 7A, the tablet computer 100 may detect the long-press operation and, in response, display the interface 720 shown in (c) of fig. 7A, where the window corresponding to interface 1 of the shopping APP is in the editing state.
In other embodiments, when the second operation is not directed at a specific window, the tablet computer 100 may, upon detecting the second operation, respond to it by placing all windows in the multi-tasking window in the editing state.
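Steps S404 and S405 can thus be summarized as: resolve the press target, then mark either the pressed window or, for a blank-area press, every window as editable. The sketch below is a minimal model under that reading; all names (TaskWindow, MultiTaskWindow, onLongPress) are hypothetical, not identifiers from this application.

```kotlin
// Minimal model of S404–S405: a long press puts one window (or, when the
// press lands on the blank area, all windows) into the editing state.
// All names here are hypothetical.
data class TaskWindow(val id: String, var editing: Boolean = false)

class MultiTaskWindow(private val windows: MutableList<TaskWindow>) {

    // Windows laid out side by side; a press beyond the last window
    // counts as the blank area and returns null.
    private fun windowAt(x: Float, windowWidth: Float): TaskWindow? =
        windows.getOrNull((x / windowWidth).toInt())

    fun onLongPress(x: Float, windowWidth: Float) {
        val target = windowAt(x, windowWidth)
        if (target != null) {
            target.editing = true                  // S405: first window enters editing state
        } else {
            windows.forEach { it.editing = true }  // blank-area press: all windows editable
        }
    }
}
```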
S406: the tablet 100 detects a third operation on the first window.
The third operation may be a drag operation on the first window.
S407: the tablet computer 100 displays the first interface and the first window in a split screen in response to the third operation.
In some embodiments, after the tablet computer 100 detects the third operation, it may display the first window of the multi-tasking window and the first interface in a split-screen manner in response. For example, taking fig. 5 as an example, after the tablet computer 100 displays the multi-tasking window 511, the user may drag the window where the application interface of the video APP is located onto the desktop in the drag direction shown in the interface 520 in (d) of fig. 5. The tablet computer 100 may detect the drag operation and respond to it by displaying the application interface of the video APP and the main interface in a split-screen manner; for example, the tablet computer 100 may display the interface 530 shown in (e) of fig. 5.
As another example, taking fig. 6 as an example, the user may drag the window where the application interface of the gallery APP is located onto the application interface of the Huawei Video APP in the drag direction shown in the interface 620 in (d) of fig. 6. The tablet computer 100 may detect the drag operation and respond to it by displaying the application interface of the gallery APP and the application interface of the Huawei Video APP in a split-screen manner; for example, the tablet computer 100 may display the interface 630 shown in (e) of fig. 6.
For another example, taking the schematic diagram shown in fig. 7A as an example, the user may continue to drag the window corresponding to interface 1 of the shopping APP from the interface 720 shown in (c) of fig. 7A, for example, in the drag direction shown in (d) of fig. 7A. The tablet computer 100 may then respond to the drag operation by displaying interface 1 of the shopping APP and the interface 700 in a split-screen manner; for example, the tablet computer 100 may display the interface 730 shown in (e) of fig. 7A.
In some embodiments, the user may also perform a fourth operation on the multi-tasking window, which the tablet computer 100 may detect and respond to so that all windows in the multi-tasking window are in the editing state. Illustratively, as shown in fig. 8, assuming that the tablet computer 100 displays the interface 800 shown in (a) of fig. 8, the interface 800 includes a multi-tasking window 801. When the tablet computer 100 detects a gesture operation of the user on the interface 800, such as the long-press operation at the blank position of the multi-tasking window 801 shown in the drawing, the tablet computer 100 may respond so that every window in the multi-tasking window 801 is in the editing state, as in the interface 810 shown in (b) of fig. 8. The user may then drag at least one window of the multi-tasking window in the interface 810, and the tablet computer 100 may display the dragged window and the main interface in a split screen in response to the user's drag operation.
It should be understood that in the embodiments of the present application the number of split-screen interfaces may be two, three, or more, which is not limited in this application. For example, after the first split in the foregoing embodiment, the multi-tasking window may be invoked again and the foregoing procedure executed to perform a second split, producing a three-way split; alternatively, with reference to the foregoing embodiment, two or more windows may be dragged out of the multi-tasking window to implement a split into three or more screens, where the dragging may be done in one pass or in several passes.
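Read this way, steps S406–S407 and the multi-way split reduce to a simple layout rule: each window dropped outside the multi-tasking window adds a pane next to what is already displayed. The following is a minimal sketch of that rule; Layout, FullScreen, SplitScreen and onDrop are assumed names, not identifiers from this application.

```kotlin
// Sketch of S406–S407 and the multi-way split: dropping a dragged window
// outside the multi-tasking window adds it as a split-screen pane.
sealed interface Layout
data class FullScreen(val interfaceId: String) : Layout
data class SplitScreen(val panes: List<String>) : Layout

fun onDrop(current: Layout, draggedWindowId: String, droppedOutsideStrip: Boolean): Layout =
    when {
        !droppedOutsideStrip -> current  // dropped back into the strip: no change
        current is FullScreen -> SplitScreen(listOf(current.interfaceId, draggedWindowId))
        current is SplitScreen -> SplitScreen(current.panes + draggedWindowId) // three or more panes
        else -> current
    }
```

Under this model, invoking the multi-tasking window again after a first split and dropping another window simply appends a third pane, matching the three-way split described above.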
Through this embodiment, a task in the multi-task window can be combined on one screen with the desktop or the current task, which improves split-screen efficiency and user experience.
In some embodiments, the user may select a task to switch to by sliding a finger left and right over the multi-tasking window. In some embodiments, when the user's finger slides to a window, such as a second window, the second window may be displayed enlarged; when the user's finger is released from the second window, the second window may be displayed in full screen, i.e., the current interface is switched to the interface corresponding to the second window.
In other embodiments, when the user's finger is released from the second window, the second window may instead be displayed in a split screen with the currently displayed application interface, with reference to the foregoing embodiments; the illustration is similar to the foregoing, the operation differing only slightly, and is not repeated here.
Example 1: as shown in fig. 9, it is assumed that the tablet computer 100 displays the interface 900 shown in (a) of fig. 9, and the interface 900 may include a multi-tasking window 901 and a plurality of application icons 902 on the main interface. When the user's finger slides left and right on the multi-tasking window 901, for example to the window corresponding to the settings application interface, that window may be displayed enlarged; see, for example, the interface 910 shown in (b) of fig. 9. Then, after the user's finger is released from the window corresponding to the settings application interface, the tablet computer 100 may detect the release operation and, in response, display the settings application interface in full screen; for example, the tablet computer 100 may display the interface 920 shown in (c) of fig. 9. Displaying the settings application interface in full screen means that it fills the entire screen; the status bar may or may not be displayed.
Example 2: as shown in fig. 10, it is assumed that the tablet computer 100 displays the interface 1000 shown in (a) of fig. 10, and the interface 1000 may include a gallery application interface 1001 and a multi-tasking window 1002. The multi-tasking window 1002 may include windows of different pictures that the user has opened in the gallery application. When the tablet computer 100 detects a gesture operation, such as a two-finger slide to the window 1003, it may display the window 1003 enlarged in response; for example, the tablet computer 100 may display the interface 1010 shown in (b) of fig. 10. Then, when the user's fingers are released from the window 1003, the tablet computer 100 may detect the release operation and display the window 1003 in full screen in response; for example, the tablet computer 100 may display the interface 1020 shown in (c) of fig. 10.
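The slide-to-preview, release-to-commit behavior in both examples can be modeled as two events on a small state holder. The sketch below is illustrative only; TaskSwitcher and its members are assumed names.

```kotlin
// Sketch of the task-switching gesture: sliding previews (enlarges) the
// window under the fingers; releasing commits the switch to that window.
// All names are hypothetical.
class TaskSwitcher(private val windowIds: List<String>, var currentInterface: String) {
    var previewed: String? = null   // window currently enlarged as a preview
        private set

    // Fingers slide to the window at `index` in the multi-tasking window.
    fun onSlide(index: Int) {
        previewed = windowIds.getOrNull(index)
    }

    // Fingers released: switch to the previewed window's interface
    // (in other embodiments this could open a split screen instead).
    fun onRelease() {
        previewed?.let { currentInterface = it }
        previewed = null
    }
}
```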
Through this embodiment, the current task can be switched with a task in the multi-task window. Further, for a single application, the efficiency of switching between its tasks can be improved.
All or part of the above embodiments provided in this application may be freely combined with one another. The combined technical solutions also fall within the protection scope of this application.
In the embodiments provided in the present application, the methods are described from the perspective of the electronic device as the execution subject. In order to implement the functions in the methods provided in the embodiments of the present application, the electronic device may include a hardware structure and/or a software module, and implement the functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether a given function is performed by a hardware structure, a software module, or a combination of the two depends on the specific application and design constraints of the technical solution.
As shown in fig. 11, further embodiments of the present application disclose an electronic device, which may be an electronic device having a display screen. Referring to fig. 11, the electronic device 1100 includes: a display 1101; one or more processors 1102; one or more memories 1103; one or more sensors 1104 (not shown); a plurality of applications 1105 (not shown); and one or more computer programs 1106 (not shown). These components may be connected via one or more communication buses 1107.
Wherein the display 1101 is configured to display a display interface of an application in the electronic device. The memory 1103 stores one or more computer programs that, when invoked and executed by the one or more processors 1102, cause the electronic device 1100 to perform the display methods of the above embodiments.
Illustratively, the instructions, when invoked and executed by the one or more processors 1102, cause the electronic device 1100 to perform the following steps: displaying a first interface on the display screen; detecting a first operation on the first interface, where the first operation is used to trigger display of a multi-task window; and, in response to the first operation, displaying a second interface, where the second interface includes a first part of the first interface and the multi-task window, and the multi-task window replaces the part of the first interface other than the first part, or covers the part of the first interface other than the first part.
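As a rough illustration of these recited steps, the following sketch maps an upward swipe to the second-interface composition. The bottom-edge zone (10% of screen height) and trigger distance (5%) are invented placeholders, since the application leaves the set distance unspecified, and all names are hypothetical.

```kotlin
// Illustrative sketch of the recited steps: an upward swipe from the bottom
// edge exceeding a set distance triggers the second interface, i.e. the first
// part of the first interface plus the multi-task window over the remainder.
// The 10% edge zone and 5% trigger distance are assumed placeholders.
data class SecondInterface(val firstInterfacePart: String, val multiTaskWindow: String)

fun onSwipeUp(screenHeight: Float, startY: Float, endY: Float, firstInterface: String): SecondInterface? {
    val startedAtBottomEdge = startY >= screenHeight * 0.9f     // y grows downward
    val exceededSetDistance = (startY - endY) >= screenHeight * 0.05f
    return if (startedAtBottomEdge && exceededSetDistance)
        SecondInterface(
            firstInterfacePart = "$firstInterface (first part)",
            multiTaskWindow = "multi-task window replacing or covering the remainder"
        )
    else null  // swipe too short: keep displaying the first interface
}
```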
In the embodiments of the present application, the processor 1102 may be a general-purpose processor, a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, any conventional processor, or the like. The steps of a method disclosed in connection with the embodiments of the present application may be executed directly by a hardware processor, or by a combination of hardware and software modules in the processor. A software module may be located in the memory 1103; the processor 1102 reads the program instructions in the memory 1103 and, in combination with its hardware, performs the steps of the method described above.
In the embodiments of the present application, the memory 1103 may be a non-volatile memory, such as a hard disk drive (HDD) or a solid-state drive (SSD), or a volatile memory such as random-access memory (RAM). The memory may also be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory in the embodiments of the present application may also be a circuit or any other device capable of implementing a storage function, for storing instructions and/or data.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the apparatus and units described above may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again.
Based on the above embodiments, the present application also provides a computer storage medium in which a computer program is stored, which when executed by a computer, causes the computer to execute the display method provided in the above embodiments.
Also provided in embodiments of the present application is a computer program product comprising instructions that, when run on a computer, cause the computer to perform the display method provided in the above embodiments.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by instructions. These instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Claims (15)
1. A display method, comprising:
the electronic equipment displays a first interface;
the electronic equipment detects a first operation on the first interface, wherein the first operation is used for triggering the display of a multi-task window;
the electronic equipment responds to the first operation, and displays a second interface, wherein the second interface comprises a first part of the first interface and the multitasking window, the multitasking window replaces the part of the first interface except the first part, or the multitasking window covers the part of the first interface except the first part.
2. The method of claim 1, wherein when a video being played is displayed on the first interface, the video is not paused after the second interface is displayed.
3. The method as recited in claim 2, further comprising: in response to an operation of the user on the video, pausing playing of the video.
4. The method of claim 1, wherein when the first interface is a desktop, after displaying the second interface, the method further comprises:
responding to an operation of the user on any application icon on the desktop, and opening the application corresponding to the application icon.
5. The method of any one of claims 1-4, wherein the method further comprises:
the electronic device detecting a second operation on a first window of the multi-tasking windows;
and the electronic equipment responds to the second operation and triggers the first window to be in an editing state.
6. The method of any one of claims 1-5, wherein the method further comprises:
the electronic device detecting a third operation on a first window of the multi-tasking windows;
the electronic device, in response to the third operation, displaying the interface corresponding to the first window and the first interface in a split screen.
7. The method of any one of claims 1-6, wherein the method further comprises:
the electronic device detecting a fourth operation on a first window of the multi-tasking windows;
the electronic equipment responds to the fourth operation and displays the first window in an enlarged mode;
the electronic device detecting a fifth operation on the first window;
and the electronic equipment responds to the fifth operation and displays a third interface, wherein the third interface is the interface corresponding to the first window.
8. The method of claim 1, wherein the first operation is an operation of sliding up from the lower edge of the display screen by a set distance.
9. The method of any of claims 1-8, wherein the multitasking window comprises windows of a plurality of applications currently running on the electronic device, or different windows of a first application, the first application being an application displayed in the first interface.
10. The method of claim 5, wherein the second operation is a long press operation on the first window.
11. The method of claim 6, wherein the third operation is a drag operation on the first window.
12. The method of claim 7, wherein the fourth operation is a two-finger sliding and resting operation on the first window, and the fifth operation is a two-finger releasing operation.
13. An electronic device, wherein the electronic device comprises a display screen; one or more processors; one or more memories; one or more sensors; a plurality of applications; and one or more computer programs;
wherein the one or more computer programs are stored in the one or more memories, the one or more computer programs comprising instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any of claims 1-12.
14. A computer readable storage medium having instructions stored therein, which when run on an electronic device, cause the electronic device to perform the method of any one of claims 1 to 12.
15. A computer program product comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1 to 12.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202210990769.7A | 2022-08-18 | 2022-08-18 | Display method and electronic equipment |
Publications (1)

| Publication Number | Publication Date |
| --- | --- |
| CN117632323A (en) | 2024-03-01 |

Family ID: 90036420

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202210990769.7A | Display method and electronic equipment | 2022-08-18 | 2022-08-18 |
Legal Events

| Code | Title |
| --- | --- |
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |