CN110661917A - Display method and electronic equipment


Info

Publication number
CN110661917A
Authority
CN
China
Prior art keywords
display area
interface
display
screen
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910726163.0A
Other languages
Chinese (zh)
Other versions
CN110661917B (en)
Inventor
关驰
徐吉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201910726163.0A (patent CN110661917B)
Priority to CN202210347479.0A (patent CN114840127A)
Publication of CN110661917A
Priority to PCT/CN2020/103877 (WO2021023021A1)
Application granted
Publication of CN110661917B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the application provide a display method and an electronic device. The method is applied to the electronic device and is used to solve the problem that the windows of an existing in-application split-screen interface cannot be resized. The method includes: the electronic device receives a split-screen operation of a user and, in response to the split-screen operation, controls a first display area of a display screen of the electronic device to display a first interface and controls a second display area of the display screen to display a second interface, where the second display area does not overlap, or partially overlaps, the first display area, and the first interface and the second interface are different interfaces of a first application; the electronic device then receives a second operation of the user, where the second operation is used to trigger adjustment of the size of at least one of the first display area and the second display area; and in response to the second operation, the electronic device controls the first display area to be adjusted to a third display area and controls the second display area to be adjusted to a fourth display area, where the third display area displays the first interface and the fourth display area displays the second interface.

Description

Display method and electronic equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a display method and an electronic device.
Background
With the continuous development of electronic devices, more and more electronic devices with display screens, such as mobile phones, are widely used in people's daily life and work. Moreover, as screen technology advances, the display screens of electronic devices keep getting larger, so that richer information can be presented to the user and a better use experience can be provided.
At present, some wide-screen intelligent devices support a multi-window mode, in which different interfaces of an application can be displayed in different windows of the same display screen. However, the windows of these interfaces cannot be resized in the multi-window mode, which degrades the user experience.
Disclosure of Invention
The application provides a display method and an electronic device, to solve the problem that the windows of an existing in-application split-screen interface cannot be resized.
In a first aspect, an embodiment of the present application provides a display method applied to an electronic device. The method includes: the electronic device receives a split-screen operation of a user and, in response to the split-screen operation, controls a first display area of a display screen of the electronic device to display a first interface and controls a second display area of the display screen to display a second interface, where the second display area does not overlap, or partially overlaps, the first display area, and the first interface and the second interface are different interfaces of a first application; the electronic device then receives a second operation of the user, where the second operation is used to trigger adjustment of the size of at least one of the first display area and the second display area; and in response to the second operation, the electronic device controls the first display area to be adjusted to a third display area and controls the second display area to be adjusted to a fourth display area, where the third display area displays the first interface and the fourth display area displays the second interface.
In this embodiment of the application, the method enables the windows that display different interfaces of the same application to be resized, which improves the user experience.
In one possible design, the right boundary of the first interface and the left boundary of the second interface may be provided with a window adjustment control; the second operation is a dragging operation acting on the window adjusting control, and the dragging operation comprises a dragging direction, an operation starting point coordinate and an operation end point coordinate. Before the electronic device controls the first display area to be adjusted into the third display area and controls the second display area to be adjusted into the fourth display area, the method further includes: adjusting the ratio of the width of the first display area to the width of the second display area according to the dragging direction, the operation starting point coordinate and the operation end point coordinate; and determining a third display area and a fourth display area according to the ratio.
In the embodiment of the application, the method can enrich the display form of the display screen, improve the utilization rate of the display screen and improve the visual experience of a user.
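As an illustration of the design above, the following is a minimal sketch of how a new width ratio could be derived from the drag direction and the operation start-point and end-point coordinates and then turned into the third and fourth display areas. It is hypothetical Java: the class, the method, and the layout assumption (two side-by-side areas) are the editor's, not the patented implementation.

    // Hypothetical sketch of the width adjustment described above; illustrative only.
    import android.graphics.Rect;

    final class SplitAreaSizer {
        /**
         * first/second: current first and second display areas (assumed side by side).
         * startX/endX: x coordinates of the drag operation's start point and end point.
         * Returns {third, fourth}: the adjusted display areas.
         */
        static Rect[] resize(Rect first, Rect second, float startX, float endX) {
            float delta = endX - startX;               // > 0: drag right, < 0: drag left
            int total = first.width() + second.width();
            int newFirstWidth = Math.max(0, Math.min(total, Math.round(first.width() + delta)));
            Rect third = new Rect(first.left, first.top,
                    first.left + newFirstWidth, first.bottom);
            Rect fourth = new Rect(third.right, second.top,
                    third.right + (total - newFirstWidth), second.bottom);
            return new Rect[] {third, fourth};
        }
    }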
In one possible design, when the ratio between the width of the first display area and the width of the second display area is smaller than a first threshold, the second interface can be controlled to be displayed in a full screen mode; or when the ratio of the width of the first display area to the width of the second display area is greater than a second threshold, the first interface can be controlled to be displayed in a full screen mode; wherein the first threshold is much smaller than the second threshold.
In the embodiment of the application, the method can enable the display screen to be switched from the multi-window mode to the single-window mode, and the display mode is more flexible and changeable.
In one possible design, the second interface may be an interface of a previous level of the first interface; alternatively, the second interface may be an interface of a next hierarchy of the first interface.
In one possible design, the first display area and the second display area may be pre-configured in the electronic device; alternatively, the first display area and the second display area may be set in the electronic device by a user according to a requirement. Or, when the display screen of the electronic device is a foldable screen, the first display area may be a display area corresponding to the first screen, and the second display area may be a display area corresponding to the second screen, where the foldable screen may be folded to form at least two screens, where the at least two screens include the first screen and the second screen. In addition, the window adjustment control described above may be disposed at a folding area of the folding screen.
In the embodiment of the application, the display content of the folding screen can be richer, the utilization rate of the folding screen can be improved, and the visual experience of a user is improved.
In a second aspect, an embodiment of the present application provides an electronic device, including a sensor, a touch screen, a processor, and a memory, where the memory is configured to store one or more computer programs; when the one or more computer programs stored in the memory are executed by the processor, the electronic device is enabled to implement any one of the possible design methods of any one of the aspects described above.
In a third aspect, the present application further provides an apparatus including a module/unit for performing the method of any one of the possible designs of any one of the above aspects. These modules/units may be implemented by hardware, or by hardware executing corresponding software.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium, including a computer program that, when run on an electronic device, causes the electronic device to perform any one of the possible design methods of any one of the above aspects.
In a fifth aspect, the present application further provides a computer program product, which, when run on an electronic device, causes the electronic device to perform any one of the possible design methods of any one of the above aspects.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
Fig. 1 is a schematic structural diagram of a mobile phone according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an android operating system provided in an embodiment of the present application;
fig. 3 is a schematic diagram of a split-screen mode according to an embodiment of the present application;
fig. 4A is a schematic view of another split-screen mode provided in the embodiment of the present application;
fig. 4B is a schematic view of another split-screen mode provided in the embodiment of the present application;
fig. 5 is an interaction diagram of another software implementation method in a split screen mode according to an embodiment of the present application;
fig. 6 is a schematic diagram of a split screen operation provided in an embodiment of the present application;
fig. 7A to fig. 7E are schematic interface diagrams of a split screen display method according to an embodiment of the present application;
fig. 8A to 8D are schematic views of another split-screen display interface provided in the embodiment of the present application;
fig. 9 is a schematic diagram of an example of a setting interface of a display area according to an embodiment of the present application;
fig. 10 is a schematic diagram of an example of a setting interface of a display area according to an embodiment of the present application;
fig. 11 is a schematic diagram of an example of a setting interface of a display area according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
The embodiments of the application can be applied to any electronic device with a touch screen. The electronic device may be a portable device, such as a mobile phone, a tablet computer, or a wearable device with a wireless communication function (such as a smart watch). The portable device has a touch screen and computing capability (it can run the display method provided by the embodiments of the application). Exemplary embodiments of the portable device include, but are not limited to, portable devices running various operating systems. The portable device may also be another portable device, as long as it has a touch screen and computing capability (can run the display method provided by the embodiments of the application). It should also be understood that, in some other embodiments of the application, the electronic device may not be a portable device but a desktop computer having a touch screen and computing capability (capable of running the display method provided by the embodiments of the application).
Taking the electronic device as an example of a mobile phone, fig. 1 shows a schematic structural diagram of a mobile phone 100.
The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 151, a wireless communication module 152, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 194, a SIM card interface 195, and the like. The sensor module 180 may include a gyroscope sensor 180A, an acceleration sensor 180B, a proximity light sensor 180G, a fingerprint sensor 180H, a touch sensor 180K, and a rotation axis sensor 180M (of course, the mobile phone 100 may further include other sensors, such as a temperature sensor, a pressure sensor, a distance sensor, a magnetic sensor, an ambient light sensor, an air pressure sensor, a bone conduction sensor, and the like, which are not shown in the figure).
It is to be understood that the structure illustrated in this embodiment of the application does not constitute a specific limitation on the mobile phone 100. In other embodiments of the application, the mobile phone 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. The controller may be a neural center and a command center of the cell phone 100, among others. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
The processor 110 may run the display method provided in the embodiment of the present application, so that a user can adjust the size of windows of different interfaces of the same application displayed on the display screen, thereby improving the user experience. When the processor 110 may integrate different devices, such as a CPU and a GPU, the CPU and the GPU may cooperate to execute the display method of the touch screen provided in the embodiment of the present application, for example, part of algorithms in the display method of the touch screen is executed by the CPU, and another part of algorithms is executed by the GPU, so as to obtain faster processing efficiency.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the cell phone 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
In this embodiment, the display screen 194 may be an integrated flexible display screen, or may be a spliced display screen formed by two rigid screens and one flexible screen located between the two rigid screens. After the processor 110 executes the display method provided by the embodiment of the present application, the processor 110 may control the window sizes of different interfaces of the same application on the display screen 194.
The camera 193 (a front camera or a rear camera, or a camera that can serve as both a front camera and a rear camera) is used to capture still images or video. In general, the camera 193 may include a lens group and an image sensor (photosensitive element), where the lens group includes a plurality of lenses (convex or concave) for collecting an optical signal reflected by an object to be photographed and transferring the collected optical signal to the image sensor. The image sensor generates an original image of the object to be photographed according to the optical signal.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the cellular phone 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. Wherein the storage program area may store an operating system, codes of application programs (such as a camera application, a WeChat application, etc.), and the like. The data storage area can store data created during the use of the mobile phone 100 (such as images, videos and the like acquired by a camera application), and the like.
The internal memory 121 may also store codes of a display area adjustment algorithm provided by an embodiment of the present application. When the code of the display area adjustment algorithm stored in the internal memory 121 is executed by the processor 110, the processor 110 may control the display position of the message in the notification bar on the display screen 194.
In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
Of course, the code of the display area adjustment algorithm provided in the embodiment of the present application may also be stored in the external memory. In this case, the processor 110 may run the code of the display region adjustment algorithm stored in the external memory through the external memory interface 120, and the processor 110 may control the window sizes of different interfaces of the same application on the display screen 194.
The function of the sensor module 180 is described below.
The gyro sensor 180A may be used to determine the motion attitude of the cellular phone 100. In some embodiments, the angular velocity of the handpiece 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180A. I.e., the gyro sensor 180A may be used to detect the current state of motion of the handset 100, such as shaking or standing still.
The gyro sensor 180A in the embodiment of the present application is used to detect a folding or unfolding operation applied to the display screen 194. The gyro sensor 180A may report the detected folding operation or unfolding operation as an event to the processor 110 to determine the folded state or unfolded state of the display screen 194.
The acceleration sensor 180B can detect the magnitude of acceleration of the cellular phone 100 in various directions (typically along three axes). That is, the acceleration sensor 180B may be used to detect the current motion state of the handset 100, such as shaking or standing still. The acceleration sensor 180B in this embodiment is used to detect a folding or unfolding operation applied to the display screen 194. The acceleration sensor 180B may report the detected folding operation or unfolding operation as an event to the processor 110 to determine the folded state or unfolded state of the display screen 194.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The mobile phone emits infrared light outwards through the light emitting diode. The handset uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the handset. When insufficient reflected light is detected, the handset can determine that there are no objects near the handset. The proximity light sensor 180G may be disposed on a first screen of the foldable display screen 194, and the proximity light sensor 180G detects a folding angle or an unfolding angle of the first screen and the second screen according to an optical path difference of the infrared signal.
The gyro sensor 180A (or the acceleration sensor 180B) may transmit the detected motion state information (such as an angular velocity) to the processor 110. The processor 110 determines whether the mobile phone is currently in the hand-held state or the tripod state (for example, when the angular velocity is not 0, it indicates that the mobile phone 100 is in the hand-held state) based on the motion state information.
The fingerprint sensor 180H is used to collect a fingerprint. The mobile phone 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, take a photograph of the fingerprint, answer an incoming call with the fingerprint, and the like.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the mobile phone 100, different from the position of the display 194.
Illustratively, the display screen 194 of the handset 100 displays a main interface that includes icons for a plurality of applications (e.g., a camera application, a WeChat application, etc.). The user clicks the icon of the camera application in the home interface through the touch sensor 180K, which triggers the processor 110 to start the camera application and open the camera 193. The display screen 194 displays an interface, such as a viewfinder interface, for the camera application. In the present embodiment, when the display screen 194 receives a drag acting on the window adjustment control, the processor 110 may control the window size of different interfaces of the same application on the display screen 194.
The wireless communication function of the mobile phone 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 151, the wireless communication module 152, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 151 may provide a solution including 2G/3G/4G/5G wireless communication applied to the handset 100. The mobile communication module 151 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 151 may receive electromagnetic waves from the antenna 1, filter, amplify, etc. the received electromagnetic waves, and transmit the electromagnetic waves to the modem processor for demodulation. The mobile communication module 151 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 151 may be provided in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 151 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 151 or other functional modules, independent of the processor 110.
The wireless communication module 152 may provide solutions for wireless communication applied to the mobile phone 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 152 may be one or more devices integrating at least one communication processing module. The wireless communication module 152 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 152 may also receive a signal to be transmitted from the processor 110, frequency-modulate it, amplify it, and convert it into electromagnetic waves via the antenna 2 to radiate it.
In addition, the mobile phone 100 can implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc. The handset 100 may receive key 190 inputs, generating key signal inputs relating to user settings and function controls of the handset 100. The handset 100 can generate a vibration alert (e.g., an incoming call vibration alert) using the motor 191. The indicator 192 in the mobile phone 100 may be an indicator light, and may be used to indicate a charging status, a power change, or a message, a missed call, a notification, etc. The SIM card interface 195 in the handset 100 is used to connect a SIM card. The SIM card can be attached to and detached from the cellular phone 100 by being inserted into the SIM card interface 195 or being pulled out from the SIM card interface 195.
It should be understood that, in practical applications, the mobile phone 100 may include more or fewer components than those shown in fig. 1, and the embodiment of the present application is not limited thereto.
Fig. 2 is a block diagram of a software structure of an electronic device according to an embodiment of the present application. The layered architecture can divide the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into three layers, which are an application layer (referred to as an application layer), an application framework layer (referred to as a framework layer), and a kernel layer (also referred to as a driver layer) from top to bottom.
Wherein the application layer may comprise a series of application packages. As shown in fig. 2, the application layer may include a plurality of application packages such as application 1 and application 2. For example, the application package may be a camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, and desktop Launcher (Launcher) application.
The framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions. As shown in fig. 2, the framework layer may include a Window Manager (WMS), an Activity Manager (AMS), and the like. Optionally, the framework layer may further include a content provider, a view system, a telephony manager, an explorer, a notification manager, etc. (not shown in the drawings).
Among them, the window manager WMS is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like. The Activity manager AMS is used to manage Activities, and is responsible for starting, switching, and scheduling the components in the system, managing and scheduling application programs, and the like.
The kernel layer is a layer between hardware and software. The kernel layer contains at least display drivers, camera drivers, audio drivers, sensor drivers, input/output device drivers (e.g., keyboard, touch screen, headphones, speakers, microphones, etc.), and the like.
The electronic device receives an input operation (such as a split screen operation) acted on the display screen by a user, and the kernel layer can generate a corresponding input event according to the input operation and report the event to the application framework layer. A window mode (e.g., a multi-window mode) corresponding to the input operation, a window position and size, and the like are set by the activity management server AMS of the application framework layer. And the window management server WMS of the application framework layer draws a window according to the setting of the AMS, then sends the drawn window data to the display driver of the kernel layer, and the display driver displays the corresponding application interface in different display areas of the display screen.
The display method provided by the embodiment of the application can be realized based on the free window (freeform) characteristic of google and a multi-window multi-task infrastructure. The display method provided by the embodiment of the application can be seen in the following fig. 3. As shown in fig. 2, in an embodiment of the present application, the Activity manager AMS may include an Activity native management module and an Activity extension module. The Activity native management module is used for managing Activity, and is responsible for starting, switching and scheduling each component in the system, managing and scheduling application programs and the like. The Activity expansion module is used for setting a window mode and the property of the window according to the folding state or the unfolding state of the folding screen.
The properties of the window may include, among other things, the position and size of the Activity window, and the visible property of the Activity window (i.e., the state of the Activity window). The position of the Activity window is the position at which the Activity window is displayed on the folding screen, and the size of the Activity window may be the width and height information in the Config used when the application is started. The visible property of the Activity window may be true or false. When the visible property of the Activity window is true, it indicates that the Activity window is visible to the user, i.e., the display driver will display the content of the Activity window. When the visible property of the Activity window is false, it indicates that the Activity window is invisible to the user, i.e., the display driver does not display the content of the Activity window.
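To make these window properties concrete, the following minimal Java sketch models them as a plain data holder. The class and field names are assumptions introduced for illustration and do not correspond to actual AOSP classes.

    // Hypothetical model of the Activity window properties described above; illustrative only.
    import android.graphics.Rect;

    final class ActivityWindowAttrs {
        Rect bounds;       // position and size of the Activity window on the (folding) screen
        boolean visible;   // true: the display driver shows the window content; false: it does not
        int windowMode;    // e.g. full-screen or multi-window, set per folded/unfolded state

        ActivityWindowAttrs(Rect bounds, boolean visible, int windowMode) {
            this.bounds = bounds;
            this.visible = visible;
            this.windowMode = windowMode;
        }
    }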
Illustratively, an application (e.g., application 1) may invoke a launch Activity interface to launch a corresponding Activity. The Activity manager AMS may request the window manager WMS to draw a window corresponding to the Activity in response to the application call, and call a display driver to implement display of the interface.
It is understood that, in the process of displaying the application interface by the electronic device, the input/output device driver of the driver layer may detect a drag event (onTouchEvent) of the user. The input/output device driver may report the drag event to the window manager WMS of the framework layer (i.e., the application framework layer). After the window manager WMS monitors the drag event, the window manager WMS sends a display change event to the activity manager AMS; that is, an AIDL (Android Interface Definition Language) interface sends the operation end-point coordinates in the drag event from the application process to the AMS of the Android Open-Source Project (AOSP) process, and the AMS adjusts the widths of the two windows according to the operation end-point coordinates. The activity manager AMS sets the window mode and the properties of the window, for example, determines the display area (display) size. After the Activity manager AMS sets the Activity window mode and properties, the Activity manager AMS can request the window manager WMS to redraw the windows, and the display driver is then called to display the redrawn window content, so that different interfaces of the application are displayed to the user in different new display areas.
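A compressed sketch of the AMS-side handling of this display change event is given below. It is hypothetical Java: the class names, method names, and signatures are the editor's assumptions and only mirror the sequence described above (receive the drag end point, recompute the two window widths, ask the WMS to redraw); they are not real AOSP APIs.

    // Hypothetical AMS-side handling of the display change event; illustrative only.
    final class ActivityManagerSketch {
        interface WindowManagerSketch {
            void relayoutSplitWindows(int leftWidth, int rightWidth);
        }

        private final WindowManagerSketch wms;

        ActivityManagerSketch(WindowManagerSketch wms) {
            this.wms = wms;
        }

        /** Called (via the AIDL interface) with the drag end-point x coordinate. */
        void onDividerDragFinished(float endX, int screenWidth) {
            // Adjust the widths of the left and right windows from the end point.
            int leftWidth = Math.max(0, Math.min(screenWidth, Math.round(endX)));
            int rightWidth = screenWidth - leftWidth;
            // Set the window mode and window properties (display size), then
            // request the WMS to redraw the windows in the new display areas.
            wms.relayoutSplitWindows(leftWidth, rightWidth);
        }
    }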
At present, existing terminal devices (e.g., mobile phones, tablet devices, etc.) generally have a split-screen function, where the split-screen function divides the physical screen of a terminal into a plurality of display windows, and each display window can display a different interface, so as to facilitate the user's use of the terminal device. Specifically, window adjustment in the existing split-screen mode is implemented by means of a corresponding split-screen application or a System UI (system interface) process. However, in the process of implementing split screen in the foregoing manner, a System Server (system service) process needs to be notified through an AIDL cross-process communication mechanism to process the window adjustment operation, so the processing delay is long and the display may not be smooth enough.
Illustratively, the terminal device shown in fig. 3 supports the split-screen function. When a user performs split-screen display of the gallery application 31 and the WeChat application 32 on the terminal device shown in fig. 3, the process of the gallery application 31 and the process of the WeChat application 32 need to be started first, and then the user's split-screen operation is received to determine the sizes of the two windows, so as to implement split-screen display. If the terminal device receives a moving operation performed by the user on the window adjustment control 33, the gallery application 31 process and the WeChat application 32 process of the terminal need to send the coordinates of the moving operation to the System Server process for processing. Because the process of the gallery application 31 and the process of the WeChat application 32 do not belong to the same process as the System Server process, and communication between processes can only be achieved through a corresponding cross-process communication mechanism, that mechanism needs to be invoked to send the user's moving operation (i.e., the coordinates to which the window adjustment control is moved) to the System Server process for processing, so that the window adjustment control 33 is moved to the position indicated by the coordinates and the split-screen windows are resized. After the System Server process adjusts the window size, the System UI process needs to be triggered to redraw the interface of each window. Since the System UI process and the System Server process also belong to different processes, a cross-process communication mechanism again needs to be invoked to redraw the interface of the gallery application 31 and the interface of the WeChat application 32.
Obviously, the invocation of the cross-process communication mechanism in the prior art affects the response speed and the smoothness of window adjustment. To this end, referring to fig. 4A, in the embodiment of the present application, in the multi-window mode, before drawing the different windows, the AMS may insert a window adjustment control 41 at the right edge of the root layout of the left Activity of the display screen, and similarly insert the same window adjustment control 41 at the left edge of the root layout of the right Activity. It should be noted that the window adjustment control 41 may also be a virtual control, that is, the window adjustment control 41 is invisible to the user; or, in a scene where the display screen displays left and right windows, the boundary connecting the middle of the left and right windows is shared by the left and right windows. The embodiment of the present application does not specifically limit the specific representation form of the window adjustment control 41.
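As a rough illustration of this insertion step, the following hypothetical Java sketch attaches a (possibly invisible) divider view to one edge of an Activity's root layout; the class and parameter names are assumptions, not the patented code.

    // Hypothetical sketch: attach the window adjustment control to the edge of an
    // Activity root layout before the split windows are drawn; illustrative only.
    import android.content.Context;
    import android.view.Gravity;
    import android.view.View;
    import android.widget.FrameLayout;

    final class DividerInstaller {
        /** dividerWidthPx may be very small (or the view transparent) for a "virtual" control. */
        static View install(Context context, FrameLayout activityRootLayout,
                            boolean attachToRightEdge, int dividerWidthPx) {
            View divider = new View(context);
            FrameLayout.LayoutParams lp = new FrameLayout.LayoutParams(
                    Math.max(dividerWidthPx, 1), FrameLayout.LayoutParams.MATCH_PARENT);
            lp.gravity = attachToRightEdge ? Gravity.END : Gravity.START;
            activityRootLayout.addView(divider, lp);
            return divider;   // the control then handles drag events (see the next sketch)
        }
    }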
As shown in fig. 4B, since the window adjustment control 41 overrides the drag event callback (onTouchEvent), in step 401 the user may apply a drag operation on the display screen; in step 402, the kernel layer monitors the drag event and sends the drag direction, the operation start-point coordinates, and the operation end-point coordinates from the application process to the AMS module of the AOSP process through the AIDL interface; in step 403, the AMS module calculates the widths of the left and right windows according to the operation end-point coordinates, and sets the Activity window mode and attributes; in step 404, the window manager WMS is then requested to redraw the windows according to the set window attributes. Throughout the window adjustment process, applications at the application layer obtain the window adjustment function without any adaptation development.
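The following is a minimal sketch of steps 401 and 402 from the control's point of view, written as hypothetical Java; the view class, the callback interface standing in for the AIDL call, and the coordinate handling are assumptions made for illustration.

    // Hypothetical window adjustment control that overrides onTouchEvent; illustrative only.
    import android.content.Context;
    import android.view.MotionEvent;
    import android.view.View;

    final class WindowAdjustControl extends View {
        interface DragSink {   // stands in for the AIDL interface towards the AMS (AOSP process)
            void sendDrag(float startX, float startY, float endX, float endY);
        }

        private final DragSink sink;
        private float startX, startY;

        WindowAdjustControl(Context context, DragSink sink) {
            super(context);
            this.sink = sink;
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            switch (event.getAction()) {
                case MotionEvent.ACTION_DOWN:     // step 401: the drag starts on the control
                    startX = event.getRawX();
                    startY = event.getRawY();
                    return true;
                case MotionEvent.ACTION_UP:       // step 402: report start/end coordinates
                    sink.sendDrag(startX, startY, event.getRawX(), event.getRawY());
                    // Steps 403-404 (computing the window widths and redrawing) happen
                    // in the AMS and WMS, not in the control itself.
                    return true;
                default:
                    return super.onTouchEvent(event);
            }
        }
    }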
Based on the window adjustment implementation mechanism, an embodiment of the present application provides a display method, and a technical solution in the embodiment of the present application will be described below with reference to the drawings in the embodiment of the present application. In the description of the embodiments of the present application, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, "a plurality" means two or more unless otherwise specified. As shown in fig. 5, the display method provided by the embodiment of the present application may be executed by an electronic device.
In step 501, a processor 110 in an electronic device receives a screen splitting operation of a user.
The split screen operation is used to trigger different interfaces of the first application to be displayed on the same display screen 194 at the same time. The split screen operation may be a split screen gesture or a voice command, etc. as shown in fig. 6.
For example, the standby interface 700 of the mobile phone shown in fig. 7A is provided with a Huawei Mall application 701. When the mobile phone detects a click operation performed by the user on the Huawei Mall application 701, the interface 710 shown in fig. 7B is displayed. The user enters the keyword "hat" in the search box of the interface 710, and when the mobile phone detects a user operation on the search control 711 of the interface 710, the interface 720 shown in fig. 7C is displayed. The user may apply a split-screen gesture at the interface 720, as shown in fig. 6.
In step 502, in response to the screen splitting operation, the processor 110 controls a first display area of the display screen 194 of the electronic device to display a first interface and controls a second display area of the display screen to display a second interface.
Wherein the second display area and the first display area may be partially overlapped or not overlapped at all; the first interface and the second interface are both different interfaces of the first application.
Illustratively, when the cell phone detects a split screen gesture applied by the user on the interface 720 shown in FIG. 7C, the electronic device displays the interface 730 as shown in FIG. 7D. The first interface displayed in the first display area 731 of the interface 730 is the content of the interface 710 shown in fig. 7B, and the second interface displayed in the second display area 732 of the interface 730 is the content of the interface 720 shown in fig. 7C.
At step 503, the processor 110 in the electronic device receives a second operation of the user.
And the second operation is used for triggering the adjustment of the size of at least one of the first display area and the second display area. The second operation may be a drag operation acting on the window adjustment control, or a drag gesture, etc.
In step 504, the processor 110 controls the first display area to be adjusted to a third display area and controls the second display area to be adjusted to a fourth display area in response to the second operation, wherein the third display area displays the first interface and the fourth display area displays the second interface.
In the embodiment of the application, the electronic equipment adjusts the control through increasing the window so that the user can conveniently adjust the size of the window, the effect of adjusting the size of the window in a system level is achieved, the processing time delay is reduced, the fluency of interface display is increased, and meanwhile the use experience of the user is also improved.
Illustratively, when the cell phone displays the interface 730 shown in fig. 7D and receives a rightward drag operation performed by the user on the window adjustment control 733, the cell phone displays the interface 740 shown in fig. 7E. The first interface displayed in the fourth display area 741 of the interface 740 is the content of the interface 710 shown in fig. 7B, and the second interface displayed in the third display area 742 of the interface 740 is the content of the interface 720 shown in fig. 7C.
In one possible embodiment, a third-party application may also configure, by means of static configuration parameters, which of its interfaces support split-screen display, so that the third-party application is unaware of the split screen when running on the electronic device. When the electronic device runs a third-party application that supports split-screen display, the electronic device can respond to the user's split-screen operation, and the AMS in the operating system can display the interfaces of the third-party application in split screen according to the static configuration parameters.
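The patent does not specify the format of these static configuration parameters. Purely as an illustration, the following Java sketch reads a hypothetical boolean flag from the application's manifest meta-data; the key name and the approach are assumptions made for this example.

    // Hypothetical: read a static split-screen capability flag declared by a
    // third-party application in its manifest meta-data; illustrative only.
    import android.content.Context;
    import android.content.pm.ApplicationInfo;
    import android.content.pm.PackageManager;

    final class SplitScreenConfigReader {
        static boolean supportsInAppSplitScreen(Context context, String packageName) {
            try {
                ApplicationInfo ai = context.getPackageManager()
                        .getApplicationInfo(packageName, PackageManager.GET_META_DATA);
                return ai.metaData != null
                        && ai.metaData.getBoolean("hypothetical.supports_in_app_split", false);
            } catch (PackageManager.NameNotFoundException e) {
                return false;
            }
        }
    }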
In one possible embodiment, the drag operation may include a drag direction, an operation start-point coordinate, and an operation end-point coordinate; the electronic device adjusts the ratio of the width of the first display area to the width of the second display area according to the drag direction, the operation start-point coordinate, and the operation end-point coordinate, and determines the third display area and the fourth display area according to the ratio. Illustratively, as shown in fig. 8A, the width of the first display area is X1, the width of the second display area is Y1, and X1 is equal to Y1. When the electronic device detects a rightward drag operation of the user acting on the window adjustment control L1, the electronic device displays the interface shown in fig. 8B, where the width of the third display area is X2 and the width of the fourth display area is Y2, and X2 is greater than Y2. Further, when the ratio X2/Y2 is greater than a set threshold (e.g., 19/1), the electronic device may control the first interface in the first display area to be displayed full screen. Further, the electronic device may also hide or remove the window adjustment control.
In some embodiments, as shown in fig. 8C, the width of the first display area is X1, the width of the second display area is Y1, and X1 is equal to Y1. When the cell phone detects a leftward drag operation performed by the user on the window adjustment control L1, the cell phone displays the interface shown in fig. 8D, where the width of the third display area is X3 and the width of the fourth display area is Y3, and X3 is smaller than Y3. Further, when the ratio X3/Y3 is smaller than a set threshold (e.g., 1/19), the electronic device may control the second interface in the second display area to be displayed full screen. Further, the electronic device may also hide or remove the window adjustment control.
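The threshold behavior in these two examples can be summarized in a small hypothetical Java sketch; the 19/1 and 1/19 values come from the examples above, while the class, method, and return convention are assumptions.

    // Hypothetical decision logic for switching from split screen to full screen
    // based on the width ratio of the two display areas; illustrative only.
    final class FullScreenSwitch {
        static final float UPPER_THRESHOLD = 19f;        // e.g. X2/Y2 above this: first interface full screen
        static final float LOWER_THRESHOLD = 1f / 19f;   // e.g. X3/Y3 below this: second interface full screen

        /** Returns 1 to show the first interface full screen, 2 for the second, 0 to stay split. */
        static int decide(int firstWidth, int secondWidth) {
            if (secondWidth == 0) return 1;
            float ratio = (float) firstWidth / secondWidth;
            if (ratio > UPPER_THRESHOLD) return 1;   // the window adjustment control may then be hidden or removed
            if (ratio < LOWER_THRESHOLD) return 2;   // the window adjustment control may then be hidden or removed
            return 0;
        }
    }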
In some possible embodiments, when the display screen of the electronic device is a folding screen, the boundary L1 between the first display area and the second display area in fig. 8A may coincide with a folding edge of the folding screen. When the foldable screen is completely unfolded, in this embodiment, the first display area corresponds to a first screen of the foldable screen, and the second display area corresponds to a second screen of the foldable screen.
In this embodiment, the second interface may be an interface of a previous level of the first interface; alternatively, the second interface is an interface of a next level of the first interface, the second interface being different from the first interface. In this embodiment, the handset may determine the second interface in the following manner.
First, a display screen (including a folding screen or a non-folding screen) of the electronic device includes a plurality of display areas, which may be preset in the electronic device or may be manually set by a user. That is, the sizes (including the width and the height) of the first display region and the second display region may be previously configured in the mobile phone. Alternatively, the widths and heights of the first display area and the second display area may be manually set by a user in the mobile phone. In this embodiment, the size of the first display region and the size of the second display region may be the same or different.
For example, as shown in fig. 9, a setting control of a display area is displayed on a setting interface 901 of the electronic device, and the user may set the aspect ratio (i.e., the ratio of width to height) of the first display area and the aspect ratio of the second display area by himself, or the user may select the aspect ratio of the first display area and the aspect ratio of the second display area from the setting interface.
For another example, as shown in (a) of fig. 10, a display area number setting control 1002 and a display area range setting control 1003 in a landscape state are displayed on a setting interface 1001 of the mobile phone. After the user sets the number of display areas to 2 and clicks the control 1003, as shown in (b) of fig. 10, the mobile phone may prompt the user to set the range of the first display area, and the user may drag on the display screen to set the range of the first display area. After the user stops the dragging, as shown in (c) of fig. 10, the cellular phone may display a control 1004 that determines the range of the first display region and a control 1005 that cancels the setting. After the user clicks the control 1004, referring to (d) in fig. 10 and (e) in fig. 10, the mobile phone may determine a range of the first display area and the second display area set by the user, where (d) in fig. 10 is an illustration that the first display area and the second display area are not overlapped at all, and (e) in fig. 10 is an illustration that the first display area and the second display area are overlapped at a part of the area. Optionally, the user may also move the position of the first display region.
For another example, after the user sets the number of display areas on the setting interface 1001 to 2 and clicks the control 1003, as shown in fig. 11, a boundary 1101 is displayed on the setting interface of the mobile phone, and the user may drag the boundary 1101 to set the sizes and the aspect ratios of the first display area and the second display area.
Of course, besides the setting manner described above, the user may set the display area included in the display screen of the mobile phone in other manners, which is not limited in the embodiment of the present application.
In other embodiments of the present application, there is also disclosed an electronic device, as shown in fig. 12, which may include: a touch screen 1201, wherein the touch screen 1201 includes a touch panel 1206 and a display screen 1207; one or more processors 1202; a memory 1203; one or more application programs (not shown); and one or more computer programs 1204, which may be connected by one or more communication buses 1205. Wherein the one or more computer programs 1204 are stored in the memory 1203 and configured to be executed by the one or more processors 1202, the one or more computer programs 1204 comprising instructions which may be used to perform the steps as in the corresponding embodiment of fig. 5.
The embodiment of the present application further provides a computer storage medium, where a computer instruction is stored in the computer storage medium, and when the computer instruction runs on an electronic device, the electronic device is enabled to execute the above related method steps to implement the display method in the above embodiment.
The embodiment of the present application further provides a computer program product, which when running on a computer, causes the computer to execute the above related steps to implement the display method in the above embodiment.
In addition, embodiments of the present application also provide an apparatus, which may specifically be a chip, a component, or a module, and may include a processor and a memory connected to each other; when the apparatus runs, the processor can execute the computer-executable instructions stored in the memory, so that the chip performs the display method of the touch screen in the above method embodiments.
In addition, the electronic device, the computer storage medium, the computer program product, or the chip provided in the embodiments of the present application are all configured to execute the corresponding method provided above, so that the beneficial effects achieved by the electronic device, the computer storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the above embodiments, those skilled in the art will understand that, for convenience and simplicity of description, only the division of the above functional modules is used as an example, and in practical applications, the above function distribution may be completed by different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, a module or a unit may be divided into only one logic function, and may be implemented in other ways, for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted, or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts shown as units may be one physical unit or a plurality of physical units, which may be located in one place or distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods in the embodiments of the present application. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above descriptions are merely specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any change or substitution that a person skilled in the art can readily conceive of within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A display method applied to an electronic device, wherein the method comprises:
receiving a screen-splitting operation of a user;
in response to the screen-splitting operation, controlling a first display area of a display screen of the electronic device to display a first interface, and controlling a second display area of the display screen to display a second interface; wherein the second display area partially overlaps or does not overlap at all with the first display area; and the first interface and the second interface are different interfaces of a first application;
receiving a second operation of the user, wherein the second operation is used to trigger adjustment of the size of at least one of the first display area and the second display area;
in response to the second operation, controlling the first display area to be adjusted to a third display area, and controlling the second display area to be adjusted to a fourth display area; wherein the third display area displays the first interface, and the fourth display area displays the second interface.
2. The method of claim 1, wherein a right boundary of the first interface and a left boundary of the second interface are provided with a window adjustment control; the second operation is a dragging operation acting on the window adjustment control, and the dragging operation comprises a dragging direction, an operation starting point coordinate, and an operation end point coordinate;
before controlling the first display area to be adjusted to the third display area and the second display area to be adjusted to the fourth display area, the method further comprises:
adjusting the ratio of the width of the first display area to the width of the second display area according to the dragging direction, the operation starting point coordinate and the operation end point coordinate;
and determining the third display area and the fourth display area according to the ratio.
3. The method of claim 2, further comprising:
when the ratio of the width of the first display area to the width of the second display area is smaller than a first threshold, controlling the second interface to be displayed in a full screen mode;
or, when the ratio of the width of the first display area to the width of the second display area is greater than a second threshold, controlling the first interface to be displayed in a full screen mode;
wherein the first threshold is much smaller than the second threshold (an illustrative sketch of this ratio adjustment and threshold check follows the claims).
4. The method of any one of claims 1 to 3, wherein the second interface is an interface at a level above the first interface; or the second interface is an interface at a level below the first interface.
5. The method according to any one of claims 1 to 3, wherein the first display area and the second display area are pre-configured in the electronic device; or the first display area and the second display area are set in the electronic device by a user;
or, when the display screen of the electronic device is a folding screen, the first display area is a display area corresponding to a first screen, and the second display area is a display area corresponding to a second screen, wherein the folding screen can be folded to form at least two screens, and the at least two screens comprise the first screen and the second screen.
6. An electronic device comprising a display screen, a processor, and a memory;
the memory is configured to store one or more computer programs;
the memory stores one or more computer programs that, when executed by the processor, cause the electronic device to perform:
in response to a screen-splitting operation, controlling a first display area of the display screen to display a first interface, and controlling a second display area of the display screen to display a second interface; wherein the second display area partially overlaps or does not overlap at all with the first display area; and the first interface and the second interface are different interfaces of a first application;
receiving a second operation of a user, wherein the second operation is used to trigger adjustment of the size of at least one of the first display area and the second display area;
in response to the second operation, controlling the first display area to be adjusted to a third display area, and controlling the second display area to be adjusted to a fourth display area; wherein the third display area displays the first interface, and the fourth display area displays the second interface.
7. The electronic device of claim 6, wherein a boundary of the first interface and a boundary of the second interface are provided with a window adjustment control; the second operation is a dragging operation acting on the window adjustment control, and the dragging operation comprises a dragging direction, an operation starting point coordinate, and an operation end point coordinate;
the memory stores one or more computer programs that, when executed by the processor, cause the electronic device to further perform:
before controlling the first display area to be adjusted to the third display area and the second display area to be adjusted to the fourth display area, adjusting the ratio of the width of the first display area to the width of the second display area according to the dragging direction, the operation starting point coordinate, and the operation end point coordinate;
and determining the third display area and the fourth display area according to the ratio.
8. The electronic device of claim 7, wherein the one or more computer programs stored in the memory, when executed by the processor, cause the electronic device to further perform:
when the ratio of the width of the first display area to the width of the second display area is smaller than a first threshold, controlling the second interface to be displayed in a full screen mode;
or when the ratio of the width of the first display area to the width of the second display area is greater than a second threshold, controlling the first interface to be displayed in a full screen mode;
wherein the first threshold is much smaller than the second threshold.
9. The electronic device according to any one of claims 6 to 8, wherein the second interface is an interface at a level above the first interface; or the second interface is an interface at a level below the first interface.
10. The electronic device according to any one of claims 6 to 8, wherein the first display area and the second display area are pre-configured in the electronic device; or the first display area and the second display area are set in the electronic device by a user;
or, when the display screen of the electronic device is a folding screen, the first display area is a display area corresponding to a first screen, and the second display area is a display area corresponding to a second screen, wherein the folding screen can be folded to form at least two screens, and the at least two screens comprise the first screen and the second screen.
11. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a computer program which, when run on an electronic device, causes the electronic device to perform the display method according to any one of claims 1 to 5.
12. A chip, coupled to a memory and configured to execute a computer program stored in the memory, so as to perform the display method according to any one of claims 1 to 5.
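The following is likewise a purely illustrative Kotlin sketch (not part of the claims) of the width-ratio adjustment and the threshold check recited in claims 2 and 3. The drag model, the clamping to a minimum width, and the concrete threshold values are assumptions; DragOperation, ResizeResult and resize are hypothetical names.

// Illustrative sketch only; DragOperation, ResizeResult and the threshold values are hypothetical.
data class DisplayArea(val x: Int, val y: Int, val width: Int, val height: Int)

// A dragging operation on the window adjustment control, reduced to its horizontal component.
data class DragOperation(val startX: Int, val endX: Int) {
    // Positive delta: dragged towards the second area (first area grows); negative: the reverse.
    val deltaX: Int get() = endX - startX
}

sealed class ResizeResult {
    data class Resized(val thirdArea: DisplayArea, val fourthArea: DisplayArea) : ResizeResult()
    object FirstInterfaceFullScreen : ResizeResult()
    object SecondInterfaceFullScreen : ResizeResult()
}

fun resize(
    first: DisplayArea,
    second: DisplayArea,
    drag: DragOperation,
    firstThreshold: Double = 0.25,  // assumed value: ratio below this -> second interface full screen
    secondThreshold: Double = 4.0   // assumed value: ratio above this -> first interface full screen
): ResizeResult {
    val newFirstWidth = (first.width + drag.deltaX).coerceAtLeast(1)
    val newSecondWidth = (second.width - drag.deltaX).coerceAtLeast(1)
    val ratio = newFirstWidth.toDouble() / newSecondWidth
    return when {
        ratio < firstThreshold -> ResizeResult.SecondInterfaceFullScreen
        ratio > secondThreshold -> ResizeResult.FirstInterfaceFullScreen
        else -> ResizeResult.Resized(
            thirdArea = first.copy(width = newFirstWidth),
            fourthArea = second.copy(x = first.x + newFirstWidth, width = newSecondWidth)
        )
    }
}

fun main() {
    val first = DisplayArea(0, 0, 1100, 2480)
    val second = DisplayArea(1100, 0, 1100, 2480)
    println(resize(first, second, DragOperation(startX = 1100, endX = 1500)))  // moderate drag: both areas resized
    println(resize(first, second, DragOperation(startX = 1100, endX = 150)))   // large drag left: second interface full screen
}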
CN201910726163.0A 2019-08-07 2019-08-07 Display method and electronic equipment Active CN110661917B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201910726163.0A CN110661917B (en) 2019-08-07 2019-08-07 Display method and electronic equipment
CN202210347479.0A CN114840127A (en) 2019-08-07 2019-08-07 Display method and electronic equipment
PCT/CN2020/103877 WO2021023021A1 (en) 2019-08-07 2020-07-23 Display method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910726163.0A CN110661917B (en) 2019-08-07 2019-08-07 Display method and electronic equipment

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210347479.0A Division CN114840127A (en) 2019-08-07 2019-08-07 Display method and electronic equipment

Publications (2)

Publication Number Publication Date
CN110661917A true CN110661917A (en) 2020-01-07
CN110661917B CN110661917B (en) 2022-04-12

Family

ID=69036443

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201910726163.0A Active CN110661917B (en) 2019-08-07 2019-08-07 Display method and electronic equipment
CN202210347479.0A Pending CN114840127A (en) 2019-08-07 2019-08-07 Display method and electronic equipment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202210347479.0A Pending CN114840127A (en) 2019-08-07 2019-08-07 Display method and electronic equipment

Country Status (2)

Country Link
CN (2) CN110661917B (en)
WO (1) WO2021023021A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113722028B (en) * 2021-05-28 2022-10-28 荣耀终端有限公司 Dynamic card display method and device
CN114637570B (en) * 2022-03-25 2024-07-19 京东方科技集团股份有限公司 Boundary adjustment method and device for display interface, storage medium and electronic equipment
CN114866641B (en) * 2022-07-07 2022-11-11 荣耀终端有限公司 Icon processing method, terminal equipment and storage medium
CN115268816A (en) * 2022-08-17 2022-11-01 维沃移动通信有限公司 Split screen control method and device, electronic equipment and readable storage medium
CN117931328A (en) * 2022-10-14 2024-04-26 Oppo广东移动通信有限公司 Interface display method, device, terminal equipment and storage medium
CN117707403B (en) * 2023-07-19 2024-09-24 荣耀终端有限公司 Display method and related device
CN117519864B (en) * 2023-09-19 2024-07-23 荣耀终端有限公司 Interface display method, electronic device and storage medium

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102945115A (en) * 2012-10-18 2013-02-27 东莞宇龙通信科技有限公司 Terminal and terminal control method
CN104793915A (en) * 2015-04-30 2015-07-22 魅族科技(中国)有限公司 Method and terminal for sub-screen display
CN105426097A (en) * 2015-10-30 2016-03-23 努比亚技术有限公司 Real time adjustment method for split screen size and split screen apparatus
CN106062691A (en) * 2014-02-05 2016-10-26 三星电子株式会社 Apparatus and method of displaying windows
CN106101423A (en) * 2016-06-28 2016-11-09 努比亚技术有限公司 Split screen area size adjusting apparatus and method
CN106126065A (en) * 2016-06-29 2016-11-16 努比亚技术有限公司 A kind of mobile terminal and double screen method for information display thereof
CN106502560A (en) * 2016-10-11 2017-03-15 北京小米移动软件有限公司 Display control method and device
CN106959796A (en) * 2017-03-22 2017-07-18 广东小天才科技有限公司 Mobile terminal screen display method and device
KR20170114477A (en) * 2016-04-05 2017-10-16 안재성 Mobile communication terminal and a call allocation service system using that and a method therof
CN107450872A (en) * 2017-06-26 2017-12-08 努比亚技术有限公司 A kind of split screen adjusting method, terminal and computer-readable recording medium
CN108920064A (en) * 2018-07-10 2018-11-30 Oppo广东移动通信有限公司 Split screen window adjusting method, device, storage medium and electronic equipment
CN109062466A (en) * 2018-07-03 2018-12-21 Oppo广东移动通信有限公司 Split screen window adjusting method, device, storage medium and electronic equipment
CN109067981A (en) * 2018-07-11 2018-12-21 Oppo广东移动通信有限公司 Split screen application switching method, device, storage medium and electronic equipment
CN109445572A (en) * 2018-09-10 2019-03-08 华为技术有限公司 The method, graphical user interface and terminal of wicket are quickly recalled in full screen display video

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102213212B1 (en) * 2014-01-02 2021-02-08 삼성전자주식회사 Controlling Method For Multi-Window And Electronic Device supporting the same
CN105224276A (en) * 2015-10-29 2016-01-06 维沃移动通信有限公司 A kind of multi-screen display method and electronic equipment
CN106708367A (en) * 2016-12-30 2017-05-24 维沃移动通信有限公司 Display method of conversation interface and mobile terminal
CN109032547A (en) * 2018-07-10 2018-12-18 Oppo广东移动通信有限公司 split screen processing method, device, storage medium and electronic equipment
CN109271121B (en) * 2018-08-31 2021-11-23 维沃移动通信有限公司 Application display method and mobile terminal
CN109917956B (en) * 2019-02-22 2021-08-03 华为技术有限公司 Method for controlling screen display and electronic equipment
CN110661917B (en) * 2019-08-07 2022-04-12 华为技术有限公司 Display method and electronic equipment

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021023021A1 (en) * 2019-08-07 2021-02-11 华为技术有限公司 Display method and electronic device
CN111274564A (en) * 2020-01-14 2020-06-12 青岛海信移动通信技术股份有限公司 Communication terminal and application unlocking method in split screen mode
CN114564101A (en) * 2020-06-19 2022-05-31 华为技术有限公司 Three-dimensional interface control method and terminal
CN114281439A (en) * 2020-09-18 2022-04-05 华为技术有限公司 Screen splitting method and device and electronic equipment
CN112764619A (en) * 2021-01-22 2021-05-07 联想(北京)有限公司 Window control method and electronic equipment
CN112764619B (en) * 2021-01-22 2023-03-21 联想(北京)有限公司 Window control method and electronic equipment
CN112835674A (en) * 2021-03-11 2021-05-25 田继伟 Multi-window display control method, system and storage medium thereof
CN112835674B (en) * 2021-03-11 2022-02-25 郑州千百视光电科技股份有限公司 Multi-window display control method, system and storage medium thereof
WO2022252786A1 (en) * 2021-05-31 2022-12-08 华为技术有限公司 Window split-screen display method and electronic device
CN113473013A (en) * 2021-06-30 2021-10-01 展讯通信(天津)有限公司 Display method and device for beautifying effect of image and terminal equipment
CN113805750A (en) * 2021-09-23 2021-12-17 闻泰通讯股份有限公司 Application program display method and device, mobile device and storage medium
CN114020226A (en) * 2021-10-26 2022-02-08 统信软件技术有限公司 Split screen processing method, computing device and readable storage medium
WO2023098182A1 (en) * 2021-12-03 2023-06-08 荣耀终端有限公司 Application interface display method and electronic device
CN116225287A (en) * 2021-12-03 2023-06-06 荣耀终端有限公司 Display method of application interface and electronic equipment
CN115562771A (en) * 2022-01-18 2023-01-03 荣耀终端有限公司 Application window management method, electronic device and computer readable storage medium
CN115562771B (en) * 2022-01-18 2023-11-24 荣耀终端有限公司 Application window management method, electronic device, and computer-readable storage medium
CN114554299A (en) * 2022-01-20 2022-05-27 海信视像科技股份有限公司 Display device and split-screen display method
WO2023226434A1 (en) * 2022-05-23 2023-11-30 Oppo广东移动通信有限公司 Interface display method and apparatus, and device, storage medium and program product
CN116048373A (en) * 2022-06-24 2023-05-02 荣耀终端有限公司 Display method of suspension ball control, electronic equipment and storage medium
CN116048373B (en) * 2022-06-24 2023-09-22 荣耀终端有限公司 Display method of suspension ball control, electronic equipment and storage medium
WO2024007966A1 (en) * 2022-07-05 2024-01-11 华为技术有限公司 Multi-window display method and device
WO2024027504A1 (en) * 2022-07-30 2024-02-08 华为技术有限公司 Application display method and electronic device
CN116048317A (en) * 2023-01-28 2023-05-02 荣耀终端有限公司 Display method and device
CN116048317B (en) * 2023-01-28 2023-08-22 荣耀终端有限公司 Display method and device

Also Published As

Publication number Publication date
CN110661917B (en) 2022-04-12
WO2021023021A1 (en) 2021-02-11
CN114840127A (en) 2022-08-02

Similar Documents

Publication Publication Date Title
CN110661917B (en) Display method and electronic equipment
WO2020238874A1 (en) Vr multi-screen display method and electronic device
CN110688179B (en) Display method and terminal equipment
WO2021052279A1 (en) Foldable screen display method and electronic device
CN111949345B (en) Application display method and electronic equipment
WO2020259470A1 (en) Display method for touch screen, and electronic device
CN110554816B (en) Interface generation method and device
WO2021063074A1 (en) Method for split-screen display and electronic apparatus
WO2021063097A1 (en) Display method and electronic equipment
WO2021036628A1 (en) Touch-control method for device with folding screen, and folding-screen device
WO2021169399A1 (en) Method for caching application interface, and electronic apparatus
CN110456951A (en) A kind of application display method and electronic equipment
WO2020228735A1 (en) Method for displaying application, and electronic device
WO2021129254A1 (en) Method for controlling display of screen, and electronic device
CN111897465B (en) Popup display method, device, equipment and storage medium
CN114666427B (en) Image display method, electronic equipment and storage medium
CN110647731A (en) Display method and electronic equipment
CN114513689B (en) Remote control method, electronic equipment and system
CN116048833B (en) Thread processing method, terminal equipment and chip system
CN112181915B (en) Method, device, terminal and storage medium for executing service
WO2021018248A1 (en) Interface display method and electronic device
CN111414563B (en) Webpage interaction method, device, computer equipment and storage medium
CN111522576B (en) Application management method, device, equipment and computer storage medium
WO2024193666A1 (en) Display method for electronic device, and electronic device and storage medium
CN117806420A (en) Display method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant