WO2023202407A1 - Application display method and apparatus, and storage medium

Application display method and apparatus, and storage medium

Info

Publication number
WO2023202407A1
Authority
WO
WIPO (PCT)
Prior art keywords
layer
application
display
target application
content
Application number
PCT/CN2023/087337
Other languages
English (en)
Chinese (zh)
Inventor
刘�文
Original Assignee
华为技术有限公司
Application filed by 华为技术有限公司
Publication of WO2023202407A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Definitions

  • the present application relates to the field of terminal technology, and in particular, to an application display method, an apparatus, and a storage medium.
  • a terminal displays a floating window of an application.
  • the title bar of the floating window is at a level in the view (English: View) tree as shown in Figure 2.
  • the application window 11 of the application includes a decoration view 12 (English: Decor View); the decoration view 12 is the root view of the application and is the container of all views;
  • the free-form title view 13 (English: Free Form Caption View) is the parent container of the title view 14 (English: Caption View) and the content view 15 (English: Content View);
  • the title view 14 is the carrier of the title bar and includes clickable buttons;
  • the content view 15 is the carrier of the application content.
  • the title view 14 and the content view 15 of the application are two views at the same level in the same layer, and both are subviews of the free-form title view 13. Since the title view 14 is located in the application process space and occupies the top position of the application, for some applications with poor compatibility there may be a problem that the application content and the title bar block each other. A reasonable and effective technical solution has not yet been provided in the related art.
  • embodiments of the present application propose an application display method, device and storage medium.
  • the application content and the title bar are decoupled and spatially isolated, which can solve the problem of the application content and the title bar blocking each other and greatly improve the display effect of floating windows.
  • embodiments of the present application provide an application display method for use in an electronic device.
  • the method includes:
  • receiving a first trigger instruction, where the first trigger instruction is used to trigger the target application to display in the floating window mode;
  • establishing a first layer of the title bar of the target application, where the first layer and a second layer of the application content of the target application are two different layers, the first layer is a subordinate layer of the second layer, and the size of the first layer is larger than the size of the second layer; and
  • displaying a floating window of the target application, where the floating window includes the first layer and the second layer.
  • In the embodiments of the present application, by receiving the first trigger instruction for triggering the target application to display in the floating window mode, the first layer of the title bar of the target application is established; the first layer and the second layer of the application content of the target application are two different layers, the first layer is a subordinate layer of the second layer, and the size of the first layer is larger than the size of the second layer; the floating window of the target application is then displayed, where the floating window includes the first layer and the second layer.
  • Because the first layer of the title bar and the second layer of the application content are two different layers, that is, the application content and the title bar are placed on different layers, the title bar of the floating window and the application content are decoupled and spatially isolated, which solves the problem of mutual occlusion of the application content and the title bar and greatly improves the display effect of the floating window.
  • before displaying the floating window of the target application, the method further includes:
  • the first layer and the second layer are bound.
  • the target area of the floating window displays a target control.
  • the target area is the area between the first layer and the second layer.
  • the target control includes at least one of a minimize button, a maximize button, and a close button.
  • In the embodiments of the present application, the size of the first layer is larger than the size of the second layer; therefore, after binding, there is an area where the first layer and the second layer do not overlap, and this non-overlapping area is the target area.
  • the target control of the title bar is displayed on the target area of the floating window.
  • the target control includes at least one of a minimize button, a maximize button, and a close button, thereby ensuring that the functions of the title bar of the floating window are realized.
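  • As a purely illustrative sketch of the "title bar business" behind these controls, the Java snippet below maps a click in the target area to a minimize, maximize or close action; the class, interface and method names (TitleBarController, FloatingWindowSession, minimizeFloatingWindow and so on) are assumptions for illustration, not the implementation described in this application.

```java
import android.graphics.Rect;

/** Illustrative title-bar business handler for the first layer (hypothetical names). */
public final class TitleBarController {
    // Hit rectangles of the target controls, filled in when the controls are drawn
    // on the first layer (the non-overlapping target area of the floating window).
    private final Rect minimizeRect = new Rect();
    private final Rect maximizeRect = new Rect();
    private final Rect closeRect = new Rect();

    /** Returns true if the click at (x, y) on the first layer was consumed by a control. */
    public boolean onTitleBarClick(int x, int y, FloatingWindowSession session) {
        if (minimizeRect.contains(x, y)) {
            session.minimizeFloatingWindow();   // hide the first layer and the content layers
            return true;
        }
        if (maximizeRect.contains(x, y)) {
            session.maximizeFloatingWindow();   // leave floating-window mode
            return true;
        }
        if (closeRect.contains(x, y)) {
            session.closeFloatingWindow();      // destroy the layers of the target application
            return true;
        }
        return false;
    }

    /** Hypothetical callback interface standing in for the framework-side window session. */
    public interface FloatingWindowSession {
        void minimizeFloatingWindow();
        void maximizeFloatingWindow();
        void closeFloatingWindow();
    }
}
```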
  • the method further includes:
  • upon receiving a start instruction corresponding to the target application, the basic activity of the target application and a base layer corresponding to the application content of the basic activity are established; the base layer is the second layer.
  • binding the first layer and the second layer includes:
  • the first layer is bound to the base layer by re-parenting.
  • In the embodiments of the present application, the basic activity of the target application and the base layer corresponding to the application content of the basic activity are established, and the first layer and the base layer are bound by reparenting, so that the first layer has a binding relationship with the base layer, which ensures the display effect when the layers subsequently respond to mouse events (such as a move event).
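  • A minimal sketch of binding by reparenting at the layer level is shown below, using Android's public SurfaceControl.Transaction API; the variable names and the z value are assumptions, and in an actual system this code would run inside the window management service rather than in an application.

```java
import android.view.SurfaceControl;

final class LayerBinder {
    /**
     * Hedged sketch of "binding by reparenting": the title-bar layer (first layer)
     * becomes a child of the base layer so that they move and live together.
     */
    static void bindTitleToBase(SurfaceControl titleLayer, SurfaceControl baseLayer) {
        new SurfaceControl.Transaction()
                .reparent(titleLayer, baseLayer)   // titleLayer is now parented to baseLayer
                .setLayer(titleLayer, -1)          // assumed z value: keep it beneath the content
                .setVisibility(titleLayer, true)
                .apply();
    }
}
```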
  • the method further includes:
  • when it is detected that an upper-layer activity of the target application is started, an upper layer corresponding to the application content of the upper-layer activity is established, where the upper-layer activity is an activity at a level other than the basic activity; the first layer that displays the title bar remains unchanged, the base layer is hidden, and the upper layer is displayed.
  • In this way, the compatibility problem in the floating window scenario is solved and the upper layer is displayed normally.
  • the method further includes:
  • the first layer is unbound from the base layer, and the first layer is bound to the upper layer.
  • In this way, the life cycle of the first layer is guaranteed to be consistent with the life cycle of the layers of the application content of the target application.
  • the method further includes:
  • when a minimization instruction for minimizing the floating window is received, the first layer and each layer of the application content are hidden, so that the user interface display effect in the floating window scenario is further guaranteed.
  • the minimization instruction includes an instruction triggered by an operation acting on the minimize button of the first layer.
  • the method further includes:
  • when it is detected that the target application is closed, the display of the first layer and each layer of the application content is canceled, and/or the first layer and each layer of the application content are destroyed, which further ensures that the life cycle of the first layer is consistent with the life cycle of the layers of the application content.
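  • The following hedged sketch illustrates how the life cycles could be kept consistent with the public SurfaceControl API: hiding all layers on a minimize instruction, and detaching and releasing them when the application is closed; the class and method names are illustrative assumptions.

```java
import android.view.SurfaceControl;
import java.util.List;

final class FloatingWindowLifecycle {
    /** On a minimize instruction: hide the title-bar layer together with every content layer. */
    static void onMinimize(SurfaceControl titleLayer, List<SurfaceControl> contentLayers) {
        SurfaceControl.Transaction t = new SurfaceControl.Transaction();
        t.setVisibility(titleLayer, false);
        for (SurfaceControl content : contentLayers) {
            t.setVisibility(content, false);
        }
        t.apply();
    }

    /** On application close: detach and release the layers so their life cycles end together. */
    static void onClose(SurfaceControl titleLayer, List<SurfaceControl> contentLayers) {
        SurfaceControl.Transaction t = new SurfaceControl.Transaction();
        t.reparent(titleLayer, null);              // detach from the layer tree
        for (SurfaceControl content : contentLayers) {
            t.reparent(content, null);
        }
        t.apply();
        titleLayer.release();                      // drop the handles; the surfaces can be destroyed
        for (SurfaceControl content : contentLayers) {
            content.release();
        }
    }
}
```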
  • the application content of the target application is the application content of at least two windows simultaneously displayed on the screen by the target application, and the size of the second layer is the overall window size of the at least two windows.
  • the application display method provided by the embodiments of the present application can also be applied to a multi-window display scenario.
  • the multi-window display scenario is a scenario in which the target application simultaneously displays at least two windows on the screen.
  • the size of the second layer is the overall window size of the at least two windows, that is, the size of the first layer of the title bar is larger than the overall window size of the at least two windows, which prevents the application content and the title bar from blocking each other in the multi-window display scenario and ensures the display effect of the title bar of the floating window in that scenario.
  • the method further includes:
  • a mouse listening event is registered for the first layer, where the mouse listening event includes at least one of a press (down) event, a release (up) event, and a move event;
  • when a specified mouse event is detected, the specified mouse event is dispatched to the first layer through a window manager service (Window Manager Service, WMS), where the specified mouse event includes at least one of the mouse listening events.
  • the first layer is used to process the title bar business corresponding to the specified mouse event.
  • a set of mouse listening events is maintained for the first layer of the title bar.
  • the specified mouse event is dispatched to the first layer through the window management service, so that the first layer processes the title bar business corresponding to the specified mouse event. This avoids the situation in the related art where a layer itself cannot respond to mouse events, ensures that the title bar responds to mouse events normally, and enables various user interface display effects.
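  • Because the WMS dispatch path is framework-internal, the sketch below only illustrates, under assumed class and field names, how a detected mouse event in the registered set could be routed to a handler for the first layer when it falls in the non-overlapping target area.

```java
import android.graphics.Rect;
import android.view.MotionEvent;

/** Illustrative dispatcher standing in for the WMS-side routing (hypothetical names). */
final class TitleBarEventDispatcher {
    /** Hypothetical handler living on the first layer's side. */
    interface TitleBarHandler {
        boolean onTitleBarEvent(MotionEvent event);
    }

    private final Rect titleLayerBounds;    // bounds of the first layer (title bar)
    private final Rect contentLayerBounds;  // bounds of the second layer (application content)
    private final TitleBarHandler handler;

    TitleBarEventDispatcher(Rect titleLayerBounds, Rect contentLayerBounds, TitleBarHandler handler) {
        this.titleLayerBounds = titleLayerBounds;
        this.contentLayerBounds = contentLayerBounds;
        this.handler = handler;
    }

    /** Returns true if the event was dispatched to the first layer. */
    boolean dispatch(MotionEvent event) {
        final int action = event.getActionMasked();
        // The registered "mouse listening events": press (down), release (up) and move.
        final boolean registered = action == MotionEvent.ACTION_DOWN
                || action == MotionEvent.ACTION_UP
                || action == MotionEvent.ACTION_MOVE;
        if (!registered) {
            return false;
        }
        final int x = (int) event.getX();
        final int y = (int) event.getY();
        // Only the region covered by the title layer but not by the content layer
        // (the non-overlapping target area) belongs to the title bar.
        if (titleLayerBounds.contains(x, y) && !contentLayerBounds.contains(x, y)) {
            return handler.onTitleBarEvent(event);  // title-bar business for this event
        }
        return false;
    }
}
```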
  • the second layer includes a decoration view
  • the decoration view is a parent container of a content view
  • the content view is a carrier of application content of the target application.
  • In the embodiments of the present application, the second layer includes a decoration view, the decoration view is the parent container of the content view, and the content view is the carrier of the application content of the target application.
  • This avoids the situation in the related art where the title view and the content view of the application are two views at the same level in the same layer; instead, the first layer of the title bar and the second layer containing the content view of the application content belong to two different layers, that is, the application content and the title bar are placed on different layers, so that spatial isolation is achieved and compatibility issues related to the title bar are solved.
  • embodiments of the present application provide an application display device for use in an electronic device.
  • the device includes:
  • a receiving unit configured to receive a first trigger instruction, the first trigger instruction being used to trigger the target application to display in a floating window mode
  • a creation unit configured to create a first layer of the title bar of the target application.
  • the first layer and the second layer of the application content of the target application are two different layers.
  • the first layer is a subordinate layer of the second layer, and the size of the first layer is larger than the size of the second layer;
  • a display unit is configured to display a floating window of the target application, where the floating window includes the first layer and the second layer.
  • the device further includes:
  • a binding unit, configured to bind the first layer and the second layer, where after binding, the target area of the floating window displays a target control, the target area is an area where the first layer does not overlap with the second layer, and the target control includes at least one of a minimize button, a maximize button, and a close button.
  • the device further includes:
  • the establishment unit is also configured to establish a base layer corresponding to the basic activity of the target application and the application content of the basic activity upon receiving the start instruction corresponding to the target application; the base layer is the second layer.
  • the binding unit is also used to bind the first layer to the base layer by re-parenting.
  • the device further includes:
  • the establishment unit is further configured to establish an upper layer corresponding to the application content of the upper-layer activity when it is detected that the upper-layer activity of the target application is started, where the upper-layer activity is an activity at a level other than the basic activity;
  • the display unit is further configured to keep the first layer displaying the title bar unchanged, hide the base layer, and display the upper layer.
  • the device further includes:
  • the binding unit is also used to unbind the first layer from the base layer, and bind the first layer to the upper layer.
  • the device further includes:
  • a hiding display unit, configured to hide the display of the first layer and each layer of the application content when a minimization instruction for minimizing the floating window is received.
  • the minimize instruction includes an instruction triggered by an operation acting on the minimize button of the first layer.
  • the device further includes:
  • a display canceling unit, configured to, when it is detected that the target application is closed, cancel the display of the first layer and each layer of the application content and/or destroy the first layer and each layer of the application content.
  • the application content of the target application is the application content of at least two windows simultaneously displayed on the screen by the target application, and the size of the second layer is the overall window size of the at least two windows.
  • the device further includes:
  • a registration unit, configured to register a mouse listening event for the first layer, where the mouse listening event includes at least one of a press (down) event, a release (up) event, and a move event;
  • a dispatching unit configured to dispatch the specified mouse event to the first layer through a window management service when a specified mouse event is detected, where the specified mouse event includes at least one of the mouse listening events,
  • the first layer is used to process the title bar business corresponding to the specified mouse event.
  • the second layer includes a decoration view
  • the decoration view is a parent container of a content view
  • the content view is a carrier of application content of the target application.
  • embodiments of the present application provide an application display device, which includes: a processor; and a memory for storing instructions executable by the processor; wherein, when the processor executes the instructions, the display device implements the above method.
  • embodiments of the present application provide a non-volatile computer-readable storage medium on which computer program instructions are stored, characterized in that when the computer program instructions are executed by a processor, the above method is implemented.
  • embodiments of the present application provide a computer program product, which is characterized in that when the computer program product is run on a computer, the computer executes the above method.
  • embodiments of the present application provide an electronic device, where the electronic device includes: a processor; and a memory for storing instructions executable by the processor; wherein, when the processor executes the instructions, the electronic device implements the above method.
  • Figure 1 shows a schematic diagram of a scene in which the title bar is obscured by application content in the related art.
  • FIG. 2 shows a schematic diagram of the hierarchy of the title bar of the floating window in the view tree in the related art.
  • Figure 3 shows a schematic structural diagram of an electronic device provided by an exemplary embodiment of the present application.
  • Figure 4 shows a software structure block diagram of an electronic device provided by an exemplary embodiment of the present application.
  • FIG. 5 shows a flowchart of an application display method provided by an exemplary embodiment of the present application.
  • Figure 6 shows a schematic diagram of the hierarchical relationship in the view tree corresponding to the floating window provided by an exemplary embodiment of the present application.
  • Figure 7 shows a schematic diagram of the hierarchical relationship in the view tree corresponding to the floating window provided by another exemplary embodiment of the present application.
  • FIG. 8 shows a flowchart of an application display method provided by another exemplary embodiment of the present application.
  • FIG. 9 shows a schematic diagram of the principle of an application display method provided by an exemplary embodiment of the present application.
  • Figure 10 shows a schematic diagram of the binding method between the first layer and the second layer provided by an exemplary embodiment of the present application.
  • Figure 11 shows a schematic diagram of the binding method between the first layer and the second layer provided by another exemplary embodiment of the present application.
  • FIG. 12 shows a block diagram of an application display device provided by an exemplary embodiment of the present application.
  • "exemplary" as used herein means "serving as an example, instance, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
  • the present application provides an application display method.
  • the application display method of the embodiment of the present application can be applied to electronic devices.
  • By placing the application content and the title bar on different layers, the title bar of the floating window and the application content are decoupled and spatially isolated, which can solve the problem of mutual obstruction between the application content and the title bar and greatly improve the display effect of the application.
  • the application display method provided by the embodiment of the present application can be applied to a variety of usage scenarios.
  • the various usage scenarios include, but are not limited to: a scenario where an electronic device displays a floating window locally, or a screen-projection scenario where an electronic device projects its screen to another device and the other device's display shows the floating window.
  • the electronic devices involved in the embodiments of the present application may be touch-screen devices, non-touch-screen devices, or devices without a screen.
  • For touch-screen devices, the electronic device can be controlled by clicking or sliding on the display screen with a finger, a stylus, or the like.
  • Non-touch screen devices can be connected to input devices such as mice, keyboards, and touch panels, and electronic devices can be controlled through input devices.
  • Devices without screens can be, for example, Bluetooth speakers without screens.
  • the electronic devices in the embodiments of the present application may be smartphones, netbooks, tablets, laptops, wearable electronic devices (such as smart bracelets, smart watches, etc.), TVs, virtual reality devices, speakers, electronic ink screens, and the like.
  • FIG. 3 shows a schematic structural diagram of an electronic device provided by an exemplary embodiment of the present application. Taking the electronic device as a mobile phone as an example, FIG. 3 shows a schematic structural diagram of the mobile phone 200 .
  • the mobile phone 200 may include a processor 210, an external memory interface 220, an internal memory 221, a Universal Serial Bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, Mobile communication module 251, wireless communication module 252, audio module 270, speaker 270A, receiver 270B, microphone 270C, headphone interface 270D, sensor module 280, button 290, motor 291, indicator 292, camera 293, display screen 294, and SIM Card interface 295, etc.
  • the sensor module 280 may include a gyroscope sensor 280A, an acceleration sensor 280B, a proximity light sensor 280G, a fingerprint sensor 280H, and a touch sensor 280K (of course, the mobile phone 200 may also include other sensors, such as a temperature sensor, a pressure sensor, a distance sensor, a magnetic sensor, an ambient light sensor, an air pressure sensor, a bone conduction sensor, etc., not shown in the figure).
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the mobile phone 200 .
  • the mobile phone 200 may include more or fewer components than shown in the figures, or some components may be combined, or some components may be separated, or may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 210 may include one or more processing units.
  • the processor 210 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (Neural-network Processing Unit, NPU), etc.
  • controller may be the nerve center and command center of the mobile phone 200 .
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 210 may also be provided with a memory for storing instructions and data.
  • the memory in processor 210 is cache memory. This memory may hold instructions or data that have been recently used or recycled by processor 210 . If the processor 210 needs to use the instructions or data again, it can be called directly from the memory. Repeated access is avoided and the waiting time of the processor 210 is reduced, thus improving the efficiency of the system.
  • the processor 210 can run the application display method provided by the embodiment of the present application, so as to solve the problem of mutual occlusion of the application content and the title bar, and greatly improve the display effect of the application.
  • the processor 210 may include different devices. For example, when integrating a CPU and a GPU, the CPU and the GPU may cooperate to execute the application display method provided by the embodiments of the present application. For example, part of the algorithm in the application display method is executed by the CPU, and another part of the algorithm is executed by the GPU. Execute to get faster processing efficiency.
  • the display screen 294 is used to display images, videos, etc.
  • Display 294 includes a display panel.
  • the display panel may be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diode, QLED), or the like.
  • the mobile phone 200 may include 1 or N display screens 294, where N is a positive integer greater than 1.
  • the display screen 294 may be used to display information input by or provided to the user as well as various graphical user interfaces (GUI).
  • display 294 may display photos, videos, web pages, or files, etc.
  • display 294 may display a graphical user interface.
  • the graphical user interface includes a status bar, a hideable navigation bar, time and weather widgets (widgets), and application icons, such as browser icons.
  • the status bar includes the operator name (such as China Mobile), mobile network (such as 4G), time and remaining power.
  • the navigation bar includes the back key icon, home key icon and forward key icon.
  • the status bar may also include Bluetooth icons, Wi-Fi icons, external device icons, etc.
  • the graphical user interface may also include a Dock bar, and the Dock bar may include commonly used application icons, etc.
  • the processor 210 detects a touch event of the user's finger (or stylus, etc.) on an application icon, in response to the touch event, the user interface of the application corresponding to the application icon is opened and displayed on the display 294 The user interface of the application.
  • the display screen 294 can be an integrated flexible display screen, or a spliced display screen composed of two rigid screens and a flexible screen located between the two rigid screens.
  • the electronic device can control the display screen 294 to display the corresponding graphical user interface according to the application display method provided by the embodiment of the present application.
  • a camera 293 (either a front-facing camera or a rear-facing camera, or one camera can serve as both a front-facing and rear-facing camera) is used to capture still images or video.
  • the camera 293 may include photosensitive elements such as a lens group and an image sensor.
  • the lens group includes a plurality of lenses (convex lenses or concave lenses) for collecting light signals reflected by the object to be photographed and transmitting the collected light signals to the image sensor.
  • the image sensor generates an original image of the object to be photographed based on the light signal.
  • Internal memory 221 may be used to store computer executable program code, which includes instructions.
  • the processor 210 executes instructions stored in the internal memory 221 to execute various functional applications and data processing of the mobile phone 200 .
  • the internal memory 221 may include a program storage area and a data storage area.
  • the stored program area can store operating system, application program (such as camera application, WeChat application, etc.) codes, etc.
  • the storage data area can store data created during the use of the mobile phone 200 (such as images and videos collected by the camera application), etc.
  • the internal memory 221 may also store one or more computer programs corresponding to the application display method provided by the embodiment of the present application.
  • the one or more computer programs are stored in the above-mentioned memory 221 and configured to be executed by the one or more processors 210.
  • the one or more computer programs include instructions, and the above instructions can be used to execute the embodiments of the present application.
  • the computer program may include a receiving unit 51 , a creation unit 52 and a display unit 53 .
  • the receiving unit 51 is used to receive a first trigger instruction
  • the first trigger instruction is used to trigger the target application to use the floating window mode. display.
  • the creation unit 52 is used to create the first layer of the title bar of the target application.
  • the first layer and the second layer of the application content of the target application are two different layers.
  • the first layer is a subordinate layer of the second layer, and the size of the first layer is larger than the size of the second layer.
  • the display unit 53 is configured to display a floating window of the target application, where the floating window includes a first layer and a second layer.
  • the internal memory 221 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
  • the code of the application display method provided by the embodiment of the present application can also be stored in an external memory.
  • the processor 210 may execute the code of the display method of the application stored in the external memory through the external memory interface 220 .
  • the functions of the sensor module 280 are described below.
  • the gyro sensor 280A can be used to determine the movement posture of the mobile phone 200 .
  • the angular velocity of the mobile phone 200 about three axes (i.e., the x, y, and z axes) can be determined through the gyro sensor 280A.
  • the gyro sensor 280A can be used to detect the current motion state of the mobile phone 200, such as shaking or still.
  • the gyro sensor 280A can be used to detect the folding or unfolding operation on the display screen 294 .
  • the gyro sensor 280A may report the detected folding operation or unfolding operation as an event to the processor 210 to determine the folding or unfolding state of the display screen 294 .
  • the acceleration sensor 280B can detect the acceleration of the mobile phone 200 in various directions (generally three axes). Like the gyro sensor 280A, the acceleration sensor 280B can be used to detect the current motion state of the mobile phone 200, such as shaking or still. When the display screen in the embodiment of the present application is a foldable screen, the acceleration sensor 280B can be used to detect the folding or unfolding operation on the display screen 294. The acceleration sensor 280B may report the detected folding operation or unfolding operation as an event to the processor 210 to determine the folding or unfolding state of the display screen 294.
  • Proximity light sensor 280G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the mobile phone emits infrared light through light-emitting diodes.
  • Cell phones use photodiodes to detect reflected infrared light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the phone. When insufficient reflected light is detected, the phone can determine that there are no objects near the phone.
  • the proximity light sensor 280G can be disposed on the first screen of the foldable display screen 294, and the proximity light sensor 280G can detect the first screen according to the optical path difference of the infrared signal.
  • the gyro sensor 280A (or the acceleration sensor 280B) may send the detected motion state information (such as angular velocity) to the processor 210 .
  • the processor 210 determines whether the current state is a handheld state or a tripod state based on the motion state information (for example, when the angular velocity is not 0, it means that the mobile phone 200 is in a handheld state).
  • Fingerprint sensor 280H is used to collect fingerprints.
  • the mobile phone 200 can use the collected fingerprint characteristics to achieve fingerprint unlocking, access to application locks, fingerprint photography, fingerprint answering of incoming calls, etc.
  • Touch sensor 280K also called “touch panel”.
  • the touch sensor 280K can be disposed on the display screen 294.
  • the touch sensor 280K and the display screen 294 form a touch screen, which is also called a "touch screen”.
  • The touch sensor 280K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 294.
  • the touch sensor 280K may also be disposed on the surface of the mobile phone 200 in a position different from that of the display screen 294 .
  • the display screen 294 of the mobile phone 200 displays a main interface, which includes icons of multiple applications (such as a camera application, a WeChat application, etc.).
  • the user clicks the icon of the camera application in the main interface through the touch sensor 280K, triggering the processor 210 to start the camera application and opening the camera 293.
  • the display screen 294 displays the interface of the camera application, such as the viewfinder interface.
  • the wireless communication function of the mobile phone 200 can be realized through the antenna 1, the antenna 2, the mobile communication module 251, the wireless communication module 252, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in handset 200 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be reused as a diversity antenna for a wireless LAN. In other embodiments, antennas may be used in conjunction with tuning switches.
  • the mobile communication module 251 can provide wireless communication solutions including 2G/3G/4G/5G applied on the mobile phone 200 .
  • the mobile communication module 251 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 251 can receive electromagnetic waves from the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 251 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 251 may be disposed in the processor 210 .
  • at least part of the functional modules of the mobile communication module 251 and at least part of the modules of the processor 210 may be provided in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs sound signals through audio devices (not limited to speaker 270A, receiver 270B, etc.), or displays images or videos through display screen 294.
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 210 and may be provided in the same device as the mobile communication module 251 or other functional modules.
  • the wireless communication module 252 can provide applications on the mobile phone 200 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (bluetooth, BT), and global navigation satellite system. (global navigation satellite system, GNSS), frequency modulation (FM), near field communication technology (near field communication, NFC), infrared technology (infrared, IR) and other wireless communication solutions.
  • the wireless communication module 252 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 252 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 210 .
  • the wireless communication module 252 can also receive the signal to be sent from the processor 210, frequency modulate and amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation. The wireless communication module 252 is used to transmit data with other electronic devices under the control of the processor 210. For example, the processor 210 can control the wireless communication module 252 to send data to other electronic devices, and can also receive data sent by other electronic devices.
  • the mobile phone 200 can implement audio functions through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the headphone interface 270D, and the application processor. Such as music playback, recording, etc.
  • the mobile phone 200 can receive key 290 input and generate key signal input related to user settings and function control of the mobile phone 200 .
  • the mobile phone 200 can use the motor 291 to generate vibration prompts (such as vibration prompts for incoming calls).
  • the indicator 292 in the mobile phone 200 may be an indicator light, which may be used to indicate charging status, power changes, or may be used to indicate messages, missed calls, notifications, etc.
  • the SIM card interface 295 in the mobile phone 200 is used to connect the SIM card.
  • the SIM card can be connected to and separated from the mobile phone 200 by inserting it into the SIM card interface 295 or pulling it out from the SIM card interface 295 .
  • the mobile phone 200 may include more or fewer components than shown in FIG. 3 , which is not limited by the embodiments of this application.
  • the illustrated handset 200 is only an example, and the handset 200 may have more or fewer components than shown, may combine two or more components, or may have a different component configuration.
  • the various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software including one or more signal processing and/or application specific integrated circuits.
  • Software systems of electronic devices can adopt layered architecture, event-driven architecture, microkernel architecture, microservice architecture, or cloud architecture.
  • the embodiment of this application takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device.
  • Figure 4 shows a software structure block diagram of an electronic device provided by an exemplary embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the Android system is divided into four layers, from top to bottom: application layer, application framework layer, Android runtime and system libraries, and kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include phone, camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and other applications.
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make this data accessible to applications.
  • Said data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, etc.
  • a view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • Telephone managers are used to provide communication functions of electronic devices. For example, call status management (including connected, hung up, etc.).
  • the resource manager provides various resources to applications, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also be notifications that appear in the status bar at the top of the system in the form of charts or scroll bar text, such as notifications for applications running in the background, or notifications that appear on the screen in the form of conversation windows. For example, text information is prompted in the status bar, a beep sounds, the electronic device vibrates, the indicator light flashes, etc.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library contains two parts: one is the functional functions that need to be called by the Java language, and the other is the core library of Android.
  • the application layer and application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and application framework layer into binary files.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules. For example: surface manager (surface manager), media libraries (Media Libraries), 3D graphics processing libraries (for example: OpenGL ES), 2D graphics engines (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • FIG. 5 shows a flow chart of an application display method provided by an exemplary embodiment of the present application. This method can be used in the electronic device shown in FIG. 3 or FIG. 4 . The method includes the following steps.
  • Step 501 Receive a first trigger instruction, which is used to trigger the target application to display in the floating window mode.
  • the electronic device displays a first user interface of the target application, and a floating window control is displayed on the first user interface; the electronic device receives a first triggering instruction, and the first triggering instruction is a triggering instruction acting on the floating window control.
  • the target application is any application running in the electronic device.
  • the target applications are social applications, audio applications, video applications, etc.
  • the first user interface is a native page of the target application, and the first user interface includes the main interface of the target application or a lower-level interface of the main interface of the target application.
  • the floating window control is an operable control used to trigger the target application to display in floating window mode.
  • the type of floating window control includes at least one of buttons, controllable items, and sliders.
  • the first trigger instruction is a user operation signal for triggering the target application to be displayed in the floating window mode.
  • the first triggering instruction includes any one or a combination of click operation signals, sliding operation signals, press operation signals, and long press operation signals.
  • the first triggering instruction can also be implemented in voice form. The embodiments of the present application are not limited to this.
  • the floating window mode described in the embodiments of this application may be a mode in which the application interface of an application is suspended and displayed on the current screen in the form of a window, or a mode in which the application window is not displayed in full screen and the title bar of the application window needs to be displayed.
  • when the application interface of an application is suspended and displayed on the screen in the form of a window, the window is called a floating window in the embodiments of this application.
  • the floating window can be moved and zoomed in and out freely.
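  • On the application side, the first trigger instruction could simply be a click on a floating-window control; the sketch below is a minimal illustration in which the layout, the control id and the enterFloatingWindowMode() helper are all hypothetical.

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.View;

/** Hypothetical first user interface of the target application with a floating-window control. */
public class TargetActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_target);                            // assumed layout
        View floatingWindowButton = findViewById(R.id.btn_floating_window);  // assumed control id
        floatingWindowButton.setOnClickListener(v -> {
            // The click is the "first trigger instruction": request floating-window display.
            enterFloatingWindowMode();                                       // hypothetical helper
        });
    }

    private void enterFloatingWindowMode() {
        // In a real device this request would be forwarded to the system side (e.g. via WMS),
        // which then creates the title-bar layer and shows the floating window.
    }
}
```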
  • Step 502 Create a first layer of the title bar of the target application.
  • the first layer and the second layer of the application content of the target application are two different layers, the first layer is a subordinate layer of the second layer, and the size of the first layer is larger than the size of the second layer.
  • the electronic device upon receiving the first triggering instruction, establishes the first layer of the title bar of the target application.
  • the electronic device when detecting that an activity of the target application is started, the electronic device establishes a second layer of application content of the activity.
  • the second layer can be a layer corresponding to the application content of the base activity of the target application, or it can be a layer corresponding to the application content of the upper-layer activity of the target application.
  • the upper-layer activity is an activity at other levels other than the basic activity.
  • the step of establishing the first layer and the step of establishing the second layer may be executed sequentially or simultaneously; the embodiments of the present application do not limit the execution order of the two steps.
  • the first layer is the layer corresponding to the title bar, that is, the first layer is the carrier of the title bar, and the first layer does not include elements of the application content.
  • the first layer includes a target control of the title bar, and the target control includes at least one of a minimize button, a maximize button, and a close button.
  • the second layer is the layer corresponding to the application content, that is, the second layer is the carrier of the application content, and the second layer does not include the title bar element.
  • the second layer includes a decoration view.
  • the decoration view is the parent container of the content view, and the content view is the carrier of the application content of the target application.
  • the first layer and the second layer are two different layers, the first layer is a subordinate layer of the second layer, and the size of the first layer is larger than the size of the second layer.
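  • A hedged sketch of establishing such a first layer with the public SurfaceControl.Builder API is shown below; the layer name, the margin and the idea of deriving the title-layer size from the content size are assumptions for illustration.

```java
import android.view.SurfaceControl;

final class TitleLayerFactory {
    /**
     * Sketch: build the first layer (title bar) so that it is larger than the
     * second layer (application content); the extra margin is an assumption.
     */
    static SurfaceControl createTitleLayer(int contentWidthPx, int contentHeightPx) {
        final int marginPx = 16;  // assumed margin that will hold the border, shadow and buttons
        return new SurfaceControl.Builder()
                .setName("FloatingWindowTitleBar")                // illustrative name
                .setBufferSize(contentWidthPx + 2 * marginPx,
                               contentHeightPx + 2 * marginPx)
                .build();
    }
}
```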
  • Step 503 Display the floating window of the target application.
  • the floating window includes a first layer and a second layer.
  • the electronic device displays the floating window of the target application based on the first layer and the second layer, that is, the electronic device displays the first layer and displays the second layer superimposed on the first layer, and the second layer is the superior layer of the first layer.
  • FIG. 6 shows a schematic diagram of the hierarchical relationship in the view tree corresponding to the floating window provided by an exemplary embodiment of the present application.
  • the floating window of the target application includes a first layer 61 of the title bar and a second layer 62 of the application window.
  • the first layer 61 is the carrier of the title bar and includes clickable buttons (such as a maximize button and a close button).
  • the second layer 62 is the superior layer of the first layer 61, and the second layer 62 includes a decoration view 622; the decoration view 622 is the parent container of the content view 623, and the content view 623 is the carrier of the application content of the target application.
  • the application display method provided by the embodiment of the present application can also be applied to a multi-window display scenario.
  • the multi-window display scenario is a scenario in which the target application simultaneously displays at least two windows on the screen.
  • the multi-window display scene can also be a multi-window screen projection scene.
  • the application content of the above-mentioned target application is the application content of at least two windows simultaneously displayed on the screen by the target application, and the size of the second layer is the overall window size of the at least two windows.
  • the floating window of the target application includes a first layer 71 of the title bar, a second layer 72 of window 1 of the target application, and a second layer 73 of window 2 of the target application.
  • the first layer 71 is the carrier of the title bar and includes clickable buttons (such as a maximize button and a close button); the second layer 72 and the second layer 73 are both superior layers of the first layer 71.
  • the second layer 73 includes a decoration view 732, which is the parent container of the content view 733, and the content view 733 is the carrier of the application content of window 2.
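  • For the multi-window case, the overall window size can be understood as the bounding box of the window rectangles; the sketch below illustrates that computation with android.graphics.Rect, with the extra title-layer margin as an assumption.

```java
import android.graphics.Rect;

final class MultiWindowBounds {
    /** Sketch: the second layer covers the union of the two window bounds. */
    static Rect overallWindowBounds(Rect window1, Rect window2) {
        Rect overall = new Rect(window1);
        overall.union(window2);             // bounding box of both windows
        return overall;
    }

    /** The first layer (title bar) is made slightly larger than that overall size. */
    static Rect titleLayerBounds(Rect overall, int marginPx) {
        Rect title = new Rect(overall);
        title.inset(-marginPx, -marginPx);  // grow outward on every side
        return title;
    }
}
```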
  • In the embodiments of the present application, by receiving the first trigger instruction for triggering the target application to display in the floating window mode, the first layer of the title bar of the target application is established; the first layer and the second layer of the application content of the target application are two different layers, the first layer is a subordinate layer of the second layer, and the size of the first layer is larger than the size of the second layer; the floating window of the target application is then displayed, where the floating window includes the first layer and the second layer displayed superimposed on the first layer.
  • Because the first layer of the title bar and the second layer of the application content are two different layers, that is, the application content and the title bar are placed on different layers, the title bar of the floating window and the application content are decoupled and spatially isolated, which can solve the problem of mutual blocking of the application content and the title bar and greatly improve the display effect of the floating window.
  • Because the title view is located in the application process space and occupies the top position of the application, for some applications with poor compatibility there may be a problem that the application content and the title bar block each other.
  • the application may consume the click events in this area, causing the title bar to be unable to be clicked normally.
  • Because the title view is a top-level view, the background colors of many lower-level views beneath it are blocked, making it impossible to realize a title bar that is transparent to the desktop.
  • the border effect at the left, right, and bottom edges of the content view and the title view cannot be achieved well.
  • In a multi-window display scenario, the two windows correspond to different activities and have two different title views.
  • the controls in the windows' title bars achieve only a normalization-like effect, but do not actually achieve normalization of the title bar.
  • the window projection (shadow) of one window will cover the application content of another window.
  • the embodiment of the present application provides a layer-based solution to realize the function of the floating window title bar.
  • the title bar of the floating window and the application content are decoupled; the title bar and the application content are isolated from each other in terms of view hierarchy, spatial position, and mouse events, and do not affect each other, which improves the display effect of the device. In addition, for applications that support multi-window display scenarios, when the application is displayed in a single window or expanded to dual windows, the functions of the title bar and the window projection (shadow) can be normalized.
  • FIG. 8 shows a flow chart of an application display method provided by another exemplary embodiment of the present application. This method can be used in the electronic device as shown in FIG. 3 or FIG. 4 . The method includes the following steps.
  • Step 801 Create the first layer of the title bar of the target application and bind it to the second layer of the application content of the target application.
  • the electronic device after receiving the first triggering instruction for triggering the target application to display in the floating window mode, establishes the first layer of the title bar of the target application.
  • the first layer and the second layer of the application content of the target application are two different layers.
  • the first layer is a subordinate layer of the second layer.
  • the size of the first layer is larger than the size of the second layer.
  • the second layer of the application content may be a base layer corresponding to the application content of the basic activity.
  • after receiving the start instruction corresponding to the target application, the electronic device establishes the basic activity of the target application and the base layer corresponding to the application content of the basic activity; after receiving the first trigger instruction for triggering the target application to display in the floating window mode, the electronic device establishes the first layer of the title bar of the target application; the first layer is bound to the base layer by reparenting.
  • the title bar of the target application has global uniqueness.
  • one target application establishes one first layer of the title bar, that is, there is a one-to-one correspondence between the first layer of the title bar and the target application. After the first layer is bound to the base layer, the life cycle of the first layer of the title bar is consistent with the life cycle of the base layer of the basic activity.
  • Upon receiving a start instruction corresponding to the target application, the electronic device establishes a basic activity.
  • After receiving the first trigger instruction for triggering the target application to be displayed in the floating-window mode, the electronic device establishes the first layer of the title bar in the application window token (English: AppWindowToken) of the basic activity, and binds the first layer to the layer control (English: SurfaceControl) of the application window token.
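  • As a minimal sketch of this binding, assuming Android's public SurfaceControl API (API 29 and later), the snippet below creates an oversized title-bar layer and reparents it under a content layer. The class and method names, the handle contentLayer, the buffer size, and the negative z value used to keep the layer subordinate are illustrative assumptions, not the patent's own implementation, which runs inside the window manager.

```java
import android.view.SurfaceControl;

final class TitleBarLayerFactory {
    /** Creates the title-bar layer (first layer) and binds it under the content layer by reparenting. */
    static SurfaceControl createAndBindTitleBarLayer(SurfaceControl contentLayer) {
        SurfaceControl titleBarLayer = new SurfaceControl.Builder()
                .setName("FloatingWindowTitleBar")   // first layer (title bar)
                .setBufferSize(2373, 1132)           // slightly larger than the 2341x1124 content layer
                .build();
        new SurfaceControl.Transaction()
                .reparent(titleBarLayer, contentLayer)   // bind the first layer by reparenting
                .setLayer(titleBarLayer, -1)             // assumed: negative z keeps it subordinate to (below) the content
                .setVisibility(titleBarLayer, true)
                .apply();
        return titleBarLayer;
    }
}
```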
  • When detecting that an upper-layer activity of the target application is opened, the electronic device establishes an upper layer corresponding to the application content of the upper-layer activity.
  • the upper-layer activity is an activity at a level other than the basic activity; the title bar remains displayed.
  • the first layer remains unchanged, the base layer is hidden and the upper layer is shown.
  • After the electronic device establishes the upper layer corresponding to the application content of the upper-layer activity, it unbinds the first layer from the base layer and binds the first layer to the upper layer.
  • Alternatively, the first layer of the title bar does not need to be rebound. That is, during the running of the target application, the binding relationship between the first layer of the title bar and the base layer of the target application remains unchanged, and the upper layer of the target application is not bound to the first layer.
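  • The snippet below is a hedged sketch of the first of these two variants (rebinding the title-bar layer when an upper-layer activity opens), reusing the illustrative handles from the previous sketch; a single reparent call detaches the first layer from the base layer and attaches it to the upper layer.

```java
import android.view.SurfaceControl;

final class TitleBarRebinder {
    /** Hides the base layer, shows the upper layer, and rebinds the title-bar (first) layer. */
    static void switchToUpperActivity(SurfaceControl titleBarLayer,
                                      SurfaceControl baseLayer,
                                      SurfaceControl upperLayer) {
        new SurfaceControl.Transaction()
                .setVisibility(baseLayer, false)       // hide the base activity's content
                .setVisibility(upperLayer, true)       // show the upper activity's content
                .reparent(titleBarLayer, upperLayer)   // unbinds from the base layer and binds to the upper layer
                .setLayer(titleBarLayer, -1)           // assumed: stay below the content so the title bar keeps its margin position
                .apply();
    }
}
```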
  • the electronic device establishes a base layer 91 of the basic activity. After receiving the first trigger instruction, the electronic device creates the first layer 92 of the title bar of the target application.
  • the first layer 92 is a subordinate layer of the base layer 91, and the size of the first layer 92 is larger than that of the base layer 91.
  • the electronic device draws the target element of the title bar on the first layer.
  • the electronic device obtains the canvas (for example, a Canvas object) of the first layer and draws the target element on the first layer through the drawing methods of the canvas.
  • the target element includes at least the target control of the title bar, the window border and the projection element.
  • the target control of the title bar includes at least one of a minimize button, a maximize button, and a close button.
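  • The snippet below sketches such drawing under the assumption that the first layer is exposed as a SurfaceControl that can be wrapped in a Surface (a public constructor since Android API 30); the colors, coordinates, and the circle standing in for the close button are illustrative only.

```java
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.view.Surface;
import android.view.SurfaceControl;

final class TitleBarPainter {
    /** Paints an illustrative title-bar background/border and a close-button glyph on the first layer. */
    static void drawTitleBarElements(SurfaceControl titleBarLayer) {
        Surface surface = new Surface(titleBarLayer);   // wraps the first layer (public since API 30)
        Canvas canvas = surface.lockCanvas(null);       // obtain the layer's canvas
        try {
            Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
            paint.setColor(Color.DKGRAY);
            canvas.drawRect(0, 0, canvas.getWidth(), canvas.getHeight(), paint);  // window border / title background
            paint.setColor(Color.WHITE);
            canvas.drawCircle(canvas.getWidth() - 40f, 20f, 12f, paint);          // stand-in for the close button
        } finally {
            surface.unlockCanvasAndPost(canvas);
        }
        surface.release();
    }
}
```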
  • The step of drawing the target element of the title bar on the first layer and the step of binding the first layer to the second layer can be performed in sequence (for example, drawing first and then binding, or binding first and then drawing), or at the same time; this is not limited in this application.
  • After receiving the start instruction corresponding to the target application, the electronic device establishes a basic activity and a second layer of the corresponding application content.
  • the size of the second layer is: 2341dp*1124dp.
  • After receiving the first trigger instruction for triggering the target application to be displayed in the floating-window mode, the electronic device establishes the first layer of the title bar of the target application.
  • the first layer is a subordinate layer of the second layer.
  • the size of the first layer is: 2373dp*1132dp, that is, the size of the first layer is larger than the size of the second layer.
  • the electronic device binds the first layer to the base layer by reparenting, and draws the target control of the title bar on the first layer (in the figure, only a close button is illustrated as an example of the target control).
  • In a multi-window display scenario (taking the electronic device displaying two windows at the same time as an example), the electronic device establishes the second layer of window 1 and the second layer of window 2 of the target application.
  • the overall window size of these two windows is: 2341dp*1124dp.
  • After receiving the first trigger instruction for triggering the target application to be displayed in the floating-window mode, the electronic device establishes the first layer of the title bar of the target application, and the first layer is a subordinate layer of the two second layers.
  • the size of the first layer is: 2373dp*1132dp, that is, the size of the first layer is larger than the overall window size of the two windows.
  • the electronic device binds the first layer to the two second layers by reparenting, and draws the target control of the title bar on the first layer (in the figure, only a close button is illustrated as an example of the target control).
  • Step 802: Register a mouse listening event for the first layer.
  • the mouse listening event includes at least one of a press event, a pop-up event, and a move event.
  • this embodiment of the present application maintains a set of mouse listening events for the first layer of the title bar.
  • the electronic device registers a mouse listening event for the first layer, and the mouse listening event includes at least one of a press event, a pop-up event, and a move event.
  • the press event is an event that performs a pressing operation on the target control
  • the pop-up event is an event that performs a pop-up operation on the target control
  • the move event is an event that performs a moving operation on the target control.
  • the target control is the control of the title bar displayed on the first layer, and the target control includes at least one of a minimize button, a maximize button, and a close button.
  • The step of drawing the target element of the title bar on the first layer and the step of registering the mouse listening event for the first layer can be executed in sequence (for example, drawing first and then registering, or registering first and then drawing), or at the same time; this is not limited in this application.
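  • The patent does not spell out a registration API, so the snippet below is a hypothetical per-layer registry for these mouse listening events; TitleBarMouseListener and TitleBarEventRegistry are invented names, keyed on the first layer's SurfaceControl.

```java
import android.view.MotionEvent;
import android.view.SurfaceControl;
import java.util.HashMap;
import java.util.Map;

/** Hypothetical callback for the press / pop-up (release) / move events of step 802. */
interface TitleBarMouseListener {
    void onMouseEvent(MotionEvent event);
}

/** Hypothetical registry that keys a mouse listener on the first layer's SurfaceControl. */
final class TitleBarEventRegistry {
    private final Map<SurfaceControl, TitleBarMouseListener> listeners = new HashMap<>();

    void register(SurfaceControl firstLayer, TitleBarMouseListener listener) {
        listeners.put(firstLayer, listener);
    }

    TitleBarMouseListener listenerFor(SurfaceControl firstLayer) {
        return listeners.get(firstLayer);   // used by the dispatch path of step 804
    }
}
```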
  • Step 803: Display the floating window of the target application.
  • the floating window includes a first layer and a second layer.
  • the electronic device displays the floating window of the target application based on the first layer and the second layer; that is, the electronic device displays the first layer, the second layer is displayed superimposed on the first layer, and the second layer is the superior layer of the first layer.
  • the target area of the floating window displays a target control, and the target control includes at least one of a minimize button, a maximize button, and a close button.
  • Step 804: When a specified mouse event is detected, dispatch the specified mouse event to the first layer through the window management service.
  • the specified mouse event includes at least one of the mouse listening events.
  • the first layer is used to process the title bar service corresponding to the specified mouse event.
  • After the electronic device displays the floating window of the target application, it monitors the mouse listening events in real time.
  • an event-driven mechanism is used to dispatch the specified mouse event to the first layer through the window management service, and the specified mouse event includes at least one of the mouse listening events.
  • the electronic device displays a floating window of the target application, and a minimize button, a maximize button and a close button are displayed in the title bar of the floating window.
  • a press event such as Action_Down
  • a pop-up event such as Action_Up
  • the electronic device displays a floating window of the target application, and a minimize button, a maximize button and a close button are displayed in the title bar of the floating window.
  • a press event such as Action_Down
  • a move event such as ActionMove
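  • One possible way the first layer could handle the dispatched events is sketched below using the standard MotionEvent action constants; the class name, the close-button bounds, and the commented-out helpers for moving and closing the floating window are hypothetical, and the listener method matches the registry sketched after step 802.

```java
import android.graphics.Rect;
import android.view.MotionEvent;

/** Hypothetical handler: press + release over the close button closes the window, press + move drags it. */
final class TitleBarController {
    private final Rect closeButtonBounds = new Rect(2320, 0, 2373, 40);  // illustrative coordinates
    private boolean pressedOnCloseButton;

    void onMouseEvent(MotionEvent event) {
        int x = (int) event.getX();
        int y = (int) event.getY();
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:                  // press event
                pressedOnCloseButton = closeButtonBounds.contains(x, y);
                break;
            case MotionEvent.ACTION_MOVE:                  // move event: drag the floating window
                if (!pressedOnCloseButton) {
                    // moveFloatingWindowTo(event.getRawX(), event.getRawY());  // hypothetical helper
                }
                break;
            case MotionEvent.ACTION_UP:                    // pop-up (release) event
                if (pressedOnCloseButton && closeButtonBounds.contains(x, y)) {
                    // closeFloatingWindow();               // hypothetical helper
                }
                pressedOnCloseButton = false;
                break;
        }
    }
}
```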
  • When the electronic device receives a minimize instruction for minimizing the floating window, it hides the first layer and each layer of the application content.
  • the minimize instruction includes an instruction triggered by an operation on the minimize button of the first layer.
  • the minimize instruction includes a click operation signal acting on the minimize button of the first layer.
  • When detecting that the target application is closed, the electronic device cancels the display of the first layer and each layer of the application content, and/or destroys the first layer and each layer of the application content. That is to say, when the electronic device detects that the target application is closed, it cancels the display of the first layer and each layer of the application content; or it cancels the display of the first layer and each layer of the application content and also destroys them; or it destroys the first layer and each layer of the application content.
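  • A hedged sketch of these minimize and close paths follows, again assuming the public SurfaceControl.Transaction API: hiding is expressed with setVisibility, cancelling the display with reparent(..., null), and destruction with release(); this is one plausible mapping, not necessarily the patent's own.

```java
import android.view.SurfaceControl;
import java.util.List;

final class FloatingWindowLifecycle {
    /** Minimize: hide the title-bar (first) layer and every application-content layer. */
    static void onMinimize(SurfaceControl titleBarLayer, List<SurfaceControl> contentLayers) {
        SurfaceControl.Transaction t = new SurfaceControl.Transaction()
                .setVisibility(titleBarLayer, false);
        for (SurfaceControl layer : contentLayers) {
            t.setVisibility(layer, false);
        }
        t.apply();
    }

    /** Close: cancel the display of the layers and release (destroy) their handles. */
    static void onClosed(SurfaceControl titleBarLayer, List<SurfaceControl> contentLayers) {
        SurfaceControl.Transaction t = new SurfaceControl.Transaction()
                .reparent(titleBarLayer, null);          // detach the first layer from the hierarchy
        for (SurfaceControl layer : contentLayers) {
            t.reparent(layer, null);                     // detach each content layer
        }
        t.apply();
        titleBarLayer.release();                         // destroy the first layer
        for (SurfaceControl layer : contentLayers) {
            layer.release();                             // destroy each content layer
        }
    }
}
```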
  • the embodiments of this application provide a new application display solution, which adjusts the native title bar framework to decouple the title bar from the application content, solves the compatibility problems of the floating-window scenario, and enables various user interface display effects.
  • the solution provided by the embodiments of the present application decouples the title bar from the application content, solving the compatibility problems in related technologies in which the title bar and the application content block each other, the title bar cannot respond to mouse events, the title bar is not unified in multi-window display scenarios, the window projection is abnormal, and the upper layout is hollowed out after entering immersive mode, and enabling various user interface display effects.
  • the title bar is decoupled from the application process space.
  • On the one hand, the application content and the title bar are placed on different layers to achieve spatial isolation, which solves the problems in related technologies of the application content and the title bar blocking each other and intercepting each other's click or touch events; on the other hand, the normalization of the title bar and the normal display of the window projection are realized in the multi-window display scenario; on the other hand, after the application enters full-screen mode from the floating window, because the title bar does not occupy any application space, compatibility after mode switching is better, which solves the problem in related technologies that some applications have a hollowed-out upper layout after mode switching; on the other hand, various user interface design solutions can be satisfied, facilitating semi-transparent or Gaussian-blurred display effects for the title bar and the window border.
  • FIG. 12 shows a block diagram of an application display device provided by an exemplary embodiment of the present application.
  • the device can be implemented as all or part of the electronic device through software, hardware, or a combination of both.
  • the device may include: a receiving unit 1210, an establishing unit 1220, and a display unit 1230.
  • the receiving unit 1210 is configured to receive a first trigger instruction, which is used to trigger the target application to display in the floating window mode;
  • the establishing unit 1220 is used to establish the first layer of the title bar of the target application.
  • the first layer and the second layer of the application content of the target application are two different layers.
  • the first layer is a subordinate layer of the second layer, and the size of the first layer is larger than the size of the second layer;
  • the display unit 1230 is configured to display a floating window of the target application, where the floating window includes a first layer and a second layer.
  • the device further includes:
  • the binding unit is used to bind the first layer to the second layer; after binding, the target area of the floating window displays the target control.
  • the target area is the area in the first layer that does not overlap with the second layer.
  • the target control includes at least one of a minimize button, a maximize button, and a close button.
  • the device further includes:
  • the establishing unit 1220 is also configured to establish, upon receiving a start instruction corresponding to the target application, the basic activity of the target application and the base layer corresponding to the application content of the basic activity; the base layer is the second layer.
  • the binding unit is also used to bind the first layer to the base layer by reparenting.
  • the device further includes:
  • the establishing unit 1220 is also used to establish an upper layer corresponding to the application content of an upper-layer activity when it is detected that the upper-layer activity of the target application is opened.
  • the upper-layer activity is an activity at a level other than the basic activity;
  • the display unit 1230 is also used to keep the first layer displaying the title bar unchanged, hide the base layer, and display the upper layer.
  • the device further includes:
  • the binding unit is also used to unbind the first layer from the base layer and bind the first layer to the upper layer.
  • the device further includes:
  • the hiding display unit is configured to hide the first layer and each layer of the application content when receiving a minimize instruction for minimizing the floating window.
  • the minimization instruction includes an instruction triggered by an operation acting on the minimize button of the first layer.
  • the device further includes:
  • the display canceling unit is configured to cancel the display of the first layer and each layer of the application content, and/or destroy the first layer and each layer of the application content when it is detected that the target application is closed.
  • the application content of the target application is the application content of at least two windows displayed simultaneously on the screen by the target application
  • the size of the second layer is the overall window size of the at least two windows.
  • the device further includes:
  • a registration unit used to register mouse listening events for the first layer, where the mouse listening events include at least one of a press event, a pop-up event, and a move event;
  • the dispatching unit is used to dispatch the specified mouse event to the first layer through the window management service when the specified mouse event is detected.
  • the specified mouse event includes at least one of the mouse listening events.
  • the first layer is used to process the title bar service corresponding to the specified mouse event.
  • the second layer includes a decoration view
  • the decoration view is a parent container of the content view
  • the content view is a carrier of application content of the target application.
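  • Purely as a structural illustration, the apparatus of FIG. 12 could be sketched in software as the skeleton below; the unit names follow the description, while the method signatures and types are assumptions.

```java
import android.view.SurfaceControl;

/** Structural sketch of the apparatus of FIG. 12; signatures are illustrative only. */
final class ApplicationDisplayApparatus {
    interface ReceivingUnit    { void onFirstTriggerInstruction(String targetApplication); }          // unit 1210
    interface EstablishingUnit { SurfaceControl establishTitleBarLayer(String targetApplication); }    // unit 1220
    interface DisplayUnit      { void displayFloatingWindow(SurfaceControl firstLayer,
                                                            SurfaceControl secondLayer); }             // unit 1230

    private final ReceivingUnit receivingUnit;
    private final EstablishingUnit establishingUnit;
    private final DisplayUnit displayUnit;

    ApplicationDisplayApparatus(ReceivingUnit receivingUnit,
                                EstablishingUnit establishingUnit,
                                DisplayUnit displayUnit) {
        this.receivingUnit = receivingUnit;
        this.establishingUnit = establishingUnit;
        this.displayUnit = displayUnit;
    }
}
```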
  • An embodiment of the present application provides an electronic device.
  • the electronic device includes: a processor; and a memory for storing instructions executable by the processor, wherein, when the processor executes the instructions, the electronic device implements the above method executed by the electronic device.
  • Embodiments of the present application provide a computer program product, which includes computer readable code, or a non-volatile computer readable storage medium carrying the computer readable code.
  • When the computer-readable code is run in a processor of an electronic device, the processor in the electronic device executes the above method executed by the electronic device.
  • Embodiments of the present application provide a non-volatile computer-readable storage medium on which computer program instructions are stored.
  • When the computer program instructions are executed by a processor, the above method executed by an electronic device is implemented.
  • Computer-readable storage media may be tangible devices that can retain and store instructions for use by an instruction execution device.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the above.
  • A non-exhaustive list of computer-readable storage media includes: a portable computer disk, a hard drive, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), static random-access memory (SRAM), portable compact disc read-only memory (CD-ROM), digital video disc (DVD), a memory stick, a floppy disk, a mechanical encoding device such as a punched card or a raised structure in a groove with instructions stored thereon, and any suitable combination of the above.
  • Computer-readable program instructions or code described herein may be downloaded from a computer-readable storage medium to various computing/processing devices, or to an external computer or external storage device over a network, such as the Internet, a local area network, a wide area network, and/or a wireless network.
  • the network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage on a computer-readable storage medium in the respective computing/processing device .
  • the computer program instructions used to perform the operations of this application may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the “C” language or similar programming languages.
  • the computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, via the Internet using an Internet service provider).
  • In some embodiments, electronic circuits, such as programmable logic circuits, field-programmable gate arrays (FPGAs), or programmable logic arrays (PLAs), are customized by utilizing state information of the computer-readable program instructions, and these electronic circuits can execute the computer-readable program instructions to implement various aspects of the present application.
  • These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, produce an apparatus that implements the functions/actions specified in one or more blocks in the flowchart and/or block diagram.
  • These computer-readable program instructions may also be stored in a computer-readable storage medium; these instructions cause the computer, programmable data processing apparatus, and/or other equipment to work in a specific manner, so that the computer-readable medium storing the instructions constitutes an article of manufacture that includes instructions implementing aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • Computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other equipment, causing a series of operating steps to be performed on the computer, other programmable data processing apparatus, or other equipment to produce a computer-implemented process, thereby causing the instructions executed on the computer, other programmable data processing apparatus, or other equipment to implement the functions/actions specified in one or more blocks in the flowcharts and/or block diagrams.
  • Each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions that contains one or more executable instructions for implementing the specified logical function(s). The functions noted in the blocks may also occur out of the order noted in the figures. For example, two consecutive blocks may actually be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functionality involved.
  • Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by hardware (such as circuits or application-specific integrated circuits (ASICs)) that performs the corresponding function or action, or can be implemented with a combination of hardware and software, such as firmware.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to the technical field of terminals and in particular to an application display method and apparatus, and a storage medium. The method is used in an electronic device. The method comprises: receiving a first trigger instruction, the first trigger instruction being used to trigger a target application to be displayed in floating-window mode; establishing a first layer of a title bar of the target application, the first layer and a second layer of application content of the target application being two different layers, the first layer being a layer subordinate to the second layer, and the size of the first layer being larger than the size of the second layer; and displaying a floating window of the target application, the floating window comprising the first layer and the second layer. In the embodiments of the present application, application content and a title bar are placed on different layers, so that the title bar and the application content of a floating window are decoupled, achieving spatial isolation. This can solve the problem of application content and a title bar blocking each other, which considerably improves the display effect of a floating window.
PCT/CN2023/087337 2022-04-19 2023-04-10 Procédé et appareil d'affichage d'application, et support de stockage WO2023202407A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210410815.1 2022-04-19
CN202210410815.1A CN116954409A (zh) 2022-04-19 2022-04-19 应用的显示方法、装置及存储介质

Publications (1)

Publication Number Publication Date
WO2023202407A1 true WO2023202407A1 (fr) 2023-10-26

Family

ID=88419082

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/087337 WO2023202407A1 (fr) 2022-04-19 2023-04-10 Procédé et appareil d'affichage d'application, et support de stockage

Country Status (2)

Country Link
CN (1) CN116954409A (fr)
WO (1) WO2023202407A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117492609B (zh) * 2023-12-29 2024-05-17 荣耀终端有限公司 一种显示方法、可读存储介质、程序产品及电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102156999A (zh) * 2010-02-11 2011-08-17 腾讯科技(深圳)有限公司 一种用户界面的生成方法和装置
US20130151999A1 (en) * 2011-12-09 2013-06-13 International Business Machines Corporation Providing Additional Information to a Visual Interface Element of a Graphical User Interface
CN107193542A (zh) * 2017-03-30 2017-09-22 腾讯科技(深圳)有限公司 信息显示方法和装置
CN111949358A (zh) * 2020-08-18 2020-11-17 北京字节跳动网络技术有限公司 动态显示的方法、装置、可读介质和电子设备
CN113110910A (zh) * 2021-04-20 2021-07-13 上海卓易科技股份有限公司 一种安卓容器实现的方法、系统及设备

Also Published As

Publication number Publication date
CN116954409A (zh) 2023-10-27

Similar Documents

Publication Publication Date Title
WO2021159922A1 (fr) Procédé d'affichage de carte, dispositif électronique et support de stockage lisible par ordinateur
US9952681B2 (en) Method and device for switching tasks using fingerprint information
CN111666055B (zh) 数据的传输方法及装置
US20220308753A1 (en) Split-Screen Method and Electronic Device
US11853526B2 (en) Window display method, window switching method, electronic device, and system
CN111597000B (zh) 一种小窗口管理方法及终端
WO2022105759A1 (fr) Procédé et appareil de traitement vidéo, et support de stockage
US20240295945A1 (en) Method, electronic device, and system for creating application shortcut
EP4280058A1 (fr) Procédé d'affichage d'informations et dispositif électronique
EP4217842A1 (fr) Gestion de capture de contenu d'écran
WO2022052662A1 (fr) Procédé d'affichage et dispositif électronique
WO2020006669A1 (fr) Procédé de commutation d'icônes, procédé d'affichage de gui, et dispositif électronique
WO2023202407A1 (fr) Procédé et appareil d'affichage d'application, et support de stockage
WO2020259669A1 (fr) Procédé d'affichage de vue et dispositif électronique
WO2022134691A1 (fr) Procédé et dispositif de traitement de crissement dans un dispositif terminal, et terminal
WO2022001279A1 (fr) Procédé de gestion de bureau inter-dispositifs, premier dispositif électronique et second dispositif électronique
WO2022194005A1 (fr) Procédé et système de commande pour un affichage synchrone sur des dispositifs
WO2022105716A1 (fr) Procédé de commande de caméra basé sur une commande distribuée et équipement terminal
WO2022121751A1 (fr) Procédé et appareil de commande de caméra, et support de stockage
WO2024217159A1 (fr) Procédé de réponse pour un dispositif électronique, dispositif électronique et support de stockage
WO2022105755A1 (fr) Procédé et appareil de synchronisation de bibliothèques de polices de caractères, et support de stockage
US20240129619A1 (en) Method and Apparatus for Performing Control Operation, Storage Medium, and Control
WO2023036082A1 (fr) Système et procédé d'affichage et de commande d'une tâche de dispositif distant
WO2024125301A1 (fr) Procédé d'affichage et dispositif électronique
WO2024067142A1 (fr) Procédé d'affichage et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23791062

Country of ref document: EP

Kind code of ref document: A1