WO2024149089A1 - Display method, display device and electronic device - Google Patents

Display method, display device and electronic device (显示方法、显示装置和电子设备)

Publication number: WO2024149089A1
Authority: WIPO (PCT)
Application number: PCT/CN2023/143194
Other languages: English (en), French (fr)
Inventors: 毛江平 (Mao Jiangping), 周耀颖 (Zhou Yaoying), 刘洋洋 (Liu Yangyang), 夏海琴 (Xia Haiqin)
Original assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2024149089A1

Classifications

    • G: Physics
    • G06: Computing; calculating or counting
    • G06F: Electric digital data processing
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on GUIs, based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/0483: Interaction with page-structured environments, e.g. book metaphor
    • G06F3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures

Description

  • Embodiments of the present application relate to the field of electronic devices, and more specifically, to a display method, a display device, and an electronic device.
  • Multi-level menus are widely used in mainstream operating systems because they can accommodate more options for users to choose from.
  • At present, a multi-level menu is mainly implemented by creating a new window, triggered by an event, to accommodate each newly expanded level of the menu.
  • Because each window creation consumes additional system resources and increases the power consumption of the device, this approach places a heavier performance burden on low-end devices.
  • Embodiments of the present application provide a display method, a display device, and an electronic device, which can reduce the overhead and power consumption of multi-level menu display.
  • In a first aspect, a display method is provided, which is applied to an electronic device and comprises: receiving a first input for expanding an n-th level menu of a multi-level menu; and, in response to the first input, displaying a user interface control in a transparent window, wherein the user interface control is the n-th level menu of the multi-level menu.
  • the user performs a first input in the first window.
  • The first input may be a user input for expanding the n-th level menu of the multi-level menu, where n is a positive integer.
  • the nth level menu may be a first level menu or other level menus above the first level menu.
  • the first input may be, for example, a click input on a control in the first window.
  • the first input may also be, for example, a right click input on a blank area in the first window (for touch screen input, it may correspond to a long press click input on the first window).
  • the first window is the window where the user input acts when the first level menu is expanded.
  • the first input may be a hover input or a click input of a certain option on the n-1th level menu by the user.
  • the transparent window is dedicated to displaying a multi-level menu in the form of a user interface control.
  • the transparent window is invisible, while the multi-level menu in the form of a user interface control is visible. In this way, since each level of the menu in the multi-level menu is a user interface control, instead of creating a new window for each level menu expanded, the overhead and power consumption of the multi-level menu display can be reduced.
  • the multi-level menu is a user interface control on the transparent window rather than a control on the first window, and the size and position of the transparent window can be freely set according to needs, the display of the multi-level menu will not be limited to the first window, resulting in better display effects and enhanced user experience.
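  • As a minimal illustrative sketch of this idea (not the patent's implementation; the class names TransparentMenuWindow and MenuLevelControl are assumptions), each expanded level can be modeled as a child control hosted by a single transparent overlay, so expanding a level never creates a new window:

```java
// Illustrative sketch only: every level of the multi-level menu lives as a
// user interface control inside one transparent window; no per-level window is created.
import java.util.ArrayList;
import java.util.List;

class MenuLevelControl {
    final int level;              // 1 = first-level menu, 2 = second-level menu, ...
    final List<String> options;   // the options shown by this level

    MenuLevelControl(int level, List<String> options) {
        this.level = level;
        this.options = options;
    }
}

class TransparentMenuWindow {
    private final List<MenuLevelControl> expandedLevels = new ArrayList<>();

    // Handles the "first input" that expands the n-th level menu.
    void expandLevel(int n, List<String> options) {
        // Any levels at or below depth n are collapsed before the n-th level is shown.
        expandedLevels.removeIf(c -> c.level >= n);
        expandedLevels.add(new MenuLevelControl(n, options));
        // Only this single transparent window needs to be re-rendered.
    }

    void collapseAll() {
        expandedLevels.clear();
    }
}
```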
  • the first input is used to expand the first-level menu of the multi-level menu, the first input is located in the first window, and before displaying the user interface control in the transparent window, the method also includes: creating the transparent window above the first window based on the first input.
  • When the first input is used to expand the first-level menu of the multi-level menu, the first input is located in the first window; that is, the multi-level menu is triggered by the user clicking a control or right-clicking a blank area in the first window.
  • Before expanding the first-level menu, the electronic device creates a transparent window above the first window. "Above" means that the z level of the transparent window is higher than that of the first window, so that the multi-level menu is displayed as a user interface control on the topmost transparent window (the highest z level, one layer above the first window); the transparent window is created specifically to display the multi-level menu.
  • the method also includes: determining an input hot zone of the multi-level menu, the input hot zone being a collection of areas occupied by menus of each level that have been expanded in the multi-level menu; receiving a second input from a user to the transparent window; when the second input is located inside the input hot zone, performing an operation corresponding to the second input on the multi-level menu according to the second input; when the second input is located outside the input hot zone and the second input is a click input, according to the second input, closing the multi-level menu and the transparent window, and activating the window with the highest z level under the transparent window at the location of the second input.
  • Since the transparent window is displayed on the top layer, when the user input falls within the range of the transparent window, the transparent window is the window on which the input directly acts.
  • The transparent window is created to accommodate the user interface controls of the multi-level menu; apart from the menu, the other parts of the window are not visible to the user.
  • the input hot zone is a set of areas formed by the expanded multi-level menu. The user input in this area is used for the multi-level menu, such as for expanding and folding the multi-level menu.
  • When the user clicks or hovers over an option of the current-level menu, the input can be used to expand the next-level menu; when the user hovers over or clicks an option of the previous-level menu, it can be used to expand the current-level menu in a different way; and when the user clicks an option corresponding to a specific instruction, the corresponding instruction can be triggered.
  • When the user input falls in another area of the transparent window and the input is a click input, another window can be activated (the window with the highest z level under the transparent window at the click position), and the click input also closes the transparent window and the user interface controls on it (all levels of the multi-level menu).
  • In some embodiments, the click input can be passed directly to the window under the transparent window to trigger a corresponding instruction, for example on a control of that window at the click position.
  • When the input is a hover outside the input hot zone of the transparent window, no further operation needs to be performed. Therefore, the presence of the transparent window in the technical solution of the embodiments of the present application does not affect the user's operation.
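  • The second-input handling described above could look roughly like the following sketch (names such as SecondInputDispatcher are hypothetical; the hot zone is modeled as the rectangles of the expanded levels):

```java
// Illustrative sketch of dispatching a second input received by the transparent window.
import java.awt.Point;
import java.awt.Rectangle;
import java.util.List;

class SecondInputDispatcher {
    // The input hot zone: one rectangle per expanded menu level.
    private final List<Rectangle> hotZone;

    SecondInputDispatcher(List<Rectangle> hotZone) {
        this.hotZone = hotZone;
    }

    boolean insideHotZone(Point p) {
        return hotZone.stream().anyMatch(r -> r.contains(p));
    }

    void onSecondInput(Point p, boolean isClick) {
        if (insideHotZone(p)) {
            handleMenuInput(p);               // expand/collapse a level or trigger an option
        } else if (isClick) {
            closeMenuAndTransparentWindow();  // dismiss all levels and the transparent window
            activateTopWindowBelowAt(p);      // window with the highest z level under it
        }
        // A hover outside the hot zone is simply ignored, so the transparent
        // window does not interfere with the user's other operations.
    }

    private void handleMenuInput(Point p) { /* application-specific */ }
    private void closeMenuAndTransparentWindow() { /* application-specific */ }
    private void activateTopWindowBelowAt(Point p) { /* application-specific */ }
}
```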
  • an area occupied by the transparent window is greater than or equal to an area where the multi-level menu may be displayed.
  • the area occupied by the transparent window is greater than or equal to the possible display area of the multi-level menu
  • the possible display area of the multi-level menu includes not only the display area of each level of the menu currently displayed on the display screen, but also the display area of any level of the menu that may be displayed although not displayed on the display screen. Since the multi-level menu exists in the form of a user interface control on the transparent window, the size of the transparent window needs to be sufficient to accommodate each level of the multi-level menu.
  • the possible display area of the multi-level menu is related to the position of the input that triggers expansion of a first-level menu of the multi-level menu.
  • the possible display area of the multi-level menu is related to the position of the input that triggers the expansion of the first-level menu of the multi-level menu, that is, the area occupied by the transparent window may also be related to the position of the input.
  • the area occupied by the transparent window is consistent with the screen display area of the electronic device.
  • In order to reduce the complexity of creating the transparent window, the transparent window can be set to be consistent with the display interface (screen display area) of the electronic device.
  • the multi-level menu triggered or managed by the first window can be displayed in the area of the entire display screen without being affected by the boundary of the first window, so that the display effect of the multi-level menu is better. For example, if the first window is small, when the multi-level menu is displayed entirely within the range of the first window, some levels of the multi-level menu may need to overlap, affecting the display effect.
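  • One way to compute such a bound is to take the union of the areas the menu could occupy in its different expansion modes and clip it to the screen (an illustrative sketch; how possibleAreas is enumerated is discussed later in connection with FIG. 6):

```java
// Illustrative sketch: the transparent window must cover every possible display area
// of the multi-level menu, or simply the whole screen.
import java.awt.Rectangle;
import java.util.List;

class TransparentWindowBounds {
    Rectangle compute(List<Rectangle> possibleAreas, Rectangle screen) {
        if (possibleAreas.isEmpty()) {
            return screen;                    // fall back to covering the whole screen
        }
        Rectangle bounds = new Rectangle(possibleAreas.get(0));
        for (Rectangle area : possibleAreas) {
            bounds = bounds.union(area);      // grow to cover every expansion mode
        }
        return bounds.intersection(screen);   // never larger than the screen display area
    }
}
```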
  • the first window is a non-full-screen display window, and a partial area of the transparent window is outside an area occupied by the first window.
  • the multi-level menu can also be displayed outside the first window, so that the display effect of the multi-level menu is better.
  • the user interface control is displayed according to a rendering result of the transparent window.
  • Since the operating system of the electronic device only needs to render the transparent window, rather than a separate window for each level of the multi-level menu, the power consumption of displaying the multi-level menu can be reduced.
  • The operating system of the electronic device is a Windows operating system, an Android operating system, an iOS operating system, or a Linux operating system.
  • When n is 1, the first input is a user click operation on a first control in the first window, or a right-click or long-press operation on a blank area in the first window; when n is greater than or equal to 2, the first input is an operation of hovering over an option of the (n-1)-th level menu of the multi-level menu for more than a preset time, or a click operation on an option of the (n-1)-th level menu.
  • the first-level menu can be expanded by clicking the control in the first window or right-clicking in the blank area.
  • the click operation on the first control can be a mouse click (for example, a right-click or a left-click) or a touch click operation (for example, a short-click operation or a long-click operation on the first control).
  • When the n-th level menu is a second-level menu or a menu above the second level, the n-th level menu can be triggered by clicking or hovering the mouse over a specific option of the (n-1)-th level menu.
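  • Summarizing the triggers above in code form (an illustrative sketch; the 500 ms value for the "preset time" is an assumption):

```java
// Illustrative sketch: classifying the "first input" that expands the n-th level menu.
class FirstInputRules {
    private static final long HOVER_THRESHOLD_MS = 500; // "preset time", value assumed

    // n == 1: click on the first control, or right-click / long-press on a blank area.
    boolean triggersFirstLevel(boolean clickOnFirstControl,
                               boolean rightClickOrLongPressOnBlankArea) {
        return clickOnFirstControl || rightClickOrLongPressOnBlankArea;
    }

    // n >= 2: hover over an option of level n-1 beyond the threshold, or click that option.
    boolean triggersLevelN(long hoverDurationMs, boolean clickOnParentOption) {
        return hoverDurationMs > HOVER_THRESHOLD_MS || clickOnParentOption;
    }
}
```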
  • In a second aspect, a display device is provided, comprising: one or more processors; and one or more memories storing one or more computer programs, the one or more computer programs comprising instructions which, when executed by the one or more processors, cause the display device to perform the following steps: receiving a first input for expanding an n-th level menu of a multi-level menu; and, in response to the first input, displaying a user interface control in a transparent window, wherein the user interface control is the n-th level menu of the multi-level menu.
  • the first input is used to expand the first-level menu of the multi-level menu, and the first input is located in the first window.
  • the display device performs the following steps: based on the first input, create the transparent window above the first window.
  • When the instructions are executed by the one or more processors, the display device performs the following steps: determining an input hot zone of the multi-level menu, the input hot zone being a collection of areas occupied by the menus of each level that have been expanded in the multi-level menu; receiving a second input from a user to the transparent window; when the second input is located inside the input hot zone, performing an operation corresponding to the second input on the multi-level menu according to the second input; or, when the second input is located outside the input hot zone and the second input is a click input, closing the multi-level menu and the transparent window according to the second input, and activating the window with the highest z level under the transparent window at the location of the second input.
  • an area occupied by the transparent window is greater than or equal to a possible display area of the multi-level menu.
  • the possible display area of the multi-level menu is related to the position of the first-level menu input that triggers expansion of the multi-level menu.
  • an area occupied by the transparent window is consistent with a screen display area of the display device.
  • the first window is a non-full-screen display window, and a partial area of the transparent window is outside an area occupied by the first window.
  • the user interface control is displayed based on a rendering result of the transparent window.
  • The operating system of the display device is a Windows operating system, an Android operating system, an iOS operating system, or a Linux operating system.
  • When n is 1, the first input is a user click operation on a first control in the first window, or a right-click or long-press operation on a blank area in the first window; when n is greater than or equal to 2, the first input is an operation of hovering over an option of the (n-1)-th level menu of the multi-level menu for more than a preset time, or a click operation on an option of the (n-1)-th level menu.
  • In a third aspect, a display device is provided, comprising: a processing unit configured to receive a first input for expanding an n-th level menu of a multi-level menu; and a display unit configured to display a user interface control in a transparent window in response to the first input, wherein the user interface control is the n-th level menu of the multi-level menu.
  • the first input is used to expand the first-level menu of the multi-level menu, the first input is located in the first window, and the processing unit is further used to: create the transparent window above the first window based on the first input.
  • the processing unit is further used to: determine an input hot zone of the multi-level menu, the input hot zone being a collection of areas occupied by menus of each level that have been expanded in the multi-level menu; receive a second input from a user to the transparent window; when the second input is located inside the input hot zone, perform an operation corresponding to the second input on the multi-level menu according to the second input; or, when the second input is located outside the input hot zone and the second input is a click input, close the multi-level menu and the transparent window according to the second input, and activate the window with the highest z level under the transparent window at the location of the second input.
  • an area occupied by the transparent window is greater than or equal to a possible display area of the multi-level menu.
  • the possible display area of the multi-level menu is related to the position of the input that triggers the expansion of the first-level menu of the multi-level menu.
  • the area occupied by the transparent window is consistent with the screen display area of the electronic device.
  • the first window is a non-full-screen display window, and a partial area of the transparent window is outside an area occupied by the first window.
  • the user interface control is displayed based on a rendering result of the transparent window.
  • The operating system of the display device is a Windows operating system, an Android operating system, an iOS operating system, or a Linux operating system.
  • When n is 1, the first input is a user click operation on a first control in the first window, or a right-click or long-press operation on a blank area in the first window; when n is greater than or equal to 2, the first input is an operation of hovering over an option of the (n-1)-th level menu of the multi-level menu for more than a preset time, or a click operation on an option of the (n-1)-th level menu.
  • In a fourth aspect, an electronic device is provided, comprising the display device described in the second aspect or any one of its implementations, or in the third aspect or any one of its implementations.
  • In another aspect, a computer storage medium is provided, storing computer instructions which, when executed on an electronic device, cause the electronic device to execute the method described in the first aspect or any one of its implementations.
  • an electronic device comprising: a memory for storing computer instructions; and a processor for executing the computer instructions stored in the memory, so that the electronic device executes the method described in the first aspect or any one of the implementations of the first aspect.
  • In another aspect, a chip system is provided, which includes at least one processor; when program instructions are executed by the at least one processor, the at least one processor executes the method described in the first aspect or any one of its implementations.
  • In another aspect, a chip is provided, comprising a processor and a data interface, wherein the processor reads, through the data interface, instructions stored in a memory to execute the method described in the first aspect or any possible implementation of the first aspect.
  • the chip may further include a memory, wherein the memory stores instructions.
  • the processor is used to execute instructions stored in the memory. When the instructions are executed, the processor is used to execute the method described in the first aspect and any possible implementation manner of the first aspect.
  • The above chip may specifically be a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC).
  • FIG. 1 is a schematic diagram of the hardware structure of an electronic device to which an embodiment of the present application is applicable.
  • FIG. 2 is a schematic diagram of the software structure of an electronic device to which an embodiment of the present application is applicable.
  • FIG. 3 is a schematic diagram of a graphical user interface of a multi-level menu display.
  • FIG. 4 is a schematic flowchart of a display method provided in an embodiment of the present application.
  • FIG. 5 shows a graphical user interface provided in an embodiment of the present application.
  • FIG. 6 is a schematic diagram of different expansion modes of a multi-level menu provided in an embodiment of the present application.
  • FIG. 7 shows a graphical user interface provided in an embodiment of the present application.
  • FIG. 8 is a schematic block diagram of a display device provided in an embodiment of the present application.
  • FIG. 9 is a schematic block diagram of another display device provided in an embodiment of the present application.
  • "A and/or B" can represent three cases: A exists alone, both A and B exist, or B exists alone, where A and B can be singular or plural.
  • The character "/" generally indicates that the associated objects are in an "or" relationship.
  • references to "one embodiment” or “some embodiments” etc. described in this specification mean that a particular feature, structure or characteristic described in conjunction with the embodiment is included in one or more embodiments of the present application.
  • The phrases "in one embodiment", "in some embodiments", "in some other embodiments", etc. appearing in different places in this specification do not necessarily all refer to the same embodiment, but rather mean "one or more but not all embodiments", unless otherwise specifically emphasized.
  • the terms “including”, “comprising”, “having” and their variations all mean “including but not limited to”, unless otherwise specifically emphasized in other ways.
  • the method provided in the embodiments of the present application is applied to electronic devices, including but not limited to mobile phones, tablet computers, vehicle-mounted devices, wearable devices, augmented reality (AR)/virtual reality (VR) devices, laptop computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), smart screens, and other electronic devices with display screens.
  • the embodiments of the present application do not impose any restrictions on the specific types of electronic devices.
  • FIG1 shows a schematic diagram of the hardware structure of an electronic device provided in an embodiment of the present application.
  • the electronic device 100 may include: a processor 110, a memory 120, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a camera 191, a display screen 192, a button 193, etc.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processor (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU).
  • Different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100.
  • the controller may generate an operation control signal according to the instruction operation code and the timing signal to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory may store instructions or data that the processor 110 has just used or cyclically used. If the processor 110 needs to use the instruction or data again, it may be directly called from the memory, thereby avoiding repeated access, reducing the waiting time of the processor 110, and thus improving the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the processor 110 and the touch sensor 180E can communicate through an I2C bus interface to implement the touch function of the electronic device 100.
  • the processor 110 and the camera 191 can communicate through a CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 192 can communicate through a DSI interface to implement the display function of the electronic device 100.
  • the interface connection relationship between the modules illustrated in the embodiment of the present application is only a schematic illustration and does not constitute a structural limitation on the electronic device 100.
  • the electronic device 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from the charger.
  • the power management module 141 is used to connect to the battery 142. While the charging management module 140 is charging the battery 142, the power management module 141 can also be used to power the electronic device 100.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle number, and battery health status.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve the utilization of antennas.
  • antenna 1 can be reused as a diversity antenna for a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide solutions for wireless communications including 2G/3G/4G/5G, etc., applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, and filter, amplify, and process the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and convert it into electromagnetic waves for radiation through the antenna 1.
  • at least some of the functional modules of the mobile communication module 150 can be set in the processor 110.
  • at least some of the functional modules of the mobile communication module 150 can be set in the same device as at least some of the modules of the processor 110.
  • the electronic device 100 implements the display function through a GPU, a display screen 192, and an application processor.
  • the GPU is a microprocessor for image processing, which connects the display screen 192 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 192 is used to display images, videos, etc.
  • the display screen 192 includes a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the electronic device 100 may include 1 or N display screens 192, where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through ISP, camera 191, video codec, GPU, display screen 192 and application processor.
  • ISP is used to process the data fed back by camera 191.
  • Camera 191 is used to capture static images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) phototransistor.
  • the electronic device 100 may include 1 or N cameras 191, where N is a positive integer greater than 1.
  • Video codecs are used to compress or decompress digital videos.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record videos in a variety of coding formats, such as Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the memory 120 is used to store data and/or instructions.
  • the memory 120 may include an internal memory.
  • the internal memory is used to store computer executable program codes, which include instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by running the instructions stored in the internal memory.
  • the internal memory may include a program storage area and a data storage area. Among them, the program storage area may store an operating system; the program storage area may also store one or more application programs (such as a gallery, contacts, etc.).
  • the data storage area may store data (such as images, contacts, etc.) created during the use of the electronic device 100.
  • the internal memory may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more disk storage devices, flash memory devices, universal flash storage (UFS), etc.
  • The processor 110 may execute the display method provided in the embodiments of the present application by running the instructions stored in the internal memory and/or the instructions stored in the memory provided in the processor 110.
  • the memory 120 may also include an external memory, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory may communicate with the processor 110 via an external memory interface to implement a data storage function. For example, files such as music and videos may be stored in the external memory.
  • the electronic device 100 can implement audio functions such as audio playback and recording through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone jack 170D, and the application processor.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an acceleration sensor 180C, a distance sensor 180D, a touch sensor 180E, and some other sensors.
  • the pressure sensor 180A is used to sense pressure signals and can convert pressure signals into electrical signals.
  • the pressure sensor 180A can be set on the display screen 192.
  • pressure sensors 180A such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, etc.
  • a capacitive pressure sensor can be a parallel plate including at least two conductive materials.
  • the electronic device 100 determines the intensity of the pressure based on the change in capacitance.
  • the electronic device 100 detects the touch operation intensity based on the pressure sensor 180A.
  • the electronic device 100 can also calculate the touch position based on the detection signal of the pressure sensor 180A.
  • Touch operations acting on the same touch position but with different intensities can correspond to different operation instructions. For example, when a touch operation with an intensity less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation with an intensity greater than or equal to the first pressure threshold acts on the same icon, an instruction to create a new short message is executed.
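  • Expressed as a simple sketch (illustrative only; the threshold value and method names are assumptions):

```java
// Illustrative sketch: different touch intensities on the same icon trigger different instructions.
class PressureDispatcher {
    private static final float FIRST_PRESSURE_THRESHOLD = 0.5f; // value assumed

    void onTouchShortMessageIcon(float intensity) {
        if (intensity < FIRST_PRESSURE_THRESHOLD) {
            viewShortMessage();      // lighter press: view the short message
        } else {
            createNewShortMessage(); // firmer press: create a new short message
        }
    }

    private void viewShortMessage() { /* application-specific */ }
    private void createNewShortMessage() { /* application-specific */ }
}
```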
  • the gyroscope sensor 180B is also called an angular velocity sensor, which can be used to determine the motion posture of the electronic device 100.
  • The angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyroscope sensor 180B.
  • the gyroscope sensor 180B can be used for anti-shake shooting.
  • the gyroscope sensor 180B can also be used for navigation and somatosensory game scenes. For example, the gyroscope can fully monitor the displacement of the player's hand, thereby achieving various game operation effects, such as changing the horizontal screen to the vertical screen, turning in a racing game, and so on.
  • the distance sensor 180D is used to measure the distance.
  • the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180D to measure the distance to achieve fast focusing.
  • the touch sensor 180E is also called a "touch panel”.
  • the touch sensor 180E can be set on the display screen 192, and the touch sensor 180E and the display screen 192 form a touch screen, also called a "touch screen”.
  • the touch sensor 180E is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 192.
  • the touch sensor 180E can also be set on the surface of the electronic device, which is different from the position of the display screen 192.
  • the sensor module 180 may include more or fewer sensors, depending on actual needs, which will not be described in detail here.
  • the key 193 may include a power key, a volume key, etc.
  • the key 193 may be a mechanical key or a touch key.
  • the electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
  • the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device.
  • the electronic device may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the above describes a possible hardware structure diagram of the electronic device 100.
  • The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. Taking the Android system with a layered architecture as an example, the software structure of the electronic device 100 is described below.
  • FIG2 shows a software structure diagram of an electronic device provided in an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor.
  • the layers communicate with each other through software interfaces.
  • the system is divided into four layers, from top to bottom: application layer, application framework layer, system runtime layer and kernel layer. Below the kernel layer is the hardware layer.
  • the application layer may include a series of application (APP) packages. As shown in FIG2 , the application package may include camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and other applications.
  • the application framework layer provides application programming interface (API) and programming framework for the applications in the application layer.
  • the application framework layer includes some predefined functions. As shown in Figure 2, the application framework layer may include a window manager, a content provider, a phone manager, a resource manager, a notification manager, a view system, a card service engine, etc.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • the window manager can obtain the size of the window to be displayed on the electronic device 100, determine the content of the window to be displayed, etc.
  • the window to be displayed on the electronic device 100 may include the window being displayed on the interface of the electronic device 100, and may also include the windows of one or more application programs running in the background of the electronic device 100.
  • the window manager is a service. It is a global, unique, and independent C++ service in the system.
  • the window manager is used by all applications.
  • the window management system is based on the client/server (C/S) model.
  • the entire window system is divided into two parts: the service and the client.
  • the client is the application, which is responsible for requesting the creation of windows and using windows;
  • the server is the window manager service (window manager service or WindowManagerService, WMS), which is responsible for completing window maintenance and window display.
  • the client does not interact directly with the window manager service, but directly interacts with the local object window manager (window manager or WindowManager), and then the window manager (WindowManager) completes the interaction with the window manager service (WindowManagerService). This interaction is transparent to the application, and the application cannot perceive the existence of the window manager service.
  • a window is a rectangular area on the screen that can display a user interface (UI) and interact with the user.
  • the window can also hide the user interface (i.e., the software operation interface) and quickly display it to the user when the user needs to operate.
  • a window is actually a canvas (surface).
  • a screen can have multiple windows, and the layout and order of these multiple windows and window animations are managed by the window management service WMS, and the mixing and display of multiple canvas contents are implemented by the SurfaceFlinger service.
  • Windows are layered, and windows with larger layers will cover windows with smaller layers.
  • the stacking relationship of windows can be represented by the z level, and windows with larger z levels (higher z levels) cover windows with smaller z levels (lower z levels).
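  • The stacking rule can be sketched as a simple hit test that picks, for a given input position, the window with the highest z level whose bounds contain that position (a hypothetical helper; a real window manager service is far more involved). In the embodiments described later, the transparent window has the highest z level, so it receives the user input first; when the input falls outside the menu's hot zone, the same kind of lookup, excluding the transparent window, identifies the window to activate.

```java
// Illustrative sketch of z-level based hit testing.
import java.awt.Point;
import java.awt.Rectangle;
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

class WindowRecord {
    final String name;
    final int zLevel;          // a larger z level means the window is drawn on top
    final Rectangle bounds;

    WindowRecord(String name, int zLevel, Rectangle bounds) {
        this.name = name;
        this.zLevel = zLevel;
        this.bounds = bounds;
    }
}

class ZOrderHitTest {
    // Returns the window with the highest z level that contains the point.
    Optional<WindowRecord> topMostWindowAt(List<WindowRecord> windows, Point p) {
        return windows.stream()
                .filter(w -> w.bounds.contains(p))
                .max(Comparator.comparingInt(w -> w.zLevel));
    }
}
```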
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • the data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the phone manager is used to provide communication functions of the electronic device 100, such as management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for applications, such as localized strings, icons, images, layout files, video files, and so on.
  • the notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages and can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also be a notification that appears in the system top status bar in the form of a chart or scroll bar text, such as notifications of applications running in the background, or a notification that appears on the screen in the form of a dialog window. For example, a text message is displayed in the status bar, a prompt sound is emitted, the terminal device vibrates, the indicator light flashes, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying images, etc.
  • the view system can be used to build applications.
  • a display interface can be composed of one or more views.
  • a display interface including a text notification icon can include a view for displaying text and a view for displaying images.
  • the system runtime library layer (libraries) can be divided into two parts: system libraries and Android runtime.
  • Android runtime is the Android operating environment, including the core library and virtual machine. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one is the function that the Java language needs to call, and the other is the Android core library.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library is the support of the application framework and may include multiple functional modules, such as: surface manager, media libraries, two-dimensional graphics engine (such as SGL), three-dimensional graphics processing library (such as OpenGL ES), image processing library, etc.
  • the surface manager is used to manage the display subsystem and provide the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, synthesis and layer processing, etc.
  • a 2D graphics engine is a drawing engine for 2D drawings.
  • the kernel layer is the layer between hardware and software, which is used to provide essential functions of the operating system such as file management, memory management, process management, network protocol stack, etc.
  • the kernel layer includes at least display driver, camera driver, audio driver, sensor driver, Bluetooth driver, etc.
  • a multi-level menu consisting of multiple hierarchical menus is widely used on terminal devices because it can accommodate more options for users to choose from.
  • a user can right-click to trigger the opening of window 312, which includes the first-level menu of the multi-level menu.
  • When the user hovers over or clicks an option of the first-level menu, a further window 314 is opened, which includes the second-level menu of the multi-level menu.
  • Each time the next-level menu is expanded, the electronic device needs to create a new window to accommodate it.
  • Each time a new window is created, more system resources are consumed; therefore, when the menu has many levels, displaying the multi-level menu consumes considerable resources.
  • an embodiment of the present application provides a display method, which can be applied to an electronic device, and the electronic device can have the structure shown in Figures 1 and 2.
  • the display method can reduce the resources consumed when creating and displaying a multi-level menu and improve the user experience.
  • As shown in FIG. 4, the method includes the following steps.
  • S410: Receive a first input for expanding an n-th level menu of a multi-level menu.
  • S420: In response to the first input, display a user interface control in a transparent window, where the user interface control is the n-th level menu of the multi-level menu.
  • n is a positive integer.
  • the first input is a user input that triggers opening the multi-level menu (expanding the first-level menu).
  • the first input may be a right-click of a mouse in a blank area of the first window, a left-click or a right-click of a mouse on a menu item on the window interface, or a mouse input that triggers opening the multi-level menu.
  • the electronic device can predict the display area of a multi-level menu according to the position of the user's first input.
  • the position of the menu control in the first window can change with the display status of the first window (full screen or non-full screen, and the specific size of the non-full screen).
  • the electronic device can determine the distance between the menu control and each boundary of the display screen display interface, and determine the display mode of each level of the multi-level menu according to the distance.
  • Taking a three-level menu as an example, when the boundary of the display interface of the display screen does not affect the expansion of the multi-level menu, in order to improve the user experience, the default expansion mode can be set such that the first option of the next-level menu is aligned with the selected option of the current-level menu and the next-level menu is displayed on the right side of the current-level menu.
  • FIG5 shows a graphical user interface provided by an embodiment of the present application.
  • When the boundary of the display interface of the display screen affects the expansion of the multi-level menu, at least one level of the multi-level menu can be expanded to the left, that is, the next-level menu is displayed on the left side of the previous-level menu.
  • For example, when the distance from the right side of the second-level menu to the right boundary of the display screen is too small for the third-level menu to be displayed on the right side of the second-level menu, the third-level menu can be displayed on the left side of the second-level menu.
  • the selected option of the current menu can be aligned with the last option of the next level menu when the next level menu is expanded.
  • the second level menu 512 is displayed on the left side of the first level menu 511, and when the user hovers the mouse or clicks option 7.2 of the second level menu, due to the influence of the lower boundary of the display screen display interface, the first option 7.2.1 of the third level menu cannot be aligned with the selected option 7.2 of the second level menu, and the last option 7.10 of the third level menu can be aligned with the selected option 7.2 of the second level menu, so that each level of the multi-level menu can be displayed within the range of the display screen display interface.
  • the multi-level menu is expanded in different ways (different expansion modes brought about by selecting different options, for example, option 1, option 2, option 3, and option 7 of the first-level menu can trigger the expansion of the second-level menu, which are different expansion modes of the second-level menu), the above-mentioned different expansion modes may be adopted: for example, the selected option of the current-level menu is aligned with the first option or the last option of the next-level menu (the selected option of the current-level menu can also be aligned with other options of the next-level menu or the middle position of the next-level menu), and the next-level menu is displayed on the right or left side of the current-level menu, etc.
  • the multi-level menu is triggered by clicking on the menu control 514 on the window 510.
  • the distance between the menu control 514 and the border of the display screen display interface can be determined by the electronic device, thereby further determining the expansion mode of the multi-level menu in different expansion modes and the display area in different expansion modes.
  • the first level menu of the multi-level menu will correspond to the click position, for example, the upper left corner, lower left corner, upper right corner, lower right corner or other reference points of the first level menu may coincide with the click position.
  • the electronic device may determine the distance between the click position and the boundary of the display interface of the display screen in a manner similar to that described above, and further determine the display mode of each level menu, and finally determine the display area when the multi-level menu is expanded in different expansion modes.
  • the electronic device when receiving the first input of the user, the electronic device can predict the possible display area of the multi-level menu according to the position of the first input.
  • n is greater than or equal to 2
  • the first level menu has been expanded before, and the possible display area of the multi-level menu can still be determined in the manner described above.
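  • The direction decision described above can be sketched as follows (illustrative only; a real implementation would evaluate every expansion mode rather than this single width/height test):

```java
// Illustrative sketch: deciding where the next-level menu unfolds relative to the
// selected option of the current level and the boundary of the display interface.
import java.awt.Rectangle;

class ExpansionPlanner {
    Rectangle placeNextLevel(Rectangle selectedOption, Rectangle display,
                             int menuWidth, int menuHeight) {
        // Default: unfold to the right, aligning the first option with the selected option.
        int x = selectedOption.x + selectedOption.width;
        int y = selectedOption.y;
        if (x + menuWidth > display.x + display.width) {
            // Too close to the right boundary: unfold to the left of the current level.
            x = selectedOption.x - menuWidth;
        }
        if (y + menuHeight > display.y + display.height) {
            // Too close to the lower boundary: align the last option with the selected option.
            y = selectedOption.y + selectedOption.height - menuHeight;
        }
        return new Rectangle(x, y, menuWidth, menuHeight);
    }
}
```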
  • the electronic device may display a user interface control in a transparent window in response to the first input, where the user interface control is an nth level menu of a multi-level menu. That is, when a user expands any level menu of a multi-level menu, each level menu is displayed in the form of a user interface control on a transparent window. In this way, when displaying a multi-level menu, the operating system of the electronic device only needs to render the transparent window layer to which the multi-level menu belongs, thereby reducing the power consumption caused by displaying the multi-level menu.
  • the electronic device first creates a transparent window above the first window according to the first input.
  • a transparent window is created, and the z level of the transparent window is the highest (higher than the first window).
  • the size of the transparent window can be minimized under the premise that the generated transparent window covers the possible display area of the multi-level menu.
  • the transparent window area is equal to or slightly larger than the possible display area of the multi-level menu.
  • the boundary of the transparent window can be extended by several pixels compared to the boundary of the possible display area of the multi-level menu (maximum extension to the boundary of the screen display area of the electronic device).
  • the area occupied by the created transparent window is related to the position of the input that triggers the expansion of the first-level menu.
  • In order to reduce complexity, the electronic device can also directly set the area of the transparent window to be the same size as the screen display area of the electronic device.
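  • On an Android system, for example, such a transparent top-level overlay could be added roughly as follows (a hedged sketch, not the patent's code: the window type, flags, token handling and any required permissions depend on the platform version and application context):

```java
// Hedged Android sketch: attach one transparent overlay view that will host all menu levels.
import android.content.Context;
import android.graphics.PixelFormat;
import android.view.WindowManager;
import android.widget.FrameLayout;

class TransparentOverlayFactory {
    FrameLayout attachOverlay(Context context) {
        WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.MATCH_PARENT,           // simplest choice: cover the screen
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.TYPE_APPLICATION_PANEL, // a token from the first window may be required
                WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL,   // inputs are dispatched as described above
                PixelFormat.TRANSLUCENT);                          // the window itself is invisible
        FrameLayout overlay = new FrameLayout(context);            // menu levels are added as child views
        wm.addView(overlay, lp);
        return overlay;
    }
}
```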
  • Figure 6 shows two expansion methods of a multi-level menu, wherein expansion method 1 includes a two-level menu, a first-level menu 602 and a second-level menu 604 formed by further expansion of option 1, including options 1.1 to 1.5.
  • the expansion mode 2 includes four levels of menus, and the option 7 of the first level menu 602 is further expanded to form the second level menu 606, the option 7.2 of the second level menu 606 is further expanded to form the third level menu 608, and the option 7.2.2 of the third level menu is further expanded to form the fourth level menu 610.
  • the size of the transparent window can make the menus of all levels be displayed within the range of the transparent window when the multi-level menu is expanded in different modes, that is, the area occupied by the transparent window is greater than or equal to the possible display area of the multi-level menu.
  • the first level menu of the multi-level menu can be displayed to the user in the form of the user interface control of the transparent window.
  • the transparent window is invisible, the multi-level menu generated in the form of the user interface control of the transparent window is visible.
  • the multi-level menu is expanded one level at a time according to the user's input.
  • the z-level of the generated transparent window is the highest (when the multi-level menu is not completely folded). Therefore, when the position of the subsequent user input is within the range of the transparent window, the user input directly acts on the transparent window.
  • the electronic device will also determine the input hot zone of the multi-level menu in real time.
  • the input hot zone is a collection of areas occupied by menus of each level that have been expanded in the multi-level menu. When the electronic device receives user input within the range of the transparent window, it can further determine the position of the input.
  • when the position of the input is outside the input hot zone and the input is a click input, the electronic device can activate the corresponding window under the transparent window (the window with the highest z-level under the transparent window at the click position) and close the transparent window together with the multi-level menu on it, so that the transparent window created specifically for the multi-level menu does not negatively affect the user experience.
  • in addition, in some operating systems, the click input can be passed directly to the corresponding window, and the application corresponding to that window determines whether to further execute other instructions; for example, the click position can trigger a corresponding instruction or invoke a corresponding function (when there is a control of that window at the click position, and so on).
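  • A minimal sketch of the hot-zone bookkeeping and input dispatch described above, reusing the hypothetical Rect type from the earlier sketch. The class and callback names (MultiLevelMenuController, ownerApp, activateWindowUnder, closeTransparentWindow, MenuInput) are assumptions for illustration only; hover input outside the hot zone is simply ignored.

```kotlin
// Hypothetical sketch: the input hot zone is the union of the areas of all expanded menu levels.
data class MenuInput(val x: Int, val y: Int, val isClick: Boolean)

class MultiLevelMenuController(
    private val ownerApp: (MenuInput) -> Unit,           // application of the first window manages the menu
    private val activateWindowUnder: (Int, Int) -> Unit, // activates the highest-z window under the point
    private val closeTransparentWindow: () -> Unit       // also discards every expanded level
) {
    private val expandedLevelRects = mutableListOf<Rect>()  // one rect per expanded level

    fun onLevelExpanded(levelRect: Rect) { expandedLevelRects += levelRect }  // hot zone grows
    fun onLevelFolded(levelRect: Rect) { expandedLevelRects -= levelRect }    // hot zone shrinks

    private fun inHotZone(x: Int, y: Int) =
        expandedLevelRects.any { x in it.left..it.right && y in it.top..it.bottom }

    fun onInput(input: MenuInput) {
        when {
            inHotZone(input.x, input.y) ->
                ownerApp(input)                 // expand the next level, fold a level, or invoke an option
            input.isClick -> {
                closeTransparentWindow()
                activateWindowUnder(input.x, input.y)
            }
            // hover outside the hot zone: no action
        }
    }
}
```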
  • FIG. 7(a) to FIG. 7(c) sequentially show schematic diagrams of the graphical user interface (GUI) as the first-level, second-level and third-level menus of the multi-level menu are expanded.
  • when the user clicks the control 711 in FIG. 7(a), this triggers the generation of a transparent window 713 and a first user interface control, where the first user interface control is the first-level menu of the multi-level menu.
  • at this point, based on the first-level menu currently displayed to the user, the input hot zone can be determined as shown by the thick frame 712.
  • if the user's subsequent input falls within the input hot zone, it can be determined that the input targets the multi-level menu, and the input can be passed to the application of the first window 710 corresponding to the multi-level menu; the application then determines whether the input is used to expand the second-level menu, to invoke the function of the corresponding option, or to fold a certain level of the multi-level menu.
  • the user's input may be within the range of the input hot zone 712.
  • the user's input may be a click input on option 3 in the first-level menu or the mouse hovering over option 3 for a period exceeding a preset threshold.
  • the user input may be transmitted to the application to which the window 710 belongs.
  • the application determines that the target of the input is to trigger the opening of the next-level menu, and then generates a user interface control corresponding to the next-level menu.
  • the input hot zone will be updated as the second-level menu is displayed, as shown in the thick frame 712', which is the merged area occupied by the two-level menus.
  • the user can then click option 3.2 of the second-level menu, or hover the mouse over option 3.2 for more than a preset time, to trigger expansion of the third-level menu, and the input hot zone is updated to 712" as shown in FIG. 7(c).
  • when the user's input is outside the input hot zone, the operation to perform can be determined according to the position and type of the input.
  • For example, suppose the user input is a click input. When the click is at point A, since the window under the transparent window at point A is window 710, the input can activate window 710; in some embodiments, the electronic device can pass the input to the application of window 710. When the click is at point B, since the window under the transparent window at point B is window 720, the electronic device activates window 720 according to the click input; in some embodiments, the input can be passed to the application of window 720. When the click is at point C, since the window under the transparent window at point C is window 730, the input can activate window 730; in some embodiments, the input is passed to the application of window 730 for further processing. At the same time, if the user clicks outside the input hot zone, the multi-level menu is closed and the previously generated transparent window is closed as well. That is, a click input outside the input hot zone can directly close the transparent window and the multi-level menu and activate the window with the highest z-level under the transparent window at the click position. In some embodiments, only input received after the corresponding window has been activated is passed to that window; in other embodiments, the activating input itself is passed to the corresponding window for further processing by that window's application.
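  • A short sketch, under the same assumptions as the previous examples and reusing the Rect type from the first sketch, of picking the window to activate when a click lands outside the hot zone: among the windows stacked below the transparent window that contain the click point, the one with the highest z-level is chosen (window 710, 720 or 730 at points A, B and C in FIG. 7(c)). The Win type and its fields are illustrative only.

```kotlin
// Hypothetical sketch: choose the highest-z window under the transparent window at the click position.
data class Win(val id: String, val z: Int, val bounds: Rect)

fun windowToActivate(x: Int, y: Int, windows: List<Win>, transparentWindowZ: Int): Win? =
    windows
        .filter { it.z < transparentWindowZ }  // only windows stacked below the transparent window
        .filter { x in it.bounds.left..it.bounds.right && y in it.bounds.top..it.bounds.bottom }
        .maxByOrNull { it.z }                  // highest remaining z-level wins
```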
  • for multi-level menus triggered from different windows, or for different multi-level menus triggered in the same window, the electronic device can create a separate transparent window for each multi-level menu, and the transparent windows corresponding to different multi-level menus can have different sizes and positions.
  • the expansion, folding and execution of specific instructions of the multi-level menu are all determined by the application where the window that triggers the multi-level menu is located.
  • although the multi-level menu consists of user interface controls on the transparent window, it is actually managed by the application of the window that triggered it.
  • because the embodiments of the present application display the multi-level menu in the form of user interface controls on a transparent window, and the size of the transparent window can be determined according to the display requirements of the menu, on the one hand the display of the multi-level menu is not restricted by the boundary of the first window that triggers or manages it.
  • when the size of the transparent window is set appropriately and the first window is a non-full-screen window, part of the transparent window can lie outside the area occupied by the first window, and the multi-level menu can correspondingly be displayed outside the first window.
  • when the area displayed by the first window is small, displaying the multi-level menu entirely inside the first window would cause some levels of the menu to overlap, so the user cannot intuitively see the expanded menus at every level, which is inconvenient to operate.
  • on the other hand, since the multi-level menu is displayed in the form of user interface controls, the operating system of the electronic device only needs to render the single transparent-window layer when rendering the multi-level menu. Compared with using a separate window for each level of the menu (where each window layer must be rendered separately), this reduces system overhead and power consumption and improves the user experience.
  • the technical solution introduced in the embodiments of the present application can be applied to electronic devices using different operating systems, for example a Windows operating system, an Android operating system, an IOS operating system or a Linux operating system. As long as an operating system needs to use a multi-level menu, the solution can reduce the power consumption of the multi-level menu and free its display from the limits of the original window, thereby improving the user experience.
  • Figure 8 shows a display device 800 provided in an embodiment of the present application, wherein the display device 800 includes a processing unit 810 and a display unit 820.
  • the processing unit 810 can be used to execute step S410 of the method embodiment shown in Figure 4, and the display unit 820 can be used to execute step S420 of the method embodiment shown in Figure 4.
  • the display device 800 includes: a processing unit 810, used to receive a first input for expanding an n-th level menu of a multi-level menu; a display unit 820, used to display a user interface control in a transparent window in response to the first input, wherein the user interface control is the n-th level menu of the multi-level menu.
  • the first input is used to expand a first-level menu of the multi-level menu
  • the first input is located in the first window
  • the processing unit is further used to: create the transparent window above the first window according to the first input.
  • the processing unit 810 is also used to: determine an input hot zone of the multi-level menu, which is a collection of areas occupied by menus of each level that have been expanded in the multi-level menu; receive a second input from the user to the transparent window; when the second input is inside the input hot zone, perform an operation corresponding to the second input on the multi-level menu according to the second input; or, when the second input is outside the input hot zone and the second input is a click input, close the multi-level menu and the transparent window according to the second input, and activate the window with the highest z level under the transparent window at the location of the second input.
  • the area occupied by the transparent window is greater than or equal to the possible display area of the multi-level menu.
  • the possible display area of the multi-level menu is related to the position of the input that triggers the expansion of the first-level menu of the multi-level menu.
  • the area occupied by the transparent window is consistent with the screen display area of the electronic device.
  • the first window is a non-full screen display window, and a partial area of the transparent window is outside the area occupied by the first window.
  • the user interface control is displayed based on a rendering result of the transparent window.
  • the operating system of the display device 800 is a Windows operating system, an Android operating system, an IOS operating system, or a Linux operating system.
  • when n is 1, the first input is a user click operation on a first control in the first window, or a right-click or long-press operation by the user on a blank area in the first window; when n is greater than or equal to 2, the first input is an operation of hovering over an option of the (n-1)th-level menu of the multi-level menu for more than a preset time, or a click operation on an option of the (n-1)th-level menu of the multi-level menu.
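  • A hedged sketch of this classification of the first input. The type names and the 500 ms hover threshold are assumptions made for illustration; the embodiments only say that the hover must exceed a preset time.

```kotlin
// Hypothetical sketch of classifying the first input and deciding which menu level it expands.
sealed interface FirstInput
data class ControlClick(val controlId: String) : FirstInput                    // click on a control in the first window
data class BlankAreaPress(val x: Int, val y: Int) : FirstInput                 // right-click or long press on a blank area
data class OptionHover(val level: Int, val option: String, val hoverMs: Long) : FirstInput
data class OptionClick(val level: Int, val option: String) : FirstInput

const val HOVER_THRESHOLD_MS = 500L  // assumed "preset time"; not specified in the embodiments

// Returns the level n that the input expands, or null if it expands nothing.
fun expandedLevel(input: FirstInput): Int? = when (input) {
    is ControlClick, is BlankAreaPress -> 1                  // n == 1: open the first-level menu
    is OptionClick -> input.level + 1                        // n >= 2: click on an option of level n-1
    is OptionHover -> if (input.hoverMs > HOVER_THRESHOLD_MS) input.level + 1 else null
}
```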
  • FIG. 9 shows a display device provided by an embodiment of the present application.
  • the display device 900 shown in FIG. 9 may correspond to the display device described above.
  • the display device 900 may be a specific example of the display device in FIG. 8.
  • the display device 900 includes: a processor 920.
  • the processor 920 is used to implement the corresponding control management operation.
  • the processor 920 is used to support the display device 900 to perform the method, operation or function of the aforementioned embodiment.
  • the display device 900 may also include: a memory 910 and a communication interface 930; the processor 920, the communication interface 930 and the memory 910 may be interconnected or connected to each other through a bus 940.
  • the communication interface 930 is used to support the display device to communicate with other devices, etc.
  • the memory 910 is used to store the program code and data of the display device.
  • the processor 920 calls the code or data stored in the memory 910 to implement the corresponding operation.
  • the memory 910 may be coupled to the processor or not.
  • the coupling in the embodiment of the present application is an indirect coupling or communication connection between display devices, units or modules, which may be electrical, mechanical or other forms, and is used for information interaction between display devices, units or modules.
  • the processor 920 can be a central processing unit, a general processor, a digital signal processor, an application-specific integrated circuit, a field programmable gate array or other programmable logic devices, transistor logic devices, hardware components or any combination thereof. It can implement or execute various exemplary logic blocks, modules and circuits described in conjunction with the disclosure of this application.
  • the processor can also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of a digital signal processor and a microprocessor, and the like.
  • the communication interface 930 can be a transceiver, a circuit, a bus, a module or other types of communication interfaces.
  • the bus 940 can be a peripheral component interconnect standard (PCI) bus or an extended industry standard architecture (EISA) bus, etc.
  • the bus can be divided into an address bus, a data bus, a control bus, etc. For ease of representation, only one thick line is used in FIG. 9, but this does not mean that there is only one bus or only one type of bus.
  • the disclosed systems, devices and methods can be implemented in other ways.
  • the device embodiments described above are only schematic.
  • the division of the units is only a logical function division. There may be other division methods in actual implementation, such as multiple units or components can be combined or integrated into another system, or some features can be ignored or not executed.
  • Another point is that the mutual coupling or direct coupling or communication connection shown or discussed can be through some interfaces, indirect coupling or communication connection of devices or units, which can be electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • if the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • based on this understanding, the technical solution of the present application, in essence or the part that contributes to the prior art, can be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes several instructions for a computer device (which can be a personal computer, server, or network device, etc.) to perform all or part of the steps of the methods described in each embodiment of the present application.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Digital Computer Display Output (AREA)
  • Power Sources (AREA)

Abstract

Embodiments of the present application provide a display method, a display apparatus and an electronic device. The display method includes: receiving a first input for expanding an nth-level menu of a multi-level menu; and, in response to the first input, displaying a first user interface control in a transparent window, where the first user interface control is the nth-level menu of the multi-level menu. The display method, display apparatus and electronic device provided by the embodiments of the present application can reduce the overhead and power consumption of displaying a multi-level menu.

Description

显示方法、显示装置和电子设备
本申请要求于2023年01月12日提交中国专利局、申请号为202310072062.2、申请名称为“显示方法、显示装置和电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及电子设备领域,更具体地,涉及一种显示方法、显示装置和电子设备。
背景技术
多级菜单由于能容纳较多的选项供用户选择,在主流操作系统上得到广泛应用。
目前的多级菜单主要实现方案为通过事件触发创建新窗口,用于容纳新一级的菜单。但是由于每次创建窗口都会消耗较多的系统资源,提升设备的功耗,对于配置较低的设备来说,带来了较高的性能负担。
发明内容
本申请实施例提供一种显示方法、显示装置和电子设备,能够降低多级菜单显示的开销和功耗。
第一方面,提供了一种显示方法,应用于电子设备,包括:接收用于展开多级菜单的第n级菜单的第一输入;响应于所述第一输入,在透明窗口中显示用户界面控件,所述用户界面控件是所述多级菜单的第n级菜单。
在该实施例中,用户在第一窗口进行第一输入,第一输入可以是用于展开多级菜单的第n级菜单的用户输入,n为正整数,该第n级菜单可以是第一级菜单,也可以是第一级菜单以上的其他级菜单。当该第n级菜单为第一级菜单时,第一输入例如可以为对第一窗口内控件的点击输入,第一输入还可以例如为在第一窗口内空白区域的鼠标右键点击输入等(对于触屏输入可以对应在第一窗口上长按点击输入),第一窗口是展开第一级菜单时用户输入作用的窗口。当该第n级菜单为第一级菜单以上的其他级菜单时,第一输入可以为用户在第n-1级菜单上某一选项的悬停输入或点击输入。透明窗口是专用于显示用户界面控件形式的多级菜单的,透明窗口是不可见的,而用户界面控件形式的多级菜单是可见的。这样,由于多级菜单中的各级菜单是用户界面控件,而非每展开一级菜单就要创建一个新窗口,能够降低多级菜单显示的开销和功耗。另外,由于多级菜单是透明窗口上的用户界面控件而非第一窗口上的控件,而透明窗口的尺寸和位置可以根据需求自由设置,多级菜单的显示不会被局限在该第一窗口内,使得显示效果更好,提升用户的体验感。
结合第一方面,在第一方面的一些实现方式中,所述第一输入用于展开所述多级菜单的第一级菜单,所述第一输入位于所述第一窗口内,在所述在透明窗口中显示用户界面控件之前,所述方法还包括:根据所述第一输入,在所述第一窗口上方创建所述透明窗口。
在该实施例中,当第一输入用于展开多级菜单的第一级菜单时,第一输入位于第一窗口内,多级菜单是用户在第一窗口内点击控件或者空白处右键点击触发的,在展开第一级菜单之前,电子设备会在第一窗口上方创建透明窗口,上方意味着透明窗口的z层级高于该第一窗口,从而多级菜单作为透明窗口上的用户界面控件显示在最上层(z层级最高,第一窗口上层),透明窗口是为了显示多级菜单而创建的。
结合第一方面,在第一方面的一些实现方式中,所述方法还包括:确定所述多级菜单的输入热区,所述输入热区是所述多级菜单已展开的各级菜单所占区域的集合;接收用户对所述透明窗口的第二输入;当所述第二输入位于所述输入热区内部时,根据所述第二输入,对所述多级菜单执行与所述第二输入对应的操作;当所述第二输入位于所述输入热区外部且所述第二输入为点击输入时,根据所述第二输入,根据所述第二输入,关闭所述多级菜单和所述透明窗口,激活所述第二输入的位置处所述透明窗口下z层级最高的窗口。
在该实施例中,由于该透明窗口显示在最上层,当用户的输入在透明窗口的范围内时,透明窗口是该输入的直接作用窗口,但是透明窗口是用于容纳多级菜单的用户界面控件而创建的,除了展开的多级 菜单,该窗口的其他部分对用户不可见,为了管理用户在透明窗口上的输入,需要确定输入热区,输入热区是已展开的多级菜单形成的区域集合,在该区域的用户输入是用于针对多级菜单的,例如用于对多级菜单展开、折叠等。示意性的,用户在当前级菜单的某个选项上点击或悬停,可以用于展开下一级菜单;而当用户在当前级菜单的前级菜单的选项上悬停或点击,可以用于以其他方式展开当前级菜单;当用户在对应特定指令的选项点击时,可以触发对应指令。而当用户在透明窗口的其他区域进行输入时,当该用户输入为点击输入时,可以激活其他窗口(点击输入位置透明窗口下z层级最高的窗口),同时该点击输入可以关闭透明窗口以及透明窗口上的用户界面控件(多级菜单的各级菜单),另外,在一些实施例中,该点击输入可以直接被传递给透明窗口下的窗口从而触发相应的指令,例如点击位置有该窗口的控件等。当该输入为在透明窗口的输入热区外的悬停时,可以不进行其他操作。从而本申请实施例的技术方案中的透明窗口的存在不会对用户的操作产生影响。
结合第一方面,在第一方面的一些实现方式中,所述透明窗口所占区域大于或等于所述多级菜单可能显示区域。
在该实施例中,透明窗口所占区域大于或等于多级菜单的可能显示区域,多级菜单的可能显示区域不仅包括当前显示在显示屏上的每一级菜单的显示区域,还包括虽然未显示在显示屏上但是可能显示的任一级菜单的显示区域。由于多级菜单是以透明窗口上的用户界面控件的形式存在,因此透明窗口的大小需要足够容纳多级菜单的每一级菜单。
结合第一方面,在第一方面的一些实现方式中,所述多级菜单的可能显示区域与触发展开所述多级菜单的第一级菜单的输入的位置有关。
在该实施例中,多级菜单的可能显示区域与触发展开所述多级菜单的第一级菜单的输入的位置有关,也就是说,透明窗口所占的区域也可以与该输入的位置有关。
结合第一方面,在第一方面的一些实现方式中,所述透明窗口所占区域与所述电子设备的屏幕显示区域一致。
在该实施例中,为了减少透明窗口创建的复杂度,可以将该透明窗口设置为与电子设备的显示界面一致,这样,无论原第一窗口显示为何种大小,第一窗口触发或管理的多级菜单都可以在整个显示屏的区域内显示而不用受到第一窗口的边界的影响,使得多级菜单的显示效果更好,例如如果第一窗口较小时,当将多级菜单全部显示在第一窗口的范围内时,可能多级菜单的部分级菜单需要位置重叠,影响显示效果。
结合第一方面,在第一方面的一些实现方式中,所述第一窗口为非全屏显示窗口,所述透明窗口的部分区域在所述第一窗口所占的区域的外部。
在该实施例中,当第一窗口为非全屏显示窗口时,透明窗口的部分区域在第一窗口所占区域的外部,多级菜单也可以显示在第一窗口的外部,使得多级菜单的显示效果更好。
结合第一方面,在第一方面的一些实现方式中,所述用户界面控件是根据对所述透明窗口的渲染结果显示的。
在该实施例中,由于在显示多级菜单时,电子设备的操作系统仅需要对透明窗口进行渲染而非针对多级菜单的每一级菜单的窗口进行渲染,能够减少显示多级菜单的功耗。
结合第一方面,在第一方面的一些实现方式中,所述电子设备的操作系统是Windows操作系统、Android操作系统、IOS操作系统或者Linux操作系统。
结合第一方面,在第一方面的一些实现方式中,当所述n为1时,所述第一输入是用户对所述第一窗口内的第一控件的点击操作;或者,所述第一输入是用户在所述第一窗口内空白区域鼠标右键点击或者长按点击操作;当所述n大于或等于2时,所述第一输入是在所述多级菜单的第n-1级菜单的选项上悬停超过预设时长的操作;或者,所述第一输入是对所述多级菜单的第n-1级菜单上的选项的点击操作。
当第n级菜单为第一级菜单时,可以通过点击第一窗口内的控件或者在空白区域右键点击展开第一级菜单,对第一控件的点击操作可以为鼠标点击(例如鼠标右键点击、鼠标左键点击),也可以为触控点击操作(例如对第一控件的短按点击操作或长按点击操作)。
当第n级菜单为第二级或第二级以上菜单时,第n级菜单可以通过对第n-1级菜单上特定选项的点击或鼠标悬停触发。
第二方面,提供了一种显示装置,所述显示装置包括:一个或多个处理器;一个或多个存储器;所述一个或多个存储器存储有一个或多个计算机程序,所述一个或多个计算机程序包括指令,当所述指令 被所述一个或多个处理器执行时,使得所述显示装置执行以下步骤:接收用于展开多级菜单的第n级菜单的第一输入;响应于所述第一输入,在透明窗口中显示用户界面控件,所述用户界面控件是所述多级菜单的第n级菜单。
结合第二方面,在第二方面的一些实现方式中,所述第一输入用于展开所述多级菜单的第一级菜单,所述第一输入位于所述第一窗口内,当所述指令被所述一个或多个处理器执行时,使得所述显示装置执行以下步骤:根据所述第一输入,在所述第一窗口上方创建所述透明窗口。
结合第二方面,在第二方面的一些实现方式中,当所述指令被所述一个或多个处理器执行时,使得所述显示装置执行以下步骤:确定所述多级菜单的输入热区,所述输入热区是所述多级菜单已展开的各级菜单所占区域的集合;接收用户对所述透明窗口的第二输入;当所述第二输入位于所述输入热区内部时,根据所述第二输入,对所述多级菜单执行与所述第二输入对应的操作;或者,当所述第二输入位于所述输入热区外部且所述第二输入为点击输入时,根据所述第二输入,关闭所述多级菜单和所述透明窗口,激活所述第二输入的位置处所述透明窗口下z层级最高的窗口。
结合第二方面,在第二方面的一些实现方式中,所述透明窗口所占区域大于或等于所述多级菜单的可能显示区域。
结合第二方面,在第二方面的一些实现方式中,所述多级菜单的可能显示区域与触发展开所述多级菜单的第一级菜单输入的位置有关。
结合第二方面,在第二方面的一些实现方式中,所述透明窗口所占区域与所述显示装置的屏幕显示区域一致。
结合第二方面,在第二方面的一些实现方式中,所述第一窗口为非全屏显示窗口,所述透明窗口的部分区域在所述第一窗口所占的区域的外部。
结合第二方面,在第二方面的一些实现方式中,所述用户界面控件是根据对所述透明窗口的渲染结果显示的。
结合第二方面,在第二方面的一些实现方式中,所述显示装置的操作系统是Windows操作系统、Android操作系统、IOS操作系统或者Linux操作系统。
结合第二方面,在第二方面的一些实现方式中,当所述n为1时,所述第一输入是用户对所述第一窗口内的第一控件的点击操作;或者,所述第一输入是用户在所述第一窗口内空白区域鼠标右键点击或者长按点击操作;当所述n大于或等于2时,所述第一输入是在所述多级菜单的第n-1级菜单的选项上悬停超过预设时长的操作;或者,所述第一输入是对所述多级菜单的第n-1级菜单上的选项的点击操作。
第三方面,提供了一种显示装置,包括:处理单元,用于接收用于展开多级菜单的第n级菜单的第一输入;显示单元,用于响应于所述第一输入,在所述透明窗口中显示用户界面控件,所述用户界面控件是所述多级菜单的第n级菜单。
结合第三方面,在第三方面的一些实现方式中,所述第一输入用于展开所述多级菜单的第一级菜单,所述第一输入位于所述第一窗口内,所述处理单元还用于:根据所述第一输入,在所述第一窗口上方创建所述透明窗口。
结合第三方面,在第三方面的一些实现方式中,所述处理单元,还用于:确定所述多级菜单的输入热区,所述输入热区是所述多级菜单已展开的各级菜单所占区域的集合;接收用户对所述透明窗口的第二输入;当所述第二输入位于所述输入热区内部时,根据所述第二输入,对所述多级菜单执行与所述第二输入对应的操作;或者,当所述第二输入位于所述输入热区外部且所述第二输入为点击输入时,根据所述第二输入,关闭所述多级菜单和所述透明窗口,激活所述第二输入的位置处所述透明窗口下z层级最高的窗口。
结合第三方面,在第三方面的一些实现方式中,所述透明窗口所占区域大于或等于所述多级菜单的可能显示区域。
结合第三方面,在第三方面的一些实现方式中,所述多级菜单的可能显示区域与触发展开所述多级菜单的第一级菜单的输入的位置有关。
结合第三方面,在第三方面的一些实现方式中,所述透明窗口所占区域与所述电子设备的屏幕显示区域一致。
结合第三方面,在第三方面的一些实现方式中,所述第一窗口为非全屏显示窗口,所述透明窗口的部分区域在所述第一窗口所占的区域的外部。
结合第三方面,在第三方面的一些实现方式中,所述用户界面控件是根据对所述透明窗口的渲染结果显示的。
结合第三方面,在第三方面的一些实现方式中,所述显示装置的操作系统是Windows操作系统、Android操作系统、IOS操作系统或者Linux操作系统。
结合第三方面,在第三方面的一些实现方式中,当所述n为1时,所述第一输入是用户对所述第一窗口内的第一控件的点击操作;或者,所述第一输入是用户在所述第一窗口内空白区域鼠标右键点击或者长按点击操作;当所述n大于或等于2时,所述第一输入是在所述多级菜单的第n-1级菜单的选项上悬停超过预设时长的操作;或者,所述第一输入是对所述多级菜单的第n-1级菜单上的选项的点击操作。
第四方面,提供了一种电子设备,所述电子设备包括如第二方面或第二方面任一种实现方式以及第三方面或第三方面任一种实现方式所述的显示装置。
第五方面,提供了一种计算机存储介质,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如第一方面或第一方面任一种实现方式所述的方法。
第六方面,提供了一种电子设备,包括:存储器,用于存储计算机指令;处理器,用于执行所述存储器中存储的计算机指令,以使所述电子设备执行如第一方面或第一方面任一种实现方式所述的方法。
第七方面,提供了一种芯片系统,其特征在于,包括至少一个处理器,当程序指令在所述至少一个处理器中执行时,使得所述至少一个处理器执行如第一方面或第一方面任一种实现方式所述的方法。
第八方面,提供了一种芯片,所述芯片包括处理器与数据接口,所述处理器通过所述数据接口读取存储器上存储的指令,执行上述第一方面以及第一方面的任一种可能实现方式所述的方法。
可选地,作为一种实现方式,所述芯片还可以包括存储器,所述存储器中存储有指令,
所述处理器用于执行所述存储器上存储的指令,当所述指令被执行时,所述处理器用于执行上述第一方面以及第一方面的任一种可能实现方式所述的方法。
上述芯片具体可以是现场可编程门阵列或者专用集成电路。
附图说明
图1是本申请实施例适用的电子设备的硬件结构示意图。
图2是本申请实施例适用的电子设备的软件结构示意图。
图3示出了多级菜单显示的图形用户界面的示意图。
图4示出了本申请实施例提供的一种显示方法的示意性流程图。
图5示出了本申请实施例提供的一种图形用户界面。
图6示出了本申请实施例提供的多级菜单不同展开方式的示意图。
图7示出了本申请实施例提供的一种图形用户界面。
图8示出了本申请实施例提供的一种显示装置的示意性框图。
图9示出了本申请实施例提供的另一种显示装置的示意性框图。
具体实施方式
下面将结合附图,对本申请实施例中的技术方案进行描述。
以下实施例中所使用的术语只是为了描述特定实施例的目的,而并非旨在作为对本申请的限制。如在本申请的说明书和所附权利要求书中所使用的那样,单数表达形式“一个”、“一种”、“所述”、“上述”、“该”和“这一”旨在也包括例如“一个或多个”这种表达形式,除非其上下文中明确地有相反指示。还应当理解,在本申请以下各实施例中,“至少一个”、“一个或多个”是指一个、两个或两个以上。术语“和/或”,用于描述关联对象的关联关系,表示可以存在三种关系;例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A、B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。
在本说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。
本申请实施例提供的方法应用于电子设备,电子设备包括但不限于手机、平板电脑、车载设备、可穿戴设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)、智慧屏以及其他具有显示屏的电子设备。本申请实施例对电子设备的具体类型不作任何限制。
示例性的,图1示出了本申请实施例提供的一种电子设备的硬件结构示意图。
如图1所示,电子设备100可以包括:处理器110,存储器120,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,摄像头191,显示屏192,按键193等。
处理器110可以包括一个或多个处理单元。例如,处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从该存储器中直接调用,避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
例如,处理器110与触摸传感器180E可以通过I2C总线接口通信,实现电子设备100的触摸功能。处理器110和摄像头191可以通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏192可以通过DSI接口通信,实现电子设备100的显示功能。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。电源管理模块141用于连接电池142。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备100供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态等参数。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
电子设备100通过GPU,显示屏192,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏192和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏192用于显示图像,视频等。显示屏192包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏192,N为大于1的正整数。
电子设备100可以通过ISP,摄像头191,视频编解码器,GPU,显示屏192以及应用处理器等实现拍摄功能。ISP用于处理摄像头191反馈的数据。摄像头191用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。在一些实施例中,电子设备100可以包括1个或N个摄像头191,N为大于1的正整数。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
存储器120用于存储数据和/或指令。
存储器120可以包括内部存储器。内部存储器用于存储计算机可执行程序代码,该可执行程序代码包括指令。处理器110通过运行存储在内部存储器的指令,从而执行电子设备100的各种功能应用以及数据处理。内部存储器可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统;该存储程序区还可以存储一个或多个应用程序(比如图库、联系人等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如图像,联系人等)等。此外,内部存储器可以包括高速随机存取存储器,还可以包括非易失性存储器,例如一个或多个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。在一些实施例中,处理器110可以通过运行存储在内部存储器的指令,和/或存储在设置于处理器110中的存储器的指令,使得电子设备100执行本申请实施例中所提供的卡片分享方法。
存储器120还可以包括外部存储器,例如Micro SD卡,以扩展电子设备100的存储能力。外部存储器可以通过外部存储器接口与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储器中。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音频播放,录音等。
传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,加速度传感器180C,距离传感器180D,触摸传感器180E以及其他的一些传感器等。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏192。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏192,电子设备100根据压力传感器180A检测触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作 用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B又称角速度传感器,可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景,例如陀螺仪能够完整监测游戏者手的位移,从而实现各种游戏操作效果,如横屏改竖屏、赛车游戏拐弯等等。
加速度传感器180C可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。加速度传感器180C还可以用于识别电子设备100的姿态,应用于横竖屏切换,计步器等应用。
距离传感器180D,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180D测距以实现快速对焦。
触摸传感器180E,也称“触控面板”。触摸传感器180E可以设置于显示屏192,由触摸传感器180E与显示屏192组成触摸屏,也称“触控屏”。触摸传感器180E用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏192提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180E也可以设置于电子设备的表面,与显示屏192所处的位置不同。
应理解,本申请实施例中,对于传感器模块180包括的传感器类型并不作限定,传感器模块180可以包括更多或者更少的传感器,具体可以根据实际需要而定,在此不再详述。
按键193可以包括开机键,音量键等。按键193可以是机械按键,也可以是触摸式按键。电子设备100可接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
可以理解的是,本申请实施例示意的结构并不构成对电子设备的具体限定。在本申请另一些实施例中,电子设备可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件、软件或软件和硬件的组合实现。
以上介绍了电子设备100可能的硬件结构示意图。电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的系统为例,示例性说明电子设备100的软件结构。
图2示出了本申请实施例提供的一种电子设备的软件结构框图。如图2所示,分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将系统分为四层,从上至下分别为应用程序层,应用程序框架层,系统运行库层以及内核层。内核层之下则为硬件层。
应用程序层可以包括一系列应用程序(application,APP)包。如图2所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。如图2所示,应用程序框架层可以包括窗口管理器,内容提供器,电话管理器,资源管理器,通知管理器、视图系统、卡片服务引擎等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。示例性的,窗口管理器可以获取电子设备100上待显示窗口的大小,判断该待显示窗口的内容等。应理解,电子设备100的待显示窗口可以包括电子设备100的界面上正在显示的窗口,还可以包括电子设备100后台运行的一个或多个应用程序的窗口。
窗口管理器是一个服务(service),它是全局的、系统中唯一的、独立于应用程序的单独的C++服务,窗口管理器被所有应用程序公用。的窗口管理系统是基于客户端/服务端(client/service,C/S)模式的,整个窗口系统分为服务端(service)和客户端(client)两大部分。客户端即应用程序,负责请求创建窗口和使用窗口;服务端即窗口管理服务(window manager service或WindowManagerService,WMS),负责完成窗口的维护、窗口显示等。客户端并不是直接和窗口管理服务交互,而是直接和本地对象窗口管理(window manager或WindowManager)交互,然后由窗口管理(WindowManager)完成和窗口管理服务(WindowManagerService)的交互。对于应用来说这个交互是透明的,应用不能感知到窗口管理服务的存在。
窗口就是屏幕上的一块矩形区域,可以显示用户界面(user interface,UI)和与用户交互。在一些实施例中,窗口也可以将用户界面(即软件的操作界面)隐藏起来,在用户需要操作时再快速为用户展现 应用导航和功能操作,或者根据触发的指令再度展开应用的操作界面。从系统的角度看,窗口其实是一个画布(surface)。一个屏幕可以有多个窗口,而这多个窗口的布局和顺序以及窗口动画是由窗口管理服务WMS管理的,多个画布内容混合和显示则是由SurfaceFlinger服务实现的。窗口是分层的,层级大的会覆盖在层级小的窗口上面,可以用z层级表示窗口的堆叠关系,z层级大(z层级高)的窗口覆盖在z层级小(z层级低)的窗口上面。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,终端设备振动,指示灯闪烁等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
系统运行库层(libraries)可以分成两部分,分别是系统库和Android运行时。
安卓运行时(Android runtime)即Android运行环境,包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库是应用程序框架的支撑,可包括多个功能模块,例如:表面管理器(surface manager),媒体库(media libraries),二维图形引擎(例如SGL)、三维图形处理库(例如OpenGL ES)、图像处理库等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成和图层处理等。
二维图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层,用于提供操作系统的本质功能例如文件管理、内存管理、进程管理、网络协议栈等。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动、蓝牙驱动等。
由多个分级菜单构成的多级菜单由于能容纳较多的选项供用户选择,在终端设备上得到广泛应用。如图3所示,在窗口310上,用户可以进行右键点击操作,从而触发打开窗口312,该窗口312中包括多级菜单的第一级菜单,当用户在第一级菜单的选项3处点击或鼠标悬停时,会触发打开窗口314,该窗口314中包括多级菜单的第二级菜单。用户每触发下一级菜单,电子设备就需要创建新的窗口,从而容纳该下一级菜单,每次创建新的窗口都会消耗较多的系统资源,因此当菜单的级数较多时,显示多级菜单消耗的资源较多。
有鉴于此,本申请实施例提供了一种显示方法,该显示方法可以应用于电子设备,电子设备可以具有图1和图2所示的结构,该显示方法能够减小创建和显示多级菜单时消耗的资源,提高用户的体验感。如图4所示,该方法包括:
S410,接收用于展开多级菜单的第n级菜单的第一输入。
S420,响应于第一输入,在透明窗口中显示用户界面控件,该用户界面控件是多级菜单的第n级菜单。
在S410中,n为正整数,当n为1时,第一输入为触发打开多级菜单的用户输入(展开第一级菜单),例如第一输入可以为在第一窗口的空白区域鼠标右键点击、鼠标左键或右键点击窗口界面上的菜 单控件、在窗口上手势输入或者在窗口内长按(触屏输入)等,电子设备可以根据用户的第一输入的位置,预测多级菜单的显示区域。
当第一输入为对第一窗口内的菜单控件的点击时,如果菜单控件的位置是预设的,不允许用户移动,则由于菜单控件的位置是确定的,用户对菜单控件点击后多级菜单出现的区域(多级菜单的可能显示区域)也是确定的。如果用户可以调节菜单控件的位置,则调节后菜单控件的位置是确定的,用户对菜单控件点击后多级菜单出现的区域(多级菜单的可能显示区域)也是确定的。另外,第一窗口内的菜单控件的位置可以随着第一窗口的显示状况(全屏非全屏、非全屏的具体大小)变化而变化。
当用户点击该菜单控件时,电子设备可以确定该菜单控件距离显示屏显示界面各个边界的距离,并根据该距离确定多级菜单的每一级菜单的显示方式。以多级菜单为三级菜单为例,当显示屏显示界面的边界不影响多级菜单的展开时,为了提高用户的体验,可以设置默认展开方式为下一级菜单的第一个选项与当前级菜单的已选选项对齐,并且下一级菜单显示在当前级菜单的右侧,图5示出了本申请实施例提供的一种图形用户界面,如图5的(a)所示,当用户鼠标悬停在选项3或点击第一级菜单511的选项3时,第二级菜单512展开,且第二级菜单512的第一个选项3.1与触发第二级菜单展开的选项3平齐。当用户鼠标悬停或点击第二级菜单512的选项3.2时,第三级菜单513展开,第三级菜单513的第一个选项3.2.1与触发第三级菜单513展开的选项3.2平齐。
当显示屏显示界面的边界影响多级菜单的展开时,例如多级菜单向右展开受到显示屏显示界面的右边界影响时,可以将多级菜单的至少一级菜单向左展开,也就是说使得下一级菜单显示在上一级菜单的左侧。如图5的(b)所示,当第二级菜单512显示后,第二级菜单的右侧到显示屏的右边界距离较短,无法将第三级菜单显示在第二级菜单的右边,则可以将第三级菜单显示在第二级菜单的左侧。
当多级菜单的展开受到显示屏显示界面的下边界影响时,可以在下一级菜单展开时,使得当前菜单的已选选项与下一级菜单的最后一个选项对齐。如图5的(c)所示,当第一级菜单511展开后,当用户鼠标悬停在选项7的位置或者点击选项7时(图中选项7被第三级菜单513遮挡),由于显示屏显示界面右边界的影响,第二级菜单512显示在第一级菜单511的左侧,当用户鼠标悬停或者点击第二级菜单的选项7.2时,由于显示屏显示界面的下边界的影响,第三级菜单的第一个选项7.2.1无法与第二级菜单的已选选项7.2对齐,可以使得第三级菜单的最后一个选项7.10与第二级菜单的已选选项7.2对齐,从而多级菜单的每一级菜单都可以显示在显示屏的显示界面的范围内。
当多级菜单以不同方式展开时(选择不同选项带来的不同展开方式,例如第一级菜单的选项1、选项2、选项3、选项7都能触发展开第二级菜单,为第二级菜单的不同展开方式)可能采用上述不同的展开模式:例如,当前级菜单已选选项与下一级菜单的第一个选项或者最后一个选项对齐(也可以当前级菜单已选选项与下一级菜单的其他选项或下一级菜单的中部位置对齐),下一级菜单显示在当前级菜单的右侧或者左侧等。多级菜单是根据对窗口510上的菜单控件514的点击而触发的,在窗口510处于全屏模式、处于非全屏模式以及处于非全屏模式的不同窗口尺寸的情况下,该菜单控件514距离显示屏显示界面的边界的距离可以由电子设备确定,从而进一步确定多级菜单以不同展开方式下的展开模式以及不同展开方式的显示区域。
当多级菜单是根据用户对窗口510上空白区域(空白区域可以为窗口510上没有控件的区域)的点击而触发的情况下,用户在不同的位置点击时,多级菜单的第一级菜单会与该点击位置有对应关系,例如第一级菜单的左上角、左下角、右上角、右下角或者其他参考点可以与该点击位置重合。电子设备可以按照与前文介绍的类似的方式确定该点击位置与显示屏的显示界面的边界的距离,并进一步确定每一级菜单的显示模式,最终也可以确定多级菜单按照不同展开方式展开时的显示区域。
也就是说,当接收到用户的第一输入时,电子设备可以根据第一输入的位置预测多级菜单的可能显示区域。
当n大于或等于2时,由于第一输入是用于展开多级菜单的第二级菜单或以上级菜单的,第一级菜单在此之前已经被展开,仍然可以按照前文介绍的方式确定多级菜单的可能显示区域。
在S420中,电子设备可以响应于第一输入,在透明窗口中显示用户界面控件,该用户界面控件是多级菜单的第n级菜单。也就是说,当用户展开多级菜单的任一级菜单时,每一级菜单都是以透明窗口上的用户界面控件的形式显示的。这样,显示多级菜单时,电子设备的操作系统只要对多级菜单所属的透明窗口这一层进行渲染,能够减小显示多级菜单带来的功耗。
如果该第一输入是用于展开第一级菜单的,电子设备首先会根据该第一输入在第一窗口上方创建透 明窗口,该透明窗口的z层级最高(高于第一窗口)。在一些实施例中,由于电子设备可以根据触发第一级窗口的输入的位置预测多级菜单的显示区域,则可以在使得生成的透明窗口覆盖该多级菜单的可能显示区域的前提下,尽量减小透明窗口的大小。例如使得透明窗口区域等于或略大于多级菜单的可能显示区域,示意性的,可以使得透明窗口的边界相较于多级菜单的可能显示区域的边界外延若干个像素(最大外延到电子设备的屏幕显示区域的边界)。在这种情况下,创建的透明窗口的所占的区域与触发展开第一级菜单的输入的位置有关。在另一些实施例中,为了减小复杂性,电子设备可以直接将透明窗口的区域设置为与电子设备的屏幕显示区域一样大。图6示出了多级菜单的两种展开方式,其中展开方式1包括两级菜单,第一级菜单602以及由选项1进一步展开形成的第二级菜单604,包括选项1.1到选项1.5。展开方式2包括四级菜单,由第一级菜单602的选项7进一步展开形成第二级菜单606,第二级菜单606的选项7.2进一步展开形成第三级菜单608,第三级菜单的选项7.2.2进一步展开形成第四级菜单610。也就是说,透明窗口大小可以使得多级菜单按照不同方式展开时各级菜单都能够显示在透明窗口的范围内,也就是透明窗口所占区域大于或等于多级菜单的可能显示区域。
在生成透明窗口后,可以将多级菜单的第一级菜单以该透明窗口的用户界面控件的形式显示给用户。该透明窗口虽然不可见,但是通过透明窗口的用户界面控件形式生成的多级菜单是可见的。
多级菜单是根据用户的输入一级一级展开的。在用户对第一窗口进行输入用于展开第一级菜单后,生成的透明窗口的z层级最高(在多级菜单未全部被折叠的情况下),因此,当后续用户输入的位置在透明窗口的范围内时,该用户输入是直接作用于透明窗口的。在本申请实施例中,电子设备还会实时确定多级菜单的输入热区,输入热区是多级菜单已展开的各级菜单所占区域的集合。当电子设备接收到在透明窗口范围内的用户输入时,可以进一步确定该输入的位置,当输入的位置在输入热区内部时,确定该用户输入是作用于多级菜单的,从而根据该用输入确定是调用相应选项的功能或者展开下一级菜单或者折叠某一级或多级菜单。当输入的位置在输入热区外时,例如该输入是点击输入时,可以用于激活其他窗口,则电子设备可以对应激活透明窗口下的其他窗口(点击位置处透明窗口下z层级最高的窗口),并且关闭透明窗口以及透明窗口上的多级菜单,从而专用于多级菜单而生成的透明窗口不会给用户的体验带来不良的影响。另外,在一些操作系统中,该点击输入可以直接被传递给相应的窗口,由窗口对应的应用确定是否进一步执行其他指令,例如该点击位置能够触发相应的指令或者唤起相应的功能(该点击位置处的窗口上有控件等情形下)。
如图7所示,图7的(a)~图7的(c)依次示出了多级菜单中第一级菜单-第二级菜单-第三级菜单的展开的图形用户界面(graphicsuserinterface,GUI)的示意图。当用户点击图7的(a)中的控件711时,会触发生成透明窗口713和第一用户界面控件,第一用户界面控件为多级菜单的第一级菜单。此时,可以根据当前已经向用户显示的第一级菜单,确定输入热区为粗框712所示,如果用户之后的输入在输入热区范围内,可以确定该输入为针对该多级菜单的,可以相应将该用户输入传递给该多级菜单对应的第一窗口710的应用,由该应用确定用户输入是用于展开第二级菜单或者是用于调用相应选项对应的功能或者是用于折叠多级菜单的某一级菜单。
示例性的,如图7的(b)所示,用户的输入可以位于输入热区712的范围内,用户的输入可以为对第一级菜单中选项3的点击输入或者鼠标在选项3悬停时间超过预设阈值,该用户输入可以被传递给窗口710所属的应用,由应用确定该输入的目标为触发打开下一级菜单,则可以生成对应下一级菜单的用户界面控件,同时输入热区会随着第二级菜单的显示而更新为如粗框712’所示,是两级菜单所占区域的合并后的区域。
之后用户可以继续点击第二级菜单的选项3.2或者鼠标在选项3.2上悬停超过预设时间,触发展开第三级菜单,并且输入热区随之更新为如图7的(c)所示的712”。
当用户的输入位于输入热区外时,可以根据用户输入的位置确定该用户输入的类型进行具体的操作。如图7的(c)所示,例如用户输入为点击输入,当点击输入的位置在A点时,由于A点所在位置处透明窗口下的窗口为窗口710,该输入能够激活窗口710,在一些实施例中,电子设备可以将该输入传递给窗口710的应用;当点击输入的位置在B点时,由于B点所在位置透明窗口下的窗口为720,电子设备会根据该点击输入激活窗口720,在一些实施例中,该输入可以被传递给窗口720的应用;当点击输入在C点时,由于C点所在位置透明窗口下的窗口为窗口730,该用户输入可以激活窗口730,在一些实施例中,该输入会被传递给窗口730的应用进行进一步处理。同时,如果用户在输入热区以外进行点击,则多级菜单会被关闭,之前生成的透明窗口也会被关闭。也就是说,在输入热区外的用户输入为点击输入时, 可以直接用于关闭透明窗口以及多级菜单并且激活点击位置透明窗口下z层级最高的窗口,在一些实施例中,激活相应窗口后的输入才会被传递给该窗口,在另一些实施例中,该激活窗口的输入即可以被传递给相应窗口,由该窗口的应用进行进一步处理。
应理解,对于不同的窗口触发的多级菜单或者在同一窗口触发的不同多级菜单,电子设备可以为该多级菜单创建单独的透明窗口,不同的多级菜单对应的透明窗口可以具有不同的尺寸以及位置。但是实际上该多级菜单的展开、折叠以及具体指令的执行都是由触发多级菜单的窗口所在的应用确定的,该多级菜单虽然是透明窗口上的用户界面控件,但是实际上由触发该多级菜单的窗口所在的应用进行管理。
由于本申请实施例使得多级菜单以透明窗口上的用户界面控件的形式向用户显示,而透明窗口的大小可以根据菜单的显示需求而确定,因此,一方面,该多级菜单的显示不会受到触发或管理该多级菜单的第一窗口的边界的限制,当将透明窗口的尺寸设置得合适时,并且第一窗口为非全屏显示窗口时,部分透明窗口可以在第一窗口所占的区域的外部,则相应多级菜单可以显示在第一窗口的外部,当第一窗口显示的区域较小时,如果多级菜单显示在第一窗口内部,则会使得多级菜单的部分级重叠,用户无法直观的看到已展开的各级菜单,不便于用户的操作;另一方面,由于多级菜单都使用用户界面控件的形式显示,电子设备的操作系统对多级菜单进行渲染时只需要渲染透明窗口一个图层,与每一级菜单都使用单独的窗口相比(需要单独对每个窗口的图层进行渲染),降低了系统的开销与功耗,提高了用户的体验。
本申请实施例所介绍的技术方案可以应用于使用不同操作系统的电子设备中,例如可以应用于使用操作系统、操作系统、操作系统、操作系统、操作系统等,只要该操作系统需要使用多级菜单,都能够在降低多级菜单的功耗的情况下,使得多级菜单的显示不受到原窗口的范围的限制,提高用户的体验感。
图8示出了本申请实施例提供的一种显示装置800,该显示装置800包括处理单元810和显示单元820,该处理单元810可以用于执行图4所示的方法实施例的步骤410,该显示单元820可以用于执行图4所示的方法实施例的步骤S420。
具体的,该显示装置800包括:处理单元810,用于接收用于展开多级菜单的第n级菜单的第一输入;显示单元820,用于响应于该第一输入,在透明窗口中显示用户界面控件,所述用户界面控件是所述多级菜单的第n级菜单。
在一些实施例中,所述第一输入用于展开所述多级菜单的第一级菜单,所述第一输入位于所述第一窗口内,所述处理单元还用于:根据所述第一输入,在所述第一窗口上方创建所述透明窗口。
在一些实施例中,该处理单元810,还用于:确定该多级菜单的输入热区,该输入热区是该多级菜单已展开的各级菜单所占区域的集合;接收用户对该透明窗口的第二输入;当该第二输入位于该输入热区内部时,根据该第二输入,对该多级菜单执行与该第二输入对应的操作;或者,当所述第二输入位于所述输入热区外部且所述第二输入为点击输入时,根据所述第二输入,关闭所述多级菜单和所述透明窗口,激活所述第二输入的位置处所述透明窗口下z层级最高的窗口。
在一些实施例中,该透明窗口所占区域大于或等于所述多级菜单的可能显示区域。
在一些实施例中,该多级菜单的可能显示区域与触发展开该多级菜单的第一级菜单的输入的位置有关。
在一些实施例中,该透明窗口所占区域与该电子设备的屏幕显示区域一致。
在一些实施例中,该第一窗口为非全屏显示窗口,该透明窗口的部分区域在该第一窗口所占的区域的外部。
在一些实施例中,该用户界面控件是根据对该透明窗口的渲染结果显示的。
在一些实施例中,该显示装置800的操作系统是Windows操作系统、Android操作系统、IOS操作系统或者Linux操作系统。
在一些实施例中,当n为1时,该第一输入是用户对该第一窗口内的第一控件的点击操作;或者,该第一输入是用户在该第一窗口内空白区域鼠标右键点击或者长按点击操作;当该n大于或等于2时,该第一输入是在该多级菜单的第n-1级菜单的选项上悬停超过预设时长的操作;或者,该第一输入是对该多级菜单的第n-1级菜单上的选项的点击操作。
图9示出了本申请实施例提供的一种显示装置。图9所示的显示装置900可对应于前文描述的显示装置,具体地,显示装置900可以是图8中的显示装置具体的例子。显示装置900包括:处理器920。在本 申请的实施例中,处理器920用于实现相应的控制管理操作,例如,处理器920用于支持显示装置900执行前述实施例的方法或操作或功能。可选的,显示装置900还可以包括:存储器910和通信接口930;处理器920、通信接口930以及存储器910可以相互连接或者通过总线940相互连接。其中,通信接口930用于支持该显示装置与其他设备等进行通信,存储器910用于存储显示装置的程序代码和数据。处理器920调用存储器910中存储的代码或者数据实现相应的操作。该存储器910可以跟处理器耦合在一起,也可以不耦合在一起。本申请实施例中的耦合是显示装置、单元或模块之间的间接耦合或通信连接,可以是电性,机械或其它的形式,用于显示装置、单元或模块之间的信息交互。
其中,处理器920可以是中央处理器单元,通用处理器,数字信号处理器,专用集成电路,现场可编程门阵列或者其他可编程逻辑器件、晶体管逻辑器件、硬件部件或者其任意组合。其可以实现或执行结合本申请公开内容所描述的各种示例性的逻辑方框,模块和电路。所述处理器也可以是实现计算功能的组合,例如包含一个或多个微处理器组合,数字信号处理器和微处理器的组合等等。通信接口930可以是收发器、电路、总线、模块或其它类型的通信接口。总线940可以是外设部件互连标准(peripheral component interconnect,PCI)总线或扩展工业标准结构(extended industry standard architecture,EISA)总线等。所述总线可以分为地址总线、数据总线、控制总线等。为便于表示,图9中仅用一条粗线表示,但并不表示仅有一根总线或一种类型的总线。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (22)

  1. 一种显示方法,应用于电子设备,其特征在于,包括:
    接收用于展开多级菜单的第n级菜单的第一输入;
    响应于所述第一输入,在透明窗口中显示用户界面控件,所述用户界面控件是所述多级菜单的第n级菜单。
  2. 如权利要求1所述的方法,其特征在于,所述第一输入用于展开所述多级菜单的第一级菜单,所述第一输入位于所述第一窗口内,在所述在透明窗口中显示用户界面控件之前,所述方法还包括:
    根据所述第一输入,在所述第一窗口上方创建所述透明窗口。
  3. 如权利要求1或2所述的方法,其特征在于,所述方法还包括:
    确定所述多级菜单的输入热区,所述输入热区是所述多级菜单已展开的各级菜单所占区域的集合;
    接收用户对所述透明窗口的第二输入;
    当所述第二输入位于所述输入热区内部时,根据所述第二输入,对所述多级菜单执行与所述第二输入对应的操作;或者,
    当所述第二输入位于所述输入热区外部且所述第二输入为点击输入时,根据所述第二输入,关闭所述多级菜单和所述透明窗口,激活所述第二输入的位置处所述透明窗口下z层级最高的窗口。
  4. 如权利要求1至3中任一项所述的方法,其特征在于,所述透明窗口所占区域大于或等于所述多级菜单的可能显示区域。
  5. 如权利要求4所述的方法,其特征在于,所述多级菜单的可能显示区域与触发展开所述多级菜单的第一级菜单的输入的位置有关。
  6. 如权利要求4或5所述的方法,其特征在于,所述透明窗口所占区域与所述电子设备的屏幕显示区域一致。
  7. 如权利要求2所述的方法,其特征在于,所述第一窗口为非全屏显示窗口,所述透明窗口的部分区域在所述第一窗口所占的区域的外部。
  8. 如权利要求1至7中任一项所述的方法,其特征在于,所述用户界面控件是根据对所述透明窗口的渲染结果显示的。
  9. 如权利要求1至8中任一项所述的方法,其特征在于,
    当所述n为1时,所述第一输入是用户对所述第一窗口内的第一控件的点击操作;或者,所述第一输入是用户在所述第一窗口内空白区域鼠标右键点击或者长按点击操作;
    当所述n大于或等于2时,所述第一输入是在所述多级菜单的第n-1级菜单的选项上悬停超过预设时长的操作;或者,所述第一输入是对所述多级菜单的第n-1级菜单上的选项的点击操作。
  10. 一种显示装置,其特征在于,所述显示装置包括:
    一个或多个处理器;
    一个或多个存储器;
    所述一个或多个存储器存储有一个或多个计算机程序,所述一个或多个计算机程序包括指令,当所述指令被所述一个或多个处理器执行时,使得所述显示装置执行以下步骤:
    接收用于展开多级菜单的第n级菜单的第一输入;
    响应于所述第一输入,在透明窗口中显示用户界面控件,所述用户界面控件是所述多级菜单的第n级菜单。
  11. 如权利要求10所述的显示装置,其特征在于,所述第一输入用于展开所述多级菜单的第一级菜单,所述第一输入位于所述第一窗口内,当所述指令被所述一个或多个处理器执行时,使得所述显示装置执行以下步骤:
    根据所述第一输入,在所述第一窗口上方创建所述透明窗口。
  12. 如权利要求10或11所述的显示装置,其特征在于,当所述指令被所述一个或多个处理器执行时,使得所述显示装置执行以下步骤:
    确定所述多级菜单的输入热区,所述输入热区是所述多级菜单已展开的各级菜单所占区域的集合;
    接收用户对所述透明窗口的第二输入;
    当所述第二输入位于所述输入热区内部时,根据所述第二输入,对所述多级菜单执行与所述第二输入对应的操作;或者,
    当所述第二输入位于所述输入热区外部且所述第二输入为点击输入时,根据所述第二输入,关闭所述多级菜单和所述透明窗口,激活所述第二输入的位置处所述透明窗口下z层级最高的窗口。
  13. 如权利要求10至12中任一项所述的显示装置,其特征在于,所述透明窗口所占区域大于或等于所述多级菜单的可能显示区域。
  14. 如权利要求13所述的显示装置,其特征在于,所述多级菜单的可能显示区域与触发展开所述多级菜单的第一级菜单的输入的位置有关。
  15. 如权利要求13或14所述的显示装置,其特征在于,所述透明窗口所占区域与所述显示装置的屏幕显示区域一致。
  16. 如权利要求11所述的显示装置,其特征在于,所述第一窗口为非全屏显示窗口,所述透明窗口的部分区域在所述第一窗口所占的区域的外部。
  17. 如权利要求10至16中任一项所述的显示装置,其特征在于,所述用户界面控件是根据对所述透明窗口的渲染结果显示的。
  18. 如权利要求10至17中任一项所述的显示装置,其特征在于,
    当所述n为1时,所述第一输入是用户对所述第一窗口内的第一控件的点击操作;或者,所述第一输入是用户在所述第一窗口内空白区域鼠标右键点击或者长按点击操作;
    当所述n大于或等于2时,所述第一输入是在所述多级菜单的第n-1级菜单的选项上悬停超过预设时长的操作;或者,所述第一输入是对所述多级菜单的第n-1级菜单上的选项的点击操作。
  19. 一种电子设备,其特征在于,包括如权利要求10至18中任一项所述的显示装置。
  20. 一种计算机存储介质,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如权利要求1至9中任一项所述的显示方法。
  21. 一种电子设备,其特征在于,包括:
    存储器,用于存储计算机指令;
    处理器,用于执行所述存储器中存储的计算机指令,以使所述电子设备执行如权利要求1至9中任一项所述的方法。
  22. 一种芯片系统,其特征在于,包括至少一个处理器,当程序指令在所述至少一个处理器中执行时,使得所述至少一个处理器执行如权利要求1至9中任一项所述的方法。
PCT/CN2023/143194 2023-01-12 2023-12-29 显示方法、显示装置和电子设备 WO2024149089A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202310072062.2A CN118331464A (zh) 2023-01-12 2023-01-12 显示方法、显示装置和电子设备
CN202310072062.2 2023-01-12

Publications (1)

Publication Number Publication Date
WO2024149089A1 true WO2024149089A1 (zh) 2024-07-18

Family

ID=91772972

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/143194 WO2024149089A1 (zh) 2023-01-12 2023-12-29 显示方法、显示装置和电子设备

Country Status (2)

Country Link
CN (1) CN118331464A (zh)
WO (1) WO2024149089A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101052108A (zh) * 2007-03-21 2007-10-10 林羽明 一种显示动画菜谱的装置和方法
CN101071361A (zh) * 2007-05-31 2007-11-14 腾讯科技(深圳)有限公司 一种自定义右键菜单的方法及系统
US20190310744A1 (en) * 2009-07-22 2019-10-10 Behr Process Corporation Color Selection, Coordination And Purchase System
CN114201097A (zh) * 2020-09-16 2022-03-18 华为技术有限公司 一种多应用程序之间的交互方法
CN114690893A (zh) * 2020-12-31 2022-07-01 华为技术有限公司 一种显示方法及电子设备
CN115145447A (zh) * 2021-03-30 2022-10-04 华为技术有限公司 一种窗口的显示方法以及相关装置

Also Published As

Publication number Publication date
CN118331464A (zh) 2024-07-12

Similar Documents

Publication Publication Date Title
WO2021227770A1 (zh) 应用窗口显示方法和电子设备
WO2021164313A1 (zh) 界面布局方法、装置及系统
WO2021057830A1 (zh) 一种信息处理方法及电子设备
WO2022100315A1 (zh) 应用界面的生成方法及相关装置
WO2021120914A1 (zh) 一种界面元素的显示方法及电子设备
WO2021159922A1 (zh) 卡片显示方法、电子设备及计算机可读存储介质
WO2020181988A1 (zh) 一种语音控制方法及电子设备
WO2021244443A1 (zh) 分屏显示方法、电子设备及计算机可读存储介质
WO2021104030A1 (zh) 一种分屏显示方法及电子设备
WO2021057868A1 (zh) 一种界面切换方法及电子设备
WO2021115194A1 (zh) 一种应用图标的显示方法及电子设备
WO2021129253A1 (zh) 显示多窗口的方法、电子设备和系统
WO2021196970A1 (zh) 一种创建应用快捷方式的方法、电子设备及系统
WO2022042285A1 (zh) 一种应用程序界面显示的方法及电子设备
WO2021110133A1 (zh) 一种控件的操作方法及电子设备
US20220357818A1 (en) Operation method and electronic device
WO2022052677A1 (zh) 界面显示方法及电子设备
WO2021244459A1 (zh) 一种输入方法及电子设备
WO2023005751A1 (zh) 渲染方法及电子设备
WO2022213831A1 (zh) 一种控件显示方法及相关设备
CN110865765A (zh) 终端及地图控制方法
WO2021052488A1 (zh) 一种信息处理方法及电子设备
WO2023030276A1 (zh) 一种显示方法、装置、设备及存储介质
WO2022228043A1 (zh) 显示方法、电子设备、存储介质和程序产品
WO2024149089A1 (zh) 显示方法、显示装置和电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23915842

Country of ref document: EP

Kind code of ref document: A1